Commit

fix styl
MeouSker77 committed Dec 25, 2024
1 parent a6aeac7 commit c51b103
Showing 1 changed file with 2 additions and 1 deletion.
python/llm/src/ipex_llm/transformers/models/llama.py
@@ -163,7 +163,8 @@ def llama_attention_forward(
             cos, sin = self.rotary_emb(value_states, position_ids)
         else:
             cos, sin = position_embeddings
-        query_states, key_states = apply_rotary_pos_emb(query_states, key_states, cos, sin, position_ids)
+        query_states, key_states = apply_rotary_pos_emb(query_states, key_states,
+                                                        cos, sin, position_ids)
 
         if past_key_value is not None:
             # [CompressKV]
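For context on the line being wrapped: `apply_rotary_pos_emb` applies rotary position embeddings (RoPE) to the query and key states before attention. Below is a minimal NumPy sketch of the standard computation as implemented in Hugging Face style Llama code (the actual ipex_llm version operates on PyTorch tensors and may differ in details; the toy shapes and table-building code here are illustrative assumptions, not the library's API):

```python
import numpy as np

def rotate_half(x):
    # Split the last dim in half and swap the halves with a sign flip:
    # (x1, x2) -> (-x2, x1), the convention used by Llama-style RoPE.
    half = x.shape[-1] // 2
    x1, x2 = x[..., :half], x[..., half:]
    return np.concatenate((-x2, x1), axis=-1)

def apply_rotary_pos_emb(q, k, cos, sin):
    # q, k: (batch, heads, seq, head_dim); cos, sin: (seq, head_dim).
    # Each pair of channels is rotated by a position-dependent angle.
    q_embed = q * cos + rotate_half(q) * sin
    k_embed = k * cos + rotate_half(k) * sin
    return q_embed, k_embed

# Build toy cos/sin tables for head_dim=4, seq_len=3.
head_dim, seq_len = 4, 3
inv_freq = 1.0 / (10000 ** (np.arange(0, head_dim, 2) / head_dim))
freqs = np.outer(np.arange(seq_len), inv_freq)   # (seq, head_dim/2)
emb = np.concatenate((freqs, freqs), axis=-1)    # (seq, head_dim)
cos, sin = np.cos(emb), np.sin(emb)

q = np.ones((1, 1, seq_len, head_dim))
k = np.ones((1, 1, seq_len, head_dim))
q_rot, k_rot = apply_rotary_pos_emb(q, k, cos, sin)

# Position 0 has angle 0, so it is left unrotated.
print(np.allclose(q_rot[0, 0, 0], q[0, 0, 0]))  # True
```

Because each channel pair undergoes a pure 2D rotation, the per-token vector norms are preserved; only the phase encoding the position changes.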
