LLM: Fix vLLM CPU convert error (#11926)
xiangyuT authored Aug 27, 2024
1 parent 5a8fc1b commit 7ca557a
Showing 1 changed file with 2 additions and 2 deletions.
python/llm/src/ipex_llm/vllm/cpu/model_convert.py

@@ -254,8 +254,8 @@ def _ipex_llm_load_model(self) -> None:
             scheduler_config=self.scheduler_config)
         return

-    _model_mlp_convert()
-    _model_attention_convert()
+    # _model_mlp_convert()
+    # _model_attention_convert()

     self.model = get_model(
         model_config=self.model_config,
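
For context, a hedged sketch of the patched region after this commit, reconstructed only from the hunk above. The indentation, the elided code above the return statement, and the trimmed get_model(...) argument list are assumptions, not the exact file contents.

# Hedged sketch, not the actual ipex_llm source; get_model and the two
# convert helpers come from the surrounding module and are not defined here.
def _ipex_llm_load_model(self) -> None:
    ...  # earlier code from the hunk elided
    # The IPEX-LLM MLP/attention conversions are no longer applied on the
    # vLLM CPU backend; per the commit title, disabling them is the fix
    # for the CPU convert error.
    # _model_mlp_convert()
    # _model_attention_convert()
    self.model = get_model(
        model_config=self.model_config,
        # remaining keyword arguments elided, as in the hunk above
    )

With the two conversion calls commented out, model loading falls through directly to vLLM's get_model, which is the entire change in this commit.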
