[Ollama] Update ipex-llm ollama readme to v0.4.6 #12542
Conversation
I think we also need to update the llama.cpp version. Others LGTM.
@@ -19,7 +19,7 @@ See the demo of running LLaMA2-7B on Intel Arc GPU below.
> [!NOTE]
> `ipex-llm[cpp]==2.2.0b20240826` is consistent with [v0.1.39](https://github.com/ollama/ollama/releases/tag/v0.1.39) of ollama.
Please also update the old version here.
@@ -80,6 +80,7 @@ You may launch the Ollama service as below:
export ZES_ENABLE_SYSMAN=1
source /opt/intel/oneapi/setvars.sh
export SYCL_CACHE_PERSISTENT=1
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
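The hunk above extends the Linux launch script for the Ollama service. A runnable sketch of the full sequence is below; the oneAPI sourcing is guarded so the script degrades gracefully on machines without oneAPI, and the final `./ollama serve` invocation is left as a comment because the binary's location depends on the install (both guards are additions, not part of the README):

```shell
# Sketch of the Linux launch sequence from the README hunk above.
export ZES_ENABLE_SYSMAN=1        # expose GPU telemetry via Level Zero sysman
# Guarded here so the sketch still runs without oneAPI installed:
[ -f /opt/intel/oneapi/setvars.sh ] && . /opt/intel/oneapi/setvars.sh
export SYCL_CACHE_PERSISTENT=1    # persist the SYCL JIT cache across runs
# The line this PR adds: let the loader find ollama's bundled .so files
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
# ./ollama serve                  # then start the service (path is illustrative)
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```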
Why do we need this?
This doesn't look very user-friendly. It would be better if we could hide it from users.
This is needed to link against the shared library when running ollama. I can try to hide it by modifying the compilation method.
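For readers wondering what the variable does: the dynamic loader consults `LD_LIBRARY_PATH` before the default library directories, so prepending `.` lets the ollama binary resolve `.so` files shipped alongside it. A minimal illustration (no ollama required):

```shell
# Prepend the current directory to the loader's search path, as the README does.
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
# The first component searched is now "." — shared objects next to the
# binary take precedence over system-installed copies.
echo "${LD_LIBRARY_PATH%%:*}"   # prints "."
```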
@@ -19,7 +19,7 @@
> [!NOTE]
> `ipex-llm[cpp]==2.2.0b20240826` is consistent with the official ollama version [v0.1.39](https://github.com/ollama/ollama/releases/tag/v0.1.39).
Please also update the old version here.
Description
1. Why the change?
Update ipex-llm ollama readme to v0.4.6.
2. User API changes
On Linux, we have the following changes:
2.1 Launch ollama service
2.2 Run ollama serve
3. Summary of the change
4. How to test?
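The test section above is left empty in the PR body. One common way to smoke-test a running Ollama service is to query its REST API; the endpoint and port below are Ollama's defaults, and this sketch assumes the service from step 2.1 may or may not be running, so the check is guarded rather than failing outright:

```shell
# Probe the default Ollama endpoint; report status instead of failing hard.
if curl -s --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "service up"
else
  echo "service not running"
fi
```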