
[Ollama] Update ipex-llm ollama readme to v0.4.6 #12542

Merged
2 commits merged on Dec 13, 2024

Conversation

sgwhat
Contributor

@sgwhat sgwhat commented Dec 13, 2024

Description

1. Why the change?

Update ipex-llm ollama readme to v0.4.6.

2. User API changes

On Linux, we have the following changes:

2.1 Launch ollama service

# before `ollama serve`
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH

2.2 Run a model

# before `ollama run <model name>`
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
source /opt/intel/oneapi/setvars.sh
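Putting 2.1 and 2.2 together, the full Linux launch sequence after this change would look roughly like the sketch below. The environment variable names come from this PR's readme diff; the oneAPI path is the standard installer default, and the sketch assumes it is run from the directory containing the ollama binary and its bundled shared libraries.

```shell
#!/bin/sh
# Sketch of the v0.4.6 launch sequence on Linux; run from the directory
# that contains the ollama binary and its bundled .so files.
export ZES_ENABLE_SYSMAN=1
export SYCL_CACHE_PERSISTENT=1
# oneAPI environment (default installer location; skipped here if not installed)
[ -f /opt/intel/oneapi/setvars.sh ] && . /opt/intel/oneapi/setvars.sh
# new in v0.4.6: let the dynamic loader find the bundled .so files in the cwd
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
echo "first library search path: $(printf '%s' "$LD_LIBRARY_PATH" | cut -d: -f1)"
# then: ./ollama serve
# and, in a second shell with the same exports: ./ollama run <model name>
```

Because `LD_LIBRARY_PATH` is searched left to right, prepending `.` makes the current directory the first place the loader looks for shared libraries.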

3. Summary of the change

4. How to test?

@sgwhat sgwhat requested a review from rnwang04 December 13, 2024 08:03
Contributor

@rnwang04 rnwang04 left a comment


I think we also need to update the llama.cpp version. Others LGTM.

@@ -19,7 +19,7 @@ See the demo of running LLaMA2-7B on Intel Arc GPU below.
> [!NOTE]
> `ipex-llm[cpp]==2.2.0b20240826` is consistent with [v0.1.39](https://github.com/ollama/ollama/releases/tag/v0.1.39) of ollama.
Contributor


Please also update the old version here.

@@ -80,6 +80,7 @@ You may launch the Ollama service as below:
export ZES_ENABLE_SYSMAN=1
source /opt/intel/oneapi/setvars.sh
export SYCL_CACHE_PERSISTENT=1
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
Contributor


Why do we need this?

Contributor


This doesn't look very user-friendly. It would be better if we could hide this from users.

Contributor Author


This is needed to link against the shared libraries when running ollama. I can try to hide this by modifying the compilation method.

@@ -19,7 +19,7 @@
> [!NOTE]
> `ipex-llm[cpp]==2.2.0b20240826` is consistent with [v0.1.39](https://github.com/ollama/ollama/releases/tag/v0.1.39) of the official ollama.
Contributor


Please also update the old version here.

2 participants