Create a BattleMage QuickStart #12663
Conversation
### 1.2 Install IPEX-LLM
In addition to `ipex-llm[xpu]` for PyTorch, we also need to add `ipex-llm[cpp]` for llama.cpp/Ollama.
updated
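A sketch of what the updated install section might contain, based on the public ipex-llm pip instructions; the exact `--extra-index-url` values (US vs CN) and any version pins should be confirmed against the guide itself:

```shell
# PyTorch path: ipex-llm[xpu] (quotes keep brackets safe in zsh)
pip install --pre --upgrade "ipex-llm[xpu]" \
    --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# llama.cpp / Ollama path: ipex-llm[cpp]
pip install --pre --upgrade "ipex-llm[cpp]"
```

The two extras install different backends, which is why the verification steps below differ between them.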
#### For llama.cpp and Ollama:
Install the `ipex-llm[cpp]` package. Choose either the US or CN website for `extra-index-url`:
`ipex` is not needed here.
### 1.3 Verify Installation
This only works for PyTorch
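To make the reviewer's point concrete: the verification step imports `torch`, so it can only succeed on a PyTorch (`ipex-llm[xpu]`) install, not a cpp-only one. A hedged sketch of a check that degrades gracefully instead of crashing (the function name is my own, not from the guide):

```python
# Sketch: report whether the PyTorch/XPU path of IPEX-LLM can be verified.
# A cpp-only install (ipex-llm[cpp]) has no torch, so this check does not apply there.
import importlib.util

def pytorch_xpu_status() -> str:
    """Return a short status string instead of crashing on a cpp-only install."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed (cpp-only install?)"
    import torch
    xpu = getattr(torch, "xpu", None)
    if xpu is not None and torch.xpu.is_available():
        return "xpu available"
    return "torch installed, but no xpu device visible"

print(pytorch_xpu_status())
```

On a llama.cpp/Ollama-only setup, verification would instead mean running the respective binaries, not importing Python modules.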
#### For llama.cpp and Ollama:
Install the `ipex-llm[cpp]` package. Choose either the US or CN website for `extra-index-url`:
`ipex` is not needed here.
### 2.3 Verify Installation
This only works for PyTorch.
## 3. Run a Quick Example
This only works for PyTorch.
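Since the quick example only applies to the PyTorch path, one option is to guard it so cpp-only users get a clear message. A hedged sketch (the model id is an assumption for illustration; `ipex_llm.transformers` mirrors the Hugging Face `transformers` API):

```python
# Sketch: run the quick example only when the PyTorch path of IPEX-LLM exists.
# On a cpp-only install (ipex-llm[cpp]) we return a message instead of failing.
import importlib.util

def quick_example_or_skip() -> str:
    if importlib.util.find_spec("ipex_llm") is None:
        return "skipped: ipex-llm PyTorch path not installed"
    # ipex_llm.transformers mirrors the Hugging Face transformers API;
    # load_in_4bit=True applies IPEX-LLM's low-bit optimization on load.
    from ipex_llm.transformers import AutoModelForCausalLM
    model = AutoModelForCausalLM.from_pretrained(
        "Qwen/Qwen2-1.5B-Instruct",  # assumed example model, not from the guide
        load_in_4bit=True,
        trust_remote_code=True,
    ).to("xpu")  # move the optimized model to the Intel GPU
    return "model loaded on xpu"

print(quick_example_or_skip())
```

For llama.cpp/Ollama users, the equivalent quick test would be running the CLI binaries shipped with `ipex-llm[cpp]` rather than a Python script.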
@@ -0,0 +1,230 @@
# Quickstart: Install and Use IPEX-LLM on Intel Battlemage B580 GPU
Intel Arc B-Series GPU (code-named Battlemage)
@@ -1,4 +1,4 @@
- # Quickstart: Install and Use IPEX-LLM on Intel Battlemage B580 GPU
+ # Quickstart: Install and Use IPEX-LLM on Intel Arc B-Series GPU (code-named Battlemage)
This guide demonstrates how to install and use IPEX-LLM on the Intel Battlemage B580 GPU. It covers both **Linux** and **Windows** operating systems.
Intel Arc B-Series GPU (code-named Battlemage)
LGTM
Description
1. Why the change?
This PR provides a QuickStart guide for setting up IPEX-LLM on the Intel Battlemage B580 GPU. It includes installation instructions for both Linux and Windows, verification steps, and a sample test. It also offers usage references for benchmarking, llama.cpp, Ollama, and vLLM, providing a comprehensive entry point for Battlemage users.
2. User API changes
3. Summary of the change
4. How to test?
And paste your action link here once it has been successfully finished.
5. New dependencies
- Dependency1
- Dependency2
- ...
- Dependency1 and license1
- Dependency2 and license2
- ...