diff --git a/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md b/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
index 5377276f330c..c86d109ed12c 100644
--- a/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
+++ b/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
@@ -9,7 +9,7 @@ Please also refer to [here](https://github.com/intel-analytics/ipex-llm?tab=read
 
 ## How to Resolve Errors
 
-### Fail to install `ipex-llm` through `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/`
+### Fail to install `ipex-llm` through `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/` or `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/`
 
 You could try to install IPEX-LLM dependencies for Intel XPU from source archives:
 - For Windows system, refer to [here](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Overview/install_gpu.html#install-ipex-llm-from-wheel) for the steps.
diff --git a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
index 63b76886bb2d..b283aa98992a 100644
--- a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
+++ b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
@@ -50,13 +50,28 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
 ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices.
 ```
 
-The easiest ways to install `ipex-llm` is the following commands:
+The easiest way to install `ipex-llm` is with the following commands,
+choosing either the US or CN website for `extra-index-url`:
 
-```
-conda create -n llm python=3.9 libuv
-conda activate llm
+```eval_rst
+.. tabs::
+   .. tab:: US
+
+      .. code-block:: cmd
+
+         conda create -n llm python=3.9 libuv
+         conda activate llm
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 
-pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+   .. tab:: CN
+
+      .. code-block:: cmd
+
+         conda create -n llm python=3.9 libuv
+         conda activate llm
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
 ```
 
 ### Install IPEX-LLM From Wheel
@@ -388,20 +403,40 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
 .. tabs::
    .. tab:: PyTorch 2.1
 
-      .. code-block:: bash
+      .. tabs::
+         .. tab:: US
 
-         conda create -n llm python=3.9
-         conda activate llm
+            .. code-block:: bash
 
-         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+               conda create -n llm python=3.9
+               conda activate llm
 
-      .. note::
+               pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+            .. note::
+
+               The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
+
+               .. code-block:: bash
+
+                  pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
 
-         The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
+         .. tab:: CN
 
-         .. code-block:: bash
+            .. code-block:: bash
+
+               conda create -n llm python=3.9
+               conda activate llm
+
+               pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+
+            .. note::
+
+               The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
+
+               .. code-block:: bash
+
-         pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+                  pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
 
 .. tab:: PyTorch 2.0
diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
index 0599eb64dbfc..0eb17aa8c584 100644
--- a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
+++ b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
@@ -94,10 +94,29 @@ conda activate llm
 
 ## Install `ipex-llm`
 
-* With the `llm` environment active, use `pip` to install `ipex-llm` for GPU:
-  ```
-  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-  ```
+With the `llm` environment active, use `pip` to install `ipex-llm` for GPU.
+Choose either the US or CN website for `extra-index-url`:
+
+```eval_rst
+.. tabs::
+   .. tab:: US
+
+      .. code-block:: bash
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+   .. tab:: CN
+
+      .. code-block:: bash
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+```
+
+```eval_rst
+.. note::
+
+   If you encounter network issues while installing IPEX, refer to `this guide `_ for troubleshooting advice.
+```
 
 ## Verify Installation
 * You can verify if `ipex-llm` is successfully installed by simply importing a few classes from the library. For example, execute the following import command in the terminal:
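
The verification step at the end of the quickstart checks the install by importing classes from the library. As a hedged illustration of the same idea (not the documented ipex-llm check), a script can first confirm the package resolves without importing it, using the standard-library `importlib.util.find_spec`; this avoids triggering heavy GPU initialization just to test the install:

```python
import importlib.util


def is_installed(pkg: str) -> bool:
    """Return True if `pkg` is importable in the current environment."""
    # find_spec only locates the module on the search path; it does not
    # execute the package, so it is safe even for GPU-backed libraries.
    return importlib.util.find_spec(pkg) is not None


# A stdlib module is always present; a made-up name is not.
print(is_installed("json"))                 # True
print(is_installed("no_such_package_xyz"))  # False
```

If `is_installed("ipex_llm")` returns False after running the pip command above, the environment likely was not active during installation, or the chosen `extra-index-url` mirror was unreachable.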