From ed3c0614823b753c437b02c4dd33e4e9580ce5c8 Mon Sep 17 00:00:00 2001
From: Cheen Hau <33478814+chtanch@users.noreply.github.com>
Date: Wed, 27 Mar 2024 18:12:55 +0800
Subject: [PATCH] Add links for ipex US and CN servers

---
 .../source/doc/LLM/Overview/FAQ/faq.md        |  2 +-
 .../source/doc/LLM/Overview/install_gpu.md    | 83 +++++++++++++++----
 .../doc/LLM/Quickstart/bigdl_llm_migration.md |  1 +
 .../doc/LLM/Quickstart/install_linux_gpu.md   | 27 +++++-
 4 files changed, 91 insertions(+), 22 deletions(-)

diff --git a/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md b/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
index 5377276f330..c86d109ed12 100644
--- a/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
+++ b/docs/readthedocs/source/doc/LLM/Overview/FAQ/faq.md
@@ -9,7 +9,7 @@ Please also refer to [here](https://github.com/intel-analytics/ipex-llm?tab=read
 
 ## How to Resolve Errors
 
-### Fail to install `ipex-llm` through `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/`
+### Fail to install `ipex-llm` through `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/` or `pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/`
 
 You could try to install IPEX-LLM dependencies for Intel XPU from source archives:
 - For Windows system, refer to [here](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Overview/install_gpu.html#install-ipex-llm-from-wheel) for the steps.
diff --git a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
index 63b76886bb2..5cb05378202 100644
--- a/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
+++ b/docs/readthedocs/source/doc/LLM/Overview/install_gpu.md
@@ -50,13 +50,28 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
 ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices.
 ```
-The easiest ways to install `ipex-llm` is the following commands:
+The easiest way to install `ipex-llm` is with the following commands,
+choosing either the US or CN website for `extra-index-url`:
-```
-conda create -n llm python=3.9 libuv
-conda activate llm
+```eval_rst
+.. tabs::
+   .. tab:: US
+
+      .. code-block:: cmd
+
+         conda create -n llm python=3.9 libuv
+         conda activate llm
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+   .. tab:: CN
+
+      .. code-block:: cmd
+
+         conda create -n llm python=3.9 libuv
+         conda activate llm
-pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
 ```
 
 ### Install IPEX-LLM From Wheel
@@ -387,31 +402,65 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
 ```eval_rst
 .. tabs::
    .. tab:: PyTorch 2.1
+      Choose either the US or CN website for ``extra-index-url``:
-      .. code-block:: bash
+      .. tabs::
+         .. tab:: US
-         conda create -n llm python=3.9
-         conda activate llm
+            .. code-block:: bash
-         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+               conda create -n llm python=3.9
+               conda activate llm
-      .. note::
+               pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+            .. note::
-         The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
+               The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
-         .. code-block:: bash
+               .. code-block:: bash
+
+                  pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+         .. tab:: CN
+
+            .. code-block:: bash
+
+               conda create -n llm python=3.9
+               conda activate llm
+
+               pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+
+            .. note::
+
+               The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
+
+               .. code-block:: bash
-            pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+                  pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
    .. tab:: PyTorch 2.0
+      Choose either the US or CN website for ``extra-index-url``:
-      .. code-block:: bash
+      .. tabs::
+         .. tab:: US
-         conda create -n llm python=3.9
-         conda activate llm
+            .. code-block:: bash
+
+               conda create -n llm python=3.9
+               conda activate llm
+
+               pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+         .. tab:: CN
+
+            .. code-block:: bash
+
+               conda create -n llm python=3.9
+               conda activate llm
-         pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+               pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
 ```
diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/bigdl_llm_migration.md b/docs/readthedocs/source/doc/LLM/Quickstart/bigdl_llm_migration.md
index 5ccdd4570c4..420fc9f6389 100644
--- a/docs/readthedocs/source/doc/LLM/Quickstart/bigdl_llm_migration.md
+++ b/docs/readthedocs/source/doc/LLM/Quickstart/bigdl_llm_migration.md
@@ -18,6 +18,7 @@ pip install --pre --upgrade ipex-llm[all] # for cpu
 ```
 
 ### For GPU
+Choose either the US or CN website for `extra-index-url`:
 
 ```eval_rst
 .. tabs::
diff --git a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
index 0599eb64dbf..6f2a0103b53 100644
--- a/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
+++ b/docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md
@@ -94,10 +94,29 @@ conda activate llm
 
 ## Install `ipex-llm`
 
-* With the `llm` environment active, use `pip` to install `ipex-llm` for GPU:
-  ```
-  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-  ```
+With the `llm` environment active, use `pip` to install `ipex-llm` for GPU.
+Choose either the US or CN website for `extra-index-url`:
+
+```eval_rst
+.. tabs::
+   .. tab:: US
+
+      .. code-block:: cmd
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+
+   .. tab:: CN
+
+      .. code-block:: cmd
+
+         pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+```
+
+```eval_rst
+.. note::
+
+   If you encounter network issues while installing IPEX, refer to `this guide `_ for troubleshooting advice.
+```
 
 ## Verify Installation
 * You can verify if `ipex-llm` is successfully installed by simply importing a few classes from the library. For example, execute the following import command in the terminal:
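The verification step referenced in the trailing context above can be sketched as a small pre-check script. This is a minimal sketch, not part of the patch: `check_install` is a hypothetical helper name, and it assumes the `ipex-llm` distribution is imported as `ipex_llm`, as in the quickstart docs this patch edits.

```python
import importlib.util


def check_install(module_name: str) -> bool:
    """Return True if `module_name` can be found on the current Python path."""
    return importlib.util.find_spec(module_name) is not None


if __name__ == "__main__":
    # After `pip install --pre --upgrade ipex-llm[xpu] ...` succeeds,
    # the package should be importable under the name `ipex_llm`.
    if check_install("ipex_llm"):
        print("ipex-llm found; import its classes to verify fully")
    else:
        print("ipex-llm not found; re-run the pip install command above")
```

Running this in the `llm` conda environment gives a quick pass/fail signal before attempting the heavier class imports the docs describe.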