[Installation]: cannot install vllm with openvino backend #9092

Closed
1 task done
guanxiang opened this issue Oct 5, 2024 · 1 comment · Fixed by #9121
Labels: installation (Installation problems)

Comments

@guanxiang

Your current environment

ERROR: Cannot install openvino-tokenizers[transformers]==2024.4.0.0, optimum-intel and vllm==0.6.3.dev100+g15986f59.openvino because these package versions have conflicting dependencies.

The conflict is caused by:
    vllm 0.6.3.dev100+g15986f59.openvino depends on transformers>=4.45.0
    openvino-tokenizers[transformers] 2024.4.0.0 depends on transformers>=4.36.0; extra == "transformers"
    optimum-intel 1.19.0 depends on transformers<4.45 and >=4.36

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
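
For context, the three pins above cannot all be satisfied at once: vllm requires transformers>=4.45.0 while optimum-intel 1.19.0 caps it at <4.45, so the intersection of the ranges is empty. A minimal sketch to confirm this locally, assuming the packaging library is importable (it usually is wherever pip is installed):

# Hedged sketch: intersect the three transformers pins from the resolver error
# and show that no release satisfies all of them (assumes "packaging" is available).
python - <<'EOF'
from packaging.specifiers import SpecifierSet

combined = (
    SpecifierSet(">=4.45.0")        # vllm 0.6.3.dev100+g15986f59.openvino
    & SpecifierSet(">=4.36.0")      # openvino-tokenizers[transformers] 2024.4.0.0
    & SpecifierSet("<4.45,>=4.36")  # optimum-intel 1.19.0
)
print(combined)  # contains both >=4.45.0 and <4.45
print(list(combined.filter(["4.36.0", "4.44.2", "4.45.0", "4.45.2"])))  # -> []
EOF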

How you are installing vllm

git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -r requirements-build.txt --extra-index-url https://download.pytorch.org/whl/cpu
PIP_EXTRA_INDEX_URL="https://download.pytorch.org/whl/cpu" VLLM_TARGET_DEVICE=openvino python -m pip install -v .
These follow the official steps at https://docs.vllm.ai/en/latest/getting_started/openvino-installation.html
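
Once a compatible set resolves, a quick way to confirm which transformers release pip actually installed (plain pip/Python commands, nothing vLLM-specific):

# Verify the transformers version that the resolver picked.
python -c "import transformers; print(transformers.__version__)"
pip show transformers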

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@DarkLight1337
Member

Can you test out #9121 and see if it enables you to install the latest vLLM?
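
One way to try that change before it is merged is to build from the PR's head ref; a minimal sketch, assuming the standard GitHub pull/<N>/head refs and the same CPU extra index as above (the local branch name pr-9121 is just illustrative):

# Hedged sketch: build vLLM from the unmerged PR #9121 to test the fix.
git clone https://github.com/vllm-project/vllm.git
cd vllm
git fetch origin pull/9121/head:pr-9121
git checkout pr-9121
pip install -r requirements-build.txt --extra-index-url https://download.pytorch.org/whl/cpu
PIP_EXTRA_INDEX_URL="https://download.pytorch.org/whl/cpu" VLLM_TARGET_DEVICE=openvino python -m pip install -v .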
