Hardware Backend Deprecation Policy #8932
See https://dev-discuss.pytorch.org/t/pytorch-2-5-rc1-is-produced-for-pytorch-audio-vision/2460 for the PyTorch release schedule.
vLLM will try to upgrade to PyTorch 2.5 first, and we will leave 1–2 weeks for hardware vendors to catch up.
OpenVINO has 2.1.2 as a lower-bound version (vllm/requirements-openvino.txt, line 5 in 6c9ba48), which means any newer version can also work. We just rely on the PyTorch versions supported by the HF libraries themselves (e.g. transformers, tokenizers, optimum, etc.).
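The point about a lower bound admitting any newer release can be illustrated with a small sketch using the `packaging` library (the version numbers mirror the OpenVINO pin discussed above; the snippet itself is illustrative, not vLLM code):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# A lower-bound pin like the one in requirements-openvino.txt:
# any release at or above the bound satisfies it.
lower_bound = SpecifierSet(">=2.1.2")

assert Version("2.1.2") in lower_bound   # the bound itself is allowed
assert Version("2.5.0") in lower_bound   # newer versions also work
assert Version("2.0.1") not in lower_bound  # older versions are rejected
```

This is why a backend pinned only with a lower bound can follow vLLM to a newer PyTorch without any requirements change, whereas an exact pin (`==`) would block the upgrade.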
@ilya-lavrenov good to know. Can you change OpenVINO to use the same PyTorch version (currently 2.4) as the default case?
Re: "intel xpu (2.3.1)": I have almost no context on the vLLM or PyTorch specifics here. But my understanding is that, starting with PyTorch v2.5 (and becoming more complete in some gap areas in v2.6), PyTorch supports Intel XPU devices natively, including both data center models (I think some were supported in v2.4) and client Arc / Flex series GPUs, without (AFAICT) depending on the IPEX (Intel Extension for PyTorch) based XPU support. So I don't know of, or expect, anything that is supported worse in PyTorch v2.5 than in v2.3.1 for Intel XPU, except that I do not know what utility, if any, the IPEX-based extensions still offer for XPU in v2.5+, since they may no longer be the relevant or necessary provider of XPU-optimized support starting in v2.5.
@youkaichao, why 2.5 and not 2.4?
We need to leave some time for hardware vendors to catch up.
@youkaichao @ghchris2021 CC @tye1 thanks, -yuan |
After discussing with hardware vendors, the final process would be:
Sure, please have a look at #9121.
Defining this "when": it should be the first vLLM release that ships with the newest PyTorch version.
Anything you want to discuss about vllm.
vLLM heavily depends on PyTorch, and also actively works with PyTorch team to leverage their new features. When a new PyTorch version comes out, vLLM usually upgrades to the latest PyTorch directly.
Meanwhile, vLLM supports diverse hardware backends from different vendors. They often require their own PyTorch versions.
To speed up the development of vLLM, we hereby require all hardware vendors to keep up with PyTorch.
Starting from PyTorch 2.5 (release day: 10/17/24), vLLM will drop support for any hardware backend that cannot support PyTorch 2.5.
Potentially affected vendors, and the current PyTorch versions they require:
Note that supporting the latest PyTorch is a necessary condition for vLLM's hardware vendors, not a sufficient one. The vLLM team considers adding new hardware support based on community interest, the priorities of the main branch, and the bandwidth of the team.
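The policy above amounts to a simple version-floor check: a backend stays supported only if its required PyTorch version meets vLLM's current floor. A minimal sketch (hypothetical, not vLLM's actual code; the function name and the 2.5.0 floor are assumptions matching this issue):

```python
from packaging.version import Version

# Hypothetical floor matching the policy stated above:
# vLLM moves to PyTorch 2.5, so a backend pinned below it is dropped.
MIN_TORCH = Version("2.5.0")

def backend_supported(backend_torch_version: str) -> bool:
    """Return True if a backend's PyTorch requirement meets vLLM's floor."""
    return Version(backend_torch_version) >= MIN_TORCH
```

For example, a backend still pinned to 2.3.1 would fail this check, while one that tracks the default 2.5 pin would pass.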