v0.5.0 Release Tracker #5224
Comments
Hi, is it possible to include the following PRs?
Yes, #4409 please.
If you could include #5164 in the release, that would be great.
Major PRs for supporting VLMs:
It would be ideal if we could add at least two more VLMs beyond LLaVA.
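For anyone wanting to try the existing LLaVA support, here is a minimal sketch along the lines of vLLM's multimodal interface in more recent releases. The exact API has shifted across versions, so treat the model id, prompt template, and the `multi_modal_data` field as assumptions to verify against the version you install:

```python
# Minimal sketch of running a VLM (LLaVA) with vLLM's offline API.
# The multi_modal_data interface is from newer vLLM releases than the
# one tracked in this issue; check the docs for your installed version.
from PIL import Image
from vllm import LLM, SamplingParams

llm = LLM(model="llava-hf/llava-1.5-7b-hf")  # assumed model id

image = Image.open("example.jpg")  # any local RGB image
prompt = "USER: <image>\nWhat is shown in this image? ASSISTANT:"

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": image}},
    SamplingParams(max_tokens=64),
)
print(outputs[0].outputs[0].text)
```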
What's going on with the release? It shows up under Releases, but it isn't marked as the latest release, and the vllm/vllm-openai Docker image also hasn't been updated?
It has not been released yet. The pre-release note on GitHub is generated as part of the build process. Once the release is out on PyPI and Docker, we will close this issue and mark the release as latest on GitHub.
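Once the PyPI release does land, a quick way to confirm what you actually have installed (a minimal sketch; `vllm.__version__` is assumed to be exposed, as in other recent releases):

```python
# Check the locally installed vLLM version,
# e.g. after `pip install --upgrade vllm`.
import vllm

print(vllm.__version__)  # expect "0.5.0" once the PyPI release is out
```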
@simon-mo Sorry for tagging you. Would you mind telling me whether vLLM supports aarch64 now or not?
No. Contributions welcome.
ETA Monday 06/10.
This is a minor version bump because we expect the following major features to informally enter the beta stage:
Blockers:
Optional:
FSM to Guide #4109
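For context on the FSM-to-Guide item: guided decoding in the outlines style walks a finite-state machine and masks out any token that would leave the accepted language. Below is a small, self-contained conceptual sketch of that idea; all class and method names are illustrative, not vLLM's or outlines' actual API:

```python
# Conceptual sketch of FSM/Guide-style constrained decoding.
# All names here are illustrative, not the real vLLM/outlines interface.

class ToyGuide:
    """Tracks an FSM state and reports which tokens are allowed next."""

    def __init__(self, transitions: dict[tuple[int, str], int], finals: set[int]):
        self.transitions = transitions  # (state, token) -> next state
        self.finals = finals            # accepting states

    def allowed_tokens(self, state: int) -> list[str]:
        # Only tokens with a defined transition keep the output valid.
        return [tok for (s, tok) in self.transitions if s == state]

    def next_state(self, state: int, token: str) -> int:
        return self.transitions[(state, token)]


# FSM accepting exactly "yes" or "no", one character per "token".
guide = ToyGuide(
    transitions={(0, "y"): 1, (1, "e"): 2, (2, "s"): 3, (0, "n"): 4, (4, "o"): 3},
    finals={3},
)

state = 0
while state not in guide.finals:
    allowed = guide.allowed_tokens(state)
    token = allowed[0]  # a real sampler would mask logits and sample here
    state = guide.next_state(state, token)
    print(f"sampled {token!r}, state -> {state}")
```

The point of the interface split is that the decoder only ever asks two questions per step, "what is allowed from this state" and "where does this token lead", which is what a Guide-style abstraction exposes.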