
v0.5.0 Release Tracker #5224

Closed · 2 tasks
simon-mo opened this issue Jun 3, 2024 · 9 comments

Comments

simon-mo (Collaborator) commented Jun 3, 2024

ETA Monday 06/10.

This is a minor version bump because we expect the following major features to informally enter beta (a sketch of enabling them follows the list):

  • Chunked Prefill
  • Speculative Decode
  • FP8
  • VLM
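
For reference, a minimal sketch of how these features are typically toggled through vLLM's offline Python API. The flag names below reflect the engine arguments around this release and are not taken from this thread, so treat them as assumptions; some of these features did not necessarily compose with each other at the time.

```python
from vllm import LLM, SamplingParams

# Chunked prefill + FP8 KV cache (enable one feature at a time if they conflict).
llm = LLM(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # any supported model
    enable_chunked_prefill=True,  # Chunked Prefill
    kv_cache_dtype="fp8",         # FP8 KV cache; weight quantization uses quantization="fp8"
)

# Speculative decoding pairs the target model with a small draft model, e.g.:
# llm = LLM(
#     model="meta-llama/Meta-Llama-3-8B-Instruct",
#     speculative_model="JackFram/llama-68m",  # hypothetical draft-model choice
#     num_speculative_tokens=5,
# )

outputs = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```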

Blockers:

  • [ ]

Optional:

simon-mo added the misc label Jun 3, 2024
simon-mo mentioned this issue Jun 3, 2024
simon-mo removed the misc label Jun 3, 2024
simon-mo pinned this issue Jun 3, 2024
DarkLight1337 (Member) commented Jun 5, 2024

Until I complete the chain of PRs related to #5214, IMO the dev API for VLMs is still too unstable to be considered "beta".

On the other hand, the user API should be stable once #5237 is merged.
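
As a rough illustration of where the user API is heading, here is a sketch of VLM usage as it later converged in vLLM. At the time of this thread the exact interface was still in flux, so the field names below are assumptions rather than the API as merged in #5237.

```python
from vllm import LLM
from PIL import Image

# LLaVA is the reference VLM supported in vLLM at this point.
llm = LLM(model="llava-hf/llava-1.5-7b-hf")

image = Image.open("example.jpg")  # hypothetical local image
outputs = llm.generate({
    "prompt": "USER: <image>\nWhat is in this picture? ASSISTANT:",
    "multi_modal_data": {"image": image},
})
print(outputs[0].outputs[0].text)
```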

sasha0552 (Contributor) commented

Hi, is it possible to include the following PRs?

chemistry-rocks commented
Yes, #4409 please.

flexorRegev commented
If you can include #5164 in the release, that would be great.

DarkLight1337 (Member) commented Jun 11, 2024

Major PRs for supporting VLMs:

It would be ideal if we could add support for at least two more VLMs besides LLaVA.

w013nad commented Jun 11, 2024

What's going on with the release? It shows up under Releases, but it isn't marked as the latest release, and the vllm/vllm-openai Docker image hasn't been updated either.

simon-mo (Collaborator, Author) commented
It has not been released yet. The pre-release notes on GitHub are generated as part of the build process. Once the release is out on PyPI and Docker, we will close this issue and mark the release as latest on GitHub.

cyc00518 commented
@simon-mo Sorry for tagging you. Would you mind telling me whether vLLM supports aarch64 now?

simon-mo (Collaborator, Author) commented
No. Contributions welcome.
