[New Model]: Jamba (MoE Mamba from AI21) #3690
Labels
new model (requests for new models)
Comments
pretty please
Plsss
Jamba is a really interesting model. It's an MoE+Transformer+Mamba hybrid, so I'm not sure how that would work with vLLM. I'd love to add support for it. Thoughts / pointers? @WoosukKwon @zhuohan123
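For context, a minimal sketch of the kind of layer interleaving AI21's announcement describes: mostly Mamba layers with an occasional attention layer, and MoE feed-forward blocks on a subset of layers. The ratios and layer count below are illustrative assumptions, not values taken from the released checkpoint or from any vLLM code.

```python
# Illustrative sketch of Jamba-style hybrid layer stacking.
# ATTN_EVERY / MOE_EVERY / NUM_LAYERS are assumptions for illustration only.

ATTN_EVERY = 8   # assume one attention layer per 8-layer block
MOE_EVERY = 2    # assume MoE feed-forward on every other layer
NUM_LAYERS = 32


def layer_plan(num_layers: int) -> list[str]:
    """Describe the mixer (attention vs. Mamba) and FFN (MoE vs. dense) per layer."""
    plan = []
    for i in range(num_layers):
        mixer = "attention" if i % ATTN_EVERY == ATTN_EVERY - 1 else "mamba"
        ffn = "moe" if i % MOE_EVERY == 1 else "dense"
        plan.append(f"layer {i:2d}: {mixer:9s} + {ffn}")
    return plan


if __name__ == "__main__":
    print("\n".join(layer_plan(NUM_LAYERS)))
    # Serving implication: only the attention layers need a KV cache, while the
    # Mamba layers carry a small fixed-size SSM state per sequence, which is why
    # this hybrid does not slot directly into a KV-cache-centric memory manager.
```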
Closed as completed by #4115 (merged).
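With that PR merged, loading Jamba should go through vLLM's standard offline entrypoint. A minimal sketch, assuming the usual `LLM` / `SamplingParams` API and enough GPU memory for the checkpoint; the prompt and sampling settings are illustrative:

```python
# Sketch of serving Jamba through vLLM's offline API once support is available.
from vllm import LLM, SamplingParams

llm = LLM(model="ai21labs/Jamba-v0.1")  # large checkpoint; needs substantial GPU memory
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(["Jamba is a hybrid architecture that"], params)
for out in outputs:
    print(out.outputs[0].text)
```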
Issue description
The model to consider.
https://huggingface.co/ai21labs/Jamba-v0.1
The closest model vllm already supports.
No response
What's your difficulty of supporting the model you want?
Jamba is a new and interesting hybrid-architecture MoE Mamba model that looks very promising.
https://www.ai21.com/blog/announcing-jamba