
Add support for CodeLlama #854

Merged 2 commits from llama_code_support into vllm-project:main on Aug 25, 2023

Conversation

@Yard1 (Collaborator) commented on Aug 24, 2023

Needs huggingface/transformers#25740 to land first

Signed-off-by: Antoni Baum <[email protected]>
@Yard1 changed the title from "Add support for llama code" to "Add support for CodeLlama" on Aug 25, 2023
@Yard1 (Collaborator, Author) commented on Aug 25, 2023

@WoosukKwon @zhuohan123 I think this can be merged now!

@zhuohan123 (Member) left a review comment


LGTM! Thanks for the quick PR; I left a small comment. BTW, does this change our requirement for the version of the transformers library?

Review comment on vllm/model_executor/models/llama.py (outdated, resolved)
@Yard1 (Collaborator, Author) commented on Aug 25, 2023

@zhuohan123 The code is backwards compatible, but to use CodeLlama with its longer context, users will need to install the (as of yet unreleased) transformers==4.33.0.
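
For context, a minimal usage sketch of what this enables, using vLLM's offline `LLM` API. The checkpoint name `codellama/CodeLlama-7b-hf` and the sampling settings are illustrative assumptions and do not come from this PR; the longer-context caveat above (transformers==4.33.0) still applies.

```python
# Minimal sketch: serve a CodeLlama checkpoint with vLLM once this PR is in.
# Assumes a Hugging Face-format CodeLlama checkpoint; the model id below is
# an example, not something specified in this PR.
#   pip install vllm "transformers==4.33.0"

from vllm import LLM, SamplingParams

llm = LLM(model="codellama/CodeLlama-7b-hf")  # assumed model id

params = SamplingParams(temperature=0.2, max_tokens=128)
outputs = llm.generate(["def fibonacci(n):"], params)

# Each RequestOutput holds the generated completions for one prompt.
print(outputs[0].outputs[0].text)
```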

@zhuohan123 (Member) commented:
Sounds good! Let me merge this PR first and we can bump up the requirements once transformers releases a new version.

@zhuohan123 merged commit 4b6f069 into vllm-project:main on Aug 25, 2023 (2 checks passed)
@Yard1 deleted the llama_code_support branch on August 25, 2023 at 19:49
randxie pushed a commit to randxie/vllm that referenced this pull request Aug 29, 2023
liuyanyi pushed a commit to liuyanyi/vllm that referenced this pull request Sep 12, 2023
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
sjchoi1 pushed a commit to casys-kaist-internal/vllm that referenced this pull request May 7, 2024
Labels: none yet · Projects: none yet · 2 participants