
Bump exllamav2 from 0.0.8 to 0.0.10 & Fix code change #4782

Merged: 6 commits, Dec 5, 2023

Conversation

yhyu13 (Contributor) commented Dec 1, 2023

Bump exllamav2 to 0.0.10, since it now supports GPTQ 4-bit models without tokenizer.model:

https://huggingface.co/TheBloke/deepseek-llm-67b-chat-GPTQ/discussions/1

Can confirm TheBloke's deepseek-llm GPTQ works with exllamav2 0.0.10.
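For reference, a minimal loading sketch against the exllamav2 0.0.x API, assuming the GPTQ weights have been downloaded to a local directory containing only tokenizer.json (no sentencepiece tokenizer.model); the model path and prompt below are placeholders, not taken from this PR:

```python
# Minimal sketch: load a GPTQ model with exllamav2 and generate.
# The model directory is a hypothetical local path.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "models/deepseek-llm-67b-chat-GPTQ"  # placeholder path
config.prepare()

model = ExLlamaV2(config)
model.load()

# As of 0.0.10, the tokenizer can be built from tokenizer.json alone;
# a tokenizer.model file is no longer required.
tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

print(generator.generate_simple("Hello, my name is", settings, 32))
```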


waters222 (Contributor) commented:

Found out you did the same thing after I created my PR. Sad pepe.

oobabooga changed the base branch from main to dev on December 5, 2023 at 00:09
oobabooga (Owner) commented:

Thank you for the fix and the update.

oobabooga merged commit ac9f154 into oobabooga:dev on Dec 5, 2023