Question about UQFF #836

Open
schnapper79 opened this issue Oct 8, 2024 · 5 comments

Comments

@schnapper79

I used your example to create a UQFF file from Llama-3.2-90B.

Now I am really struggling to load the model from this UQFF file. I try to use --from-uqff, but without adding -m and a model it won't run. And if I do add -m with a model, it loads that model and its safetensors files instead. So what's the trick?

@EricLBuehler
Owner

Hi @schnapper79! Yes, you need the -a and -m to load this model. Could you please paste the command if there is still an issue?
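
For reference, an invocation along these lines combines the flags mentioned in this thread; the model ID, architecture value, and UQFF filename below are illustrative placeholders, not a verified command:

cargo run --features cuda -- -i vision-plain -a vllama -m <HF-repo-or-local-path> --from-uqff <quantized-model>.uqff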

@gfxenjoyer

Trying to run mistralrs-server on Windows with

cargo run --features cuda -- -i vision-plain -m EricB/Llama-3.2-11B-Vision-Instruct-UQFF --from-uqff llam3.2-vision-instruct-q4k.uqff

results in Error: Expected file with extension one of .safetensors, .pth, .pt, .bin. I even tried adding -a vllama and still got an error.

@gfxenjoyer

Ignore the comment above. I was able to get it running with the latest commits. I did notice that residual.safetensors was downloaded before the UQFF model.
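
For anyone hitting the same error, a minimal sketch of the update-and-retry steps described above; the second line simply reuses the flags from the earlier comment, and the UQFF filename is a placeholder:

git pull    (update mistral.rs to the latest commits, then rebuild/rerun)
cargo run --features cuda -- -i vision-plain -m EricB/Llama-3.2-11B-Vision-Instruct-UQFF --from-uqff <file>.uqff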

@Jonghwan-hong

@gfxenjoyer I have the same issue. How can I solve it? Can you share the solution?

@EricLBuehler
Owner

@Jonghwan-hong can you please share the command you ran?
