Unable to load LoRA fine-tuned LLM from HF (AssertionError) #3404
Comments
I also encountered the same error. This happens because (see #2816) the adapter checkpoint contains tensors whose names do not include 'lora', which the vLLM loader does not expect.
Thanks @sagar-deepscribe! In my case, I need new special tokens, but this is good stuff for me to learn.
The adapter file is usually named `adapter_model.safetensors`.
Thanks, @sagar-deepscribe, that was helpful. Here is your code, slightly edited and with copy-and-paste instructions to run:
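A minimal sketch of such a cleanup script, assuming the fix (per #2816) is to drop the tensors without 'lora' in their names from `adapter_model.safetensors` before handing the adapter to vLLM; the paths below are placeholders:

```python
# clean_adapter.py - hypothetical helper script, not part of vLLM or PEFT.
# Drops every tensor whose name does not contain "lora" (e.g. the
# base_model.model.lm_head.base_layer.weight entries) so that vLLM's
# LoRA loader no longer hits the AssertionError.
import os

from safetensors.torch import load_file, save_file

SRC = "my_adapter/adapter_model.safetensors"        # original adapter (placeholder)
DST = "my_adapter_clean/adapter_model.safetensors"  # cleaned copy (placeholder)

tensors = load_file(SRC)
kept = {name: t for name, t in tensors.items() if "lora" in name}
dropped = sorted(set(tensors) - set(kept))

print(f"Keeping {len(kept)} tensors, dropping {len(dropped)}: {dropped}")

os.makedirs(os.path.dirname(DST), exist_ok=True)
save_file(kept, DST)
```

Afterwards, copy `adapter_config.json` into the cleaned directory and point the `LoRARequest` there. Note that if the fine-tune relied on resized embeddings (new special tokens), dropping those tensors may change the model's behavior.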
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
Following the docs about Using LoRA Adapters, I am hitting an AssertionError. My code:
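A minimal sketch of the pattern from the Using LoRA Adapters docs, with a placeholder base model and adapter path:

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Placeholder base model - substitute your own fine-tuned setup.
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)

outputs = llm.generate(
    ["Hello, my name is"],
    SamplingParams(temperature=0.0, max_tokens=64),
    # LoRARequest(adapter name, unique integer id, local adapter directory)
    lora_request=LoRARequest("my_adapter", 1, "my_adapter/"),
)
print(outputs[0].outputs[0].text)
```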
The error:
By comparing with the model used in the aforementioned documentation, I realized my model is "exporting" a couple of tensors (found in the `adapter_model.safetensors` file) that the vLLM code does not expect to be there, namely `base_model.model.lm_head.base_layer.weight` and `base_model.model.model.embed_tokens.base_layer.weight`. This code will crash if `weight`-named tensors do not "come" from LoRA (judging by the tensor name). In the model used for the documentation, all tensors contain 'lora' in their names.
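A quick way to list the offending tensors, assuming the adapter directory used above:

```python
from safetensors.torch import load_file

# Print every tensor name in the adapter and flag the non-LoRA ones
# (these are the names that trip vLLM's assertion).
tensors = load_file("my_adapter/adapter_model.safetensors")
for name in sorted(tensors):
    flag = "" if "lora" in name else "   <-- not a LoRA tensor"
    print(f"{name}{flag}")
```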
I am pretty new to this and followed this fine-tuning guide.
The question is: how can I "fix" this issue? Is the problem related to the fine-tuning guide, maybe because the `LoRAConfig` is not correct or because of the way the model is persisted? Or is it instead related to vLLM? Thanks!