[LLAVA-NEXT] Warning: The model weights are not tied. Please use the tie_weights method before using the infer_auto_device function. #30001
Comments
Hi @aliencaocao, thanks for opening this issue! I'm unable to replicate this error when running the snippet. The checkpoint in the example, panoyo9829/llava-v1.6-mistral-7b-bnb-4bit, doesn't appear to be compatible with a model in the transformers library: it maps to LlavaMistralForCausalLM, which isn't defined in transformers. When running the snippet, the warning I get is about weights in the model being randomly initialized.
I can reproduce this with the original untouched fp16 model as long as I use device_map:

```python
from transformers import LlavaNextForConditionalGeneration

model = LlavaNextForConditionalGeneration.from_pretrained(
    'llava-hf/llava-v1.6-mistral-7b-hf',
    low_cpu_mem_usage=True,
    device_map='auto',
)
```

Note that the notebook cannot be executed, as it runs out of RAM when loading the fp16 weights. Nonetheless, the warning still appears.
Thanks for sharing the details. The error indicates that check_tied_parameters_in_config(model) is evaluating as True for this model. @muellerzr Any suggestion on how to handle this? The warning suggests calling tie_weights.
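For context, a minimal sketch of the condition behind the warning, assuming these helpers are importable from accelerate.utils as in recent accelerate releases (the exact internals may differ by version):

```python
# The warning fires when the config claims tied parameters but none are
# actually tied on the module graph. Both helpers below are accelerate
# utilities; note that loading the full model here is memory-heavy
# (see the OOM note above).
from accelerate.utils import check_tied_parameters_in_config, find_tied_parameters
from transformers import LlavaNextForConditionalGeneration

model = LlavaNextForConditionalGeneration.from_pretrained(
    'llava-hf/llava-v1.6-mistral-7b-hf', low_cpu_mem_usage=True
)
print(check_tied_parameters_in_config(model))  # True: the config claims tying
print(find_tied_parameters(model))             # []: nothing is actually tied
```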
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
The issue is still there.
@SunMarc if you have a second to check this :)
Hi @aliencaocao, the above PR should fix this. This happens because a config attribute was not set correctly, but there was no impact apart from triggering the warning message.
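To illustrate the kind of mismatch described here, a sketch that only inspects the config; which attribute was actually involved is an assumption, though tie_word_embeddings is the usual field behind this warning:

```python
# Hedged illustration: for a composite model like LLaVA-NeXT, the
# top-level config and the text backbone's config can disagree about
# weight tying, which makes the config-level check return True even
# though no parameters are actually tied.
from transformers import AutoConfig

config = AutoConfig.from_pretrained('llava-hf/llava-v1.6-mistral-7b-hf')
print(config.tie_word_embeddings)              # top-level composite config
print(config.text_config.tie_word_embeddings)  # Mistral text backbone
```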
Glad to have that confirmation. Thanks for the fix! Would not have noticed this myself, as I also checked the source code and saw the …
System Info

transformers version: 4.40.0.dev0

Who can help?

@amyeroberts

Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
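The original code snippet did not survive extraction; the following is a minimal sketch of a reproduction consistent with the rest of the thread, so treat the exact checkpoint and arguments as assumptions:

```python
# Hedged reconstruction of the reproduction, based on the checkpoint and
# arguments quoted in the comments; the issue's original snippet may have
# differed (it reportedly used panoyo9829/llava-v1.6-mistral-7b-bnb-4bit).
from transformers import LlavaNextForConditionalGeneration

model = LlavaNextForConditionalGeneration.from_pretrained(
    'llava-hf/llava-v1.6-mistral-7b-hf',
    low_cpu_mem_usage=True,
    device_map='auto',  # the warning only appears when a device_map is used
)
```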
Running this will yield a

The model weights are not tied. Please use the tie_weights method before using the infer_auto_device function.

warning. Inference results seem correct, but I'm not sure whether the warning should still be paid attention to.

Expected behavior
No warning