Unable to load models with adapter weights in offline mode #31700
Comments
🫠 sounds like kwargs getting lost maybe?
It's being triggered here in the PEFT library, cc @BenjaminBossan. Essentially,
Thanks for flagging this — indeed, this breaks with offline mode. @Wauplin do you have a suggestion how we can correctly check if the file has already been locally cached?
cc @Wauplin if you have the bw :)
Easiest way to do that is to use … And sorry I missed this notification 🙈
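The inline code reference in the comment above was lost in extraction. A hedged sketch of a purely local cache check, assuming the helper meant is huggingface_hub's `try_to_load_from_cache` (my guess — the original reference did not survive):

```python
from huggingface_hub import try_to_load_from_cache

# Purely local lookup: this never hits the network, so it is safe to call
# in offline mode. The repo and filename below are illustrative.
filepath = try_to_load_from_cache(
    repo_id="haoranxu/ALMA-13B-Pretrain",
    filename="adapter_config.json",
)
if isinstance(filepath, str):
    print(f"cached locally at {filepath}")
elif filepath is None:
    print("not cached; downloading would require network access")
else:
    # the _CACHED_NO_EXIST sentinel: the Hub previously answered that this
    # file does not exist in the repo, and that answer was cached
    print("cached as known-non-existent")
```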
I worked on a fix: huggingface/peft#1976. It resolves the issue for me, but I had trouble unit testing it, as dynamically setting offline mode in the unit test seems to have no effect :( I think it would still be okay to merge the fix without a test, but if anyone has an idea how to test it correctly, please LMK.
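For context on why flipping offline mode inside a test can have no effect, here is a dependency-free sketch of the underlying pattern (the names are mine, not huggingface_hub's): libraries often snapshot environment variables into module-level constants at import time, so mutating `os.environ` afterwards changes nothing.

```python
import os

def _read_offline_flag() -> bool:
    # stands in for a library reading its env var during import
    return os.environ.get("DEMO_OFFLINE", "0") == "1"

# evaluated once, at "import time"
OFFLINE = _read_offline_flag()

# a unit test that flips the env var afterwards sees no change:
os.environ["DEMO_OFFLINE"] = "1"
print(OFFLINE)  # the snapshot was taken before the change
```

If this is what is happening, the usual workaround is to monkeypatch the already-imported module attribute itself rather than the environment variable.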
System Info

transformers version: 4.42.0.dev0

Who can help?
Probably me @amyeroberts or @ArthurZucker.
PEFT weight loading code was originally added by @younesbelkada
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
Unable to load models in offline mode, even when the adapter weights are cached locally.
This model uses haoranxu/ALMA-13B-Pretrain as adapter weights. If you first load the model so that the model and adapter weights are available in the cache, and then re-run in offline mode, the following error occurs:
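A sketch of the reproduction steps (the loading script name is a placeholder; only haoranxu/ALMA-13B-Pretrain is named in the issue):

```shell
# 1) Run once with network access so the base model and adapter weights
#    land in the local Hugging Face cache (load_model.py stands in for
#    whatever script calls from_pretrained on the adapter model):
python load_model.py

# 2) Re-run fully offline; before the fix this failed even though every
#    required file was already cached:
HF_HUB_OFFLINE=1 python load_model.py
```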
Expected behavior
Can load the model in online and offline mode