PEFT: Access active_adapters as a property in Trainer #30790
Conversation
Thanks a lot for fixing!
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Thanks for the quick fix!
Thanks :) now I can stop hitting my head on the table.
I thought this was the solution to my similar error: `File "/home/jimmy/pyvenv/310/lib/python3.10/site-packages/transformers/trainer.py", line 2694, in _load_best_model`. I am trying to run this example: Thanks huggingface for these libraries!
Hey @speculaas, which versions are you using?
Thanks for your help! My peft and transformers versions are:
Thanks for reporting the problem. The issue stems from the fact that with prompt learning, the base model is a plain transformers model, on which `active_adapters` is a method rather than a property, and calling it fails because the base model (usually) has no loaded adapter. Anyway, this is an issue that should be fixed on the PEFT side. I'll work on a PR.
Fixes the error reported here: huggingface/transformers#30790 (comment) Unfortunately, transformers models have an active_adapters method but it's 1) not a property and 2) calling it fails because the base model (usually) has no loaded adapter. The base model can be a transformers model for prompt learning, where the base model is not wrapped in a LoraModel or similar. Therefore, this special case needs to be handled separately.
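The distinction the PEFT fix describes can be sketched in plain Python. This is a hypothetical illustration (the class names and helper below are made up, not the actual PEFT code): on one object `active_adapters` is a property, on another it is a regular method, and a caller can support both by checking whether the attribute access already yielded the value.

```python
# Hypothetical sketch of handling `active_adapters` as either a
# property or a method; not the actual PEFT implementation.

class PropertyStyle:
    # Mimics a model where active_adapters is a property.
    @property
    def active_adapters(self):
        return ["default"]

class MethodStyle:
    # Mimics a model where active_adapters is a plain method.
    def active_adapters(self):
        return ["default"]

def get_active_adapters(model):
    attr = model.active_adapters
    # Property access yields the value directly; a bound method
    # must still be called to obtain it.
    return attr() if callable(attr) else attr

print(get_active_adapters(PropertyStyle()))  # ['default']
print(get_active_adapters(MethodStyle()))    # ['default']
```

This duck-typing works here because the returned value (a list) is not itself callable; it is only meant to show why the two access styles need separate handling.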
The PR is merged.
Thanks for the fix and explanation!
Do you recommend installing it from source?
Installing from source is always a little riskier when it comes to breaking things, so if you can wait, I would wait for the next release. That said, if you pin the commit hash, you'll most likely be fine installing from source.
`PeftModel` has a property called `active_adapters`. It was being accessed as a function, which led to the error below. We are seeing issues such as #30754; the problem was likely introduced in #30738.
Fixes: #30754
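The failure mode described above can be reproduced in isolation. This is a minimal, hypothetical sketch (the class name is made up, not the actual `Trainer` or PEFT code): when `active_adapters` is a property returning a list, calling it with parentheses actually calls the returned list, which raises a `TypeError`.

```python
# Minimal reproduction of the property-called-as-function bug
# described above; FakePeftModel is a hypothetical stand-in.

class FakePeftModel:
    @property
    def active_adapters(self):
        return ["default"]

model = FakePeftModel()

# Correct: property access returns the list of adapter names.
print(model.active_adapters)  # ['default']

# Buggy pattern: the parentheses call the returned list itself.
try:
    model.active_adapters()
except TypeError as exc:
    print(f"TypeError: {exc}")
```

This is why the fix switches the `Trainer` call site to plain attribute access when the attribute is a property.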
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@younesbelkada