
fix: lora+: include lr in optimizer kwargs #1973

Merged
BenjaminBossan merged 4 commits into huggingface:main from 202407-loraplus-lr-fix on Jul 30, 2024

Conversation

kallewoof (Contributor) commented on Jul 30, 2024

Playing with the LoRA+ optimizer, I realized that the underlying optimizer will complain about the required positional lr argument if it is not included in the optimizer kwargs.

The original implementation did include it, but somehow that was dropped along the way.
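To make the failure concrete, here is a minimal sketch (the ToyOptimizer class is hypothetical and only mimics an optimizer whose constructor requires lr; the call shape follows the description above, not the exact peft code):

```python
# Hypothetical stand-in for an optimizer whose constructor requires lr,
# as several common optimizers do.
class ToyOptimizer:
    def __init__(self, param_groups, lr, weight_decay=0.0):
        self.param_groups = param_groups
        self.lr = lr
        self.weight_decay = weight_decay

optimizer_cls = ToyOptimizer
kwargs = {"weight_decay": 0.01}  # before the fix, lr never made it into kwargs

# Shape of the failing call: per-group learning rates are set, but the
# constructor itself still needs lr.
optimizer_cls([{"params": [], "lr": 1e-4}], **kwargs)
# raises TypeError: missing 1 required positional argument: 'lr'
```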

kallewoof mentioned this pull request on Jul 30, 2024
BenjaminBossan (Member) commented
Thanks for fixing this issue. But wouldn't the better fix be to add lr to the kwargs? I.e.

- lr = kwargs["lr"]
+ kwargs["lr"] = lr

Then we can leave lr in the function signature, which is more user-friendly than raising a KeyError down the line.
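In context, the suggested change would sit roughly like this (a condensed sketch, not the exact peft implementation; the parameter grouping is simplified, and the "lora_B" name check is an assumption based on how LoRA+ is usually described):

```python
def create_loraplus_optimizer(model, optimizer_cls, *, lr, loraplus_lr_ratio, **kwargs):
    # Keep lr as an explicit parameter of the function and inject it into
    # the kwargs forwarded to the optimizer constructor.
    kwargs["lr"] = lr

    # Simplified LoRA+ grouping: the LoRA "B" matrices train with a
    # higher learning rate than everything else.
    group_a, group_b = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        (group_b if "lora_B" in name else group_a).append(param)

    param_groups = [
        {"params": group_a, "lr": lr},
        {"params": group_b, "lr": lr * loraplus_lr_ratio},
    ]
    # optimizer_cls now receives lr even if the caller never put it in kwargs.
    return optimizer_cls(param_groups, **kwargs)
```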

kallewoof (Contributor, Author) commented

Ah, to make it a required argument to create_loraplus_optimizer. Yes.
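With lr required, a call then looks something like this (a usage sketch; the import path follows the peft docs, and model stands for a LoRA-wrapped model that is not defined here):

```python
import torch
from peft.optimizers import create_loraplus_optimizer

# model: a LoRA-wrapped PeftModel, assumed to exist in the caller's scope.
optimizer = create_loraplus_optimizer(
    model=model,
    optimizer_cls=torch.optim.AdamW,
    lr=5e-5,               # required; forwarded to AdamW via kwargs["lr"]
    loraplus_lr_ratio=16,  # LoRA "B" matrices get lr * 16
)
```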


BenjaminBossan (Member) left a comment

Thanks for fixing this oversight.

BenjaminBossan merged commit 7e7b558 into huggingface:main on Jul 30, 2024
14 checks passed
kallewoof deleted the 202407-loraplus-lr-fix branch on July 30, 2024 at 12:22