
Update peft requirement from ==0.5.* to ==0.6.* #4494

Merged
oobabooga merged 2 commits into dev from dependabot/pip/peft-eq-0.6.star on Nov 7, 2023

Conversation

dependabot[bot] (Contributor) commented on behalf of github on Nov 6, 2023

Updates the requirements on peft to permit the latest version.

Release notes

Sourced from peft's releases.

🧨 Diffusers now uses 🤗 PEFT, new tuning methods, better quantization support, higher flexibility and more

Highlights

Integration with diffusers

🧨 Diffusers now leverages PEFT as a backend for LoRA inference for Stable Diffusion models (#873, #993, #961). Relevant PRs on 🧨 Diffusers are huggingface/diffusers#5058, huggingface/diffusers#5147, huggingface/diffusers#5151 and huggingface/diffusers#5359. This unlocks a vast number of practical, in-demand use cases around adapter-based inference 🚀. With easy-to-use APIs that support different checkpoint formats (Diffusers format, Kohya format, ...), you can now:

  1. use multiple LoRAs
  2. switch between them instantaneously
  3. scale and combine them
  4. merge/unmerge
  5. enable/disable

For details, refer to the documentation at Inference with PEFT.
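A minimal sketch of this workflow, assuming a recent diffusers release with peft installed; the model ID, LoRA repositories, adapter names, and prompt below are placeholders, not part of the release notes:

```python
# Sketch only: model ID, LoRA paths, adapter names, and prompt are placeholders.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# 1. Load multiple LoRAs, each under its own adapter name
pipe.load_lora_weights("user/toy-lora", adapter_name="toy")
pipe.load_lora_weights("user/pixel-lora", adapter_name="pixel")

# 2./3. Switch between adapters or combine them with per-adapter scales
pipe.set_adapters(["toy", "pixel"], adapter_weights=[0.8, 0.5])
image = pipe("a toy robot, pixel art style").images[0]

# 4. Merge the active LoRA weights into the base model, then unmerge
pipe.fuse_lora()
pipe.unfuse_lora()

# 5. Disable and re-enable all LoRA layers
pipe.disable_lora()
pipe.enable_lora()
```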

New tuning methods

Other notable additions

  • Allow merging of LoRA weights when using 4-bit and 8-bit quantization (bitsandbytes), thanks to @jiqing-feng (#851, #875)
  • IA³ now supports 4-bit quantization thanks to @His-Wardship (#864)
  • We increased the speed of adapter layer initialization: this should be most notable when creating a PEFT LoRA model on top of a large base model (#887, #915, #994)
  • More fine-grained control when configuring LoRA: it is now possible to have different ranks and alpha values for different layers (#873); a sketch follows this list
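A minimal sketch of the finer-grained configuration, assuming peft 0.6's rank_pattern/alpha_pattern config fields; the base model and module names are illustrative:

```python
# Sketch only: base model and module names are placeholders.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

config = LoraConfig(
    r=8,                      # default rank for all targeted layers
    lora_alpha=16,            # default scaling alpha
    target_modules=["q_proj", "v_proj"],
    # Per-layer overrides: keys are matched against module names
    rank_pattern={"model.decoder.layers.0.self_attn.q_proj": 32},
    alpha_pattern={"model.decoder.layers.0.self_attn.q_proj": 64},
)
model = get_peft_model(base, config)
model.print_trainable_parameters()
```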

Experimental features

  • For some adapters like LoRA, it is now possible to activate multiple adapters at the same time (#873)

Breaking changes

  • It is no longer allowed to create a LoRA adapter with rank 0 (r=0). Previously this was possible, and the adapter was silently ignored.
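A short sketch of the new behavior, assuming peft 0.6; the base model and the exact error message are illustrative:

```python
# Sketch: under peft 0.6, r=0 should raise instead of being silently ignored.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
try:
    get_peft_model(base, LoraConfig(r=0, target_modules=["q_proj"]))
except ValueError as err:
    print(err)  # expected: a complaint that r must be a positive integer
```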

What's Changed

As always, a number of small improvements, bug fixes, and documentation updates were added. We thank all external contributors, both new and recurring. Below is the list of all changes since the last release.

... (truncated)


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

oobabooga and others added 2 commits on November 6, 2023 at 12:18
Updates the requirements on [peft](https://github.com/huggingface/peft) to permit the latest version.
- [Release notes](https://github.com/huggingface/peft/releases)
- [Commits](huggingface/peft@v0.5.0...v0.6.0)

---
updated-dependencies:
- dependency-name: peft
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
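The net effect of the dependency commit is a one-line version bump; a sketch of the change, assuming the pin lives in the repository's requirements.txt:

```diff
-peft==0.5.*
+peft==0.6.*
```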
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Nov 6, 2023
@oobabooga oobabooga changed the base branch from main to dev November 7, 2023 03:12
@oobabooga oobabooga merged commit 18739c8 into dev Nov 7, 2023
@dependabot dependabot bot deleted the dependabot/pip/peft-eq-0.6.star branch November 7, 2023 03:13