FIX: Error in forward of 4bit linear lora layer #878

Merged

Commits on Aug 29, 2023

  1. FIX: Error in forward of 4bit linear lora layer

    This bug was introduced during the refactoring of the forward function. The
    forward pass should now be fixed and equivalent to its implementation before
    the refactoring:
    
    https://github.com/huggingface/peft/blob/4df9c5a243194b03e703c1dd526d64163f9b4fd2/src/peft/tuners/lora.py#L1207
    
    Bug reported by @jiqing-feng
    BenjaminBossan committed Aug 29, 2023
    Commit SHA: 5f7fdb0
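
For context, the forward pass of a LoRA-wrapped quantized linear layer runs the frozen (4-bit) base layer and then adds the low-rank adapter output on top, taking care of dtype conversion between the two paths. The sketch below illustrates that general structure only; it is not the PEFT code touched by this PR, and names such as `LoraLinearSketch`, `lora_A`, `lora_B`, `scaling`, and `dropout` are illustrative assumptions (a plain `nn.Linear` stands in for the 4-bit base layer).

```python
# Minimal sketch of a LoRA forward over a frozen base linear layer.
# Assumption: this mirrors the general pattern, not PEFT's exact implementation.
import torch
import torch.nn as nn


class LoraLinearSketch(nn.Module):
    def __init__(self, base_layer: nn.Linear, r: int = 8, alpha: int = 16, p: float = 0.0):
        super().__init__()
        self.base_layer = base_layer          # stands in for the frozen 4-bit layer
        self.lora_A = nn.Linear(base_layer.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base_layer.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)    # LoRA starts as a no-op
        self.scaling = alpha / r
        self.dropout = nn.Dropout(p)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # 1) Output of the frozen base layer (quantized in the real case).
        result = self.base_layer(x)
        # 2) LoRA path: cast the input to the adapter's dtype, since the base
        #    layer may run in a different (quantized / half) precision.
        lora_x = self.dropout(x).to(self.lora_A.weight.dtype)
        lora_out = self.lora_B(self.lora_A(lora_x)) * self.scaling
        # 3) Add the adapter output back in the result's dtype.
        return result + lora_out.to(result.dtype)


# Usage example
base = nn.Linear(32, 64)
layer = LoraLinearSketch(base, r=4)
y = layer(torch.randn(2, 32))
print(y.shape)  # torch.Size([2, 64])
```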