
Fix built-in lora system bugs caused by torch.nn.MultiheadAttention #15190

Merged: 1 commit merged into dev from dora-weight-decompose on Mar 9, 2024

Conversation

KohakuBlueleaf (Collaborator)

Description

Fixes bugs in the built-in LoRA system triggered by torch.nn.MultiheadAttention, which does not have a "weight" attribute.

Also improves the logic for sending orig_weight into network_module.
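For context, a minimal sketch (not the PR's actual code) of why per-module logic that assumes a `.weight` attribute breaks on torch.nn.MultiheadAttention: the module stores its projections as `in_proj_weight` and `out_proj.weight` rather than a top-level `weight`. The `get_orig_weight` helper below is hypothetical, shown only to illustrate the kind of guarded lookup needed:

```python
import torch.nn as nn

linear = nn.Linear(16, 16)
mha = nn.MultiheadAttention(embed_dim=16, num_heads=4)

# nn.Linear exposes a top-level .weight attribute...
print(hasattr(linear, "weight"))    # True

# ...but nn.MultiheadAttention does not: its weights live in
# in_proj_weight (fused q/k/v projection) and out_proj.weight.
print(hasattr(mha, "weight"))       # False
print(mha.in_proj_weight.shape)     # torch.Size([48, 16])
print(mha.out_proj.weight.shape)    # torch.Size([16, 16])

def get_orig_weight(module: nn.Module):
    """Hypothetical helper: fetch the original weight(s) for a module,
    special-casing MultiheadAttention instead of assuming .weight exists."""
    if isinstance(module, nn.MultiheadAttention):
        return module.in_proj_weight, module.out_proj.weight
    return getattr(module, "weight", None)
```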


@KohakuBlueleaf changed the title from "Fix bugs for torch.nn.MultiheadAttention" to "Fix bugs in built-in lora system for torch.nn.MultiheadAttention" on Mar 9, 2024
@KohakuBlueleaf changed the title from "Fix bugs in built-in lora system for torch.nn.MultiheadAttention" to "Fix built-in lora system bugs caused by torch.nn.MultiheadAttention" on Mar 9, 2024
@AUTOMATIC1111 merged commit 4c9a7b8 into dev on Mar 9, 2024
6 checks passed
@AUTOMATIC1111 deleted the dora-weight-decompose branch on March 9, 2024 at 05:29