Explicit packed params in preparation for more LoRA support #2843
base: main
Conversation
I think they are overall orthogonal, but it will cause a conflict for you once #2637 is merged.
Yes, let's merge #2637 first -- I just wanted to make sure we have a path forward here to unify the LoRA handling and the weight loading :)
This pull request has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this pull request should remain open. Thank you!
This pull request has merge conflicts that must be resolved before it can be merged.
This PR is in preparation for #2831, to make it easier to implement LoRA for different models without too much code duplication with model weight loading.
It factors out the specification of which parameters are packed together, making it an explicit part of the model class for models that use parameter packing.
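The idea can be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual API: the `packed_params` attribute name, the toy module layout, and the loader helper are all assumptions made for the example. The pattern is a class-level mapping from each fused parameter to the unfused checkpoint parameters it packs, which a generic weight loader (and, later, LoRA code) can consult instead of hard-coding the packing per model.

```python
import torch

class ToyAttention(torch.nn.Module):
    """Toy attention block that fuses q/k/v projections into one matmul."""
    def __init__(self, hidden: int):
        super().__init__()
        self.qkv_proj = torch.nn.Linear(hidden, 3 * hidden, bias=False)

class ToyModel(torch.nn.Module):
    # Hypothetical class-level declaration of which unfused checkpoint
    # parameters are packed into which fused module parameter.
    packed_params = {
        "qkv_proj": ["q_proj", "k_proj", "v_proj"],
    }

    def __init__(self, hidden: int = 4):
        super().__init__()
        self.attn = ToyAttention(hidden)

    def load_weights(self, checkpoint: dict):
        """Generic loader: route each unfused checkpoint tensor into its
        row slice of the packed parameter, using the class-level mapping."""
        for packed_name, shard_names in self.packed_params.items():
            dest = self.attn.get_parameter(f"{packed_name}.weight")
            for i, shard in enumerate(shard_names):
                src = checkpoint[f"attn.{shard}.weight"]
                rows = src.shape[0]
                dest.data[i * rows:(i + 1) * rows].copy_(src)

hidden = 4
# Fake checkpoint with distinct constant values per projection.
ckpt = {f"attn.{name}.weight": torch.full((hidden, hidden), float(i))
        for i, name in enumerate(["q_proj", "k_proj", "v_proj"])}
model = ToyModel(hidden)
model.load_weights(ckpt)
# Rows [hidden:2*hidden] of the fused weight should now hold k_proj.
print(torch.equal(model.attn.qkv_proj.weight[hidden:2 * hidden],
                  ckpt["attn.k_proj.weight"]))
```

Because the mapping lives on the model class rather than inside the loader, LoRA support can reuse the same table to decide how adapter weights for `q_proj`/`k_proj`/`v_proj` map onto the fused `qkv_proj` parameter, which is the deduplication this PR is preparing for.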