Fix Rope scaling #3598

Merged · 5 commits · Jul 25, 2023
Conversation

@shahules786 (Collaborator) commented on Jul 24, 2023

What

Fixed RoPE scaling for all models.

Why

Previously, the model config was ignored during patching, which could cause issues such as initializing the rotary embeddings with the wrong max_position_embeddings.

How

The required arguments are now read from model_config and passed on to the patching functions (see the sketch below).
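
For illustration, here is a minimal sketch of the approach described above. The function names (`patch_rope_scaling`, `apply_rope_patch`) and the model checkpoint are hypothetical stand-ins, not the project's actual patching API; the point is only that the values are read from the model config rather than hard-coded.

```python
from typing import Optional

from transformers import AutoConfig, AutoModelForCausalLM


def patch_rope_scaling(model, rope_type: str, scaling_factor: float,
                       max_position_embeddings: int) -> None:
    """Hypothetical stand-in for the repo's RoPE patching helpers; here it
    only reports the values it would use to rebuild the rotary embeddings."""
    print(f"patching {rope_type} RoPE: factor={scaling_factor}, "
          f"max_position_embeddings={max_position_embeddings}")


def apply_rope_patch(model, model_config, rope_scaling: Optional[dict]):
    """Read the required arguments from model_config instead of using
    hard-coded defaults, then forward them to the patching function."""
    if not rope_scaling:
        return model
    patch_rope_scaling(
        model,
        rope_type=rope_scaling["type"],
        scaling_factor=rope_scaling["factor"],
        # Taken from the config so the patched rotary embeddings match the
        # model's actual context length.
        max_position_embeddings=model_config.max_position_embeddings,
    )
    return model


if __name__ == "__main__":
    # Usage with an example checkpoint (requires network access).
    config = AutoConfig.from_pretrained("huggyllama/llama-7b")
    model = AutoModelForCausalLM.from_config(config)
    apply_rope_patch(model, config, {"type": "linear", "factor": 2.0})
```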


@andreaskoepf (Collaborator) left a comment:


Thanks a lot.

@andreaskoepf merged commit a32993d into LAION-AI:main on Jul 25, 2023
1 check passed