
[feat] Add lru_cache to import_utils calls that did not previously have it #1584

Merged · 4 commits · Mar 26, 2024

Conversation

@tisles (Contributor) commented Mar 25, 2024

Fixes #1576

Adding @cache to the functions in import_utils gives constant-time LoRA loading regardless of the length of the Python path. I've verified this with a Python path containing 50,000 directories: previously the LoRA load took 259 seconds; now it's a constant 1.3 seconds 👍🏻

Flamegraphs to prove it! Before:
[flamegraph: profile]

After:
[flamegraph: profile_fixed]
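For context, the fix boils down to memoizing the availability checks in import_utils, since each uncached call re-scans every sys.path entry. A minimal sketch of the pattern, assuming a check built on importlib.util.find_spec (is_bnb_available here is an illustrative stand-in, not necessarily the exact function in the PR):

```python
import importlib.util
from functools import lru_cache

# Illustrative availability check in the style of import_utils.
# Without caching, every call walks sys.path; with a long path
# (e.g. 50,000 directories) that dominates LoRA load time.
@lru_cache
def is_bnb_available() -> bool:
    # find_spec searches sys.path on a cache miss; lru_cache makes
    # every repeated call an O(1) dictionary lookup.
    return importlib.util.find_spec("bitsandbytes") is not None

print(is_bnb_available())
```

Because the function takes no arguments, the cache holds a single entry and the sys.path scan happens at most once per process.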

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@tisles tisles changed the title [feat] add cache to remaining import utils [feat] Add lru_cache to import_utils calls that did not previously have it Mar 25, 2024
@BenjaminBossan (Member) commented
Thanks a lot for the PR and testing the solution, great that it solves the issue. Just out of curiosity, is there a specific reason for setting maxsize=None? From my understanding, just using the default should work equally well here.

@tisles (Contributor, Author) commented Mar 26, 2024

> Thanks a lot for the PR and testing the solution, great that it solves the issue. Just out of curiosity, is there a specific reason for setting maxsize=None? From my understanding, just using the default should work equally well here.

Yep, good point - you're right. I've updated it to remove the None setting.
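As an aside on the maxsize question: the default @lru_cache is a bounded LRU with maxsize=128, while maxsize=None disables eviction entirely. For zero-argument functions like these there is only ever one cache entry, so the two behave identically. A small sketch (check is a throwaway example function, not from the PR):

```python
from functools import lru_cache

# Bare @lru_cache is equivalent to @lru_cache(maxsize=128): a bounded
# LRU. maxsize=None would use an unbounded cache with no eviction,
# which only matters when distinct argument tuples exceed the bound.
@lru_cache
def check() -> bool:
    return True

check()  # miss: computed and stored
check()  # hit: served from the cache
info = check.cache_info()
print(info.maxsize, info.hits)  # → 128 1
```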

@BenjaminBossan (Member) left a comment


Thanks for identifying the issue and providing a fix, LGTM.

@BenjaminBossan BenjaminBossan merged commit d582b68 into huggingface:main Mar 26, 2024
14 checks passed