
Add Support for Gradient Checkpointing #759

Merged: @lenglaender merged 13 commits into adapter-hub:main from fix/gradient_checkpointing on Jan 26, 2025

Conversation

@lenglaender (Member) commented Nov 7, 2024

Add Support for Gradient Checkpointing

This PR adds support for gradient checkpointing. Gradient checkpointing is a technique that trades computation for memory: intermediate activations are recomputed during the backward pass instead of being stored, which is particularly useful when training large models. Because values are recomputed during backpropagation, the original ForwardContext must be preserved in that phase. I solved this by overriding the gradient_checkpointing_enable function so that the checkpoint function receives the current ForwardContext as the context manager for the backward pass. A usage sketch follows below.
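For context, a minimal usage sketch of the feature this PR enables (the model checkpoint and adapter name are placeholders; `adapters.init`, `add_adapter`, `train_adapter`, and `gradient_checkpointing_enable` are the existing adapters/Transformers entry points):

```python
import adapters
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # enable adapter support on a vanilla Transformers model

model.add_adapter("demo_adapter")   # "demo_adapter" is a placeholder name
model.train_adapter("demo_adapter")

# Trade compute for memory: intermediate activations are recomputed in the
# backward pass instead of being kept alive after the forward pass.
model.gradient_checkpointing_enable()
```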

- overwrite gradient_checkpointing_enable to provide our ForwardContext during the recomputation of values in backpropagation (see the sketch after this list)
- 2 bugs remaining: the bottleneck adapter for models with the legacy implementation (BERT) & Parallel composition. Parallel is problematic because we manipulate the batch dimension, and this currently leads to an error
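A simplified sketch of the override described in the first bullet, under the assumption that the mixin method and attribute names below match the library's (the actual implementation is part of this PR). PyTorch's non-reentrant `checkpoint` accepts a `context_fn` returning two context managers, one wrapping the original forward and one wrapping the recomputation during backward, which is where the ForwardContext can be re-entered:

```python
import contextlib
from functools import partial

from torch.utils.checkpoint import checkpoint

from adapters.context import ForwardContext


def gradient_checkpointing_enable(self, gradient_checkpointing_kwargs=None):
    def _checkpoint_context_fn():
        # The original forward already runs inside a ForwardContext, so a
        # null context suffices there; the recomputation in the backward
        # pass re-enters the ForwardContext captured at forward time.
        return contextlib.nullcontext(), ForwardContext.get_context()

    # Transformer modules invoke self._gradient_checkpointing_func in their
    # forward; use_reentrant=False is required for context_fn to be honored.
    self._gradient_checkpointing_func = partial(
        checkpoint, use_reentrant=False, context_fn=_checkpoint_context_fn
    )
```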
akatief added a commit to akatief/adapters that referenced this pull request on Nov 11, 2024
@calpt force-pushed the fix/gradient_checkpointing branch from 5edfd4d to 94df2fe on November 25, 2024
@calpt linked an issue on Dec 7, 2024 that may be closed by this pull request
docs & style & fixes

- albert: skip unsupported tests
- deberta(V2): fix embedding bug with inplace operations.
- deberta: fix LoRAMergedLinear Bug with device mismatch
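As a hypothetical illustration of the in-place pitfall behind the deberta(V2) fix above (not the PR's actual code): autograd saves certain activations for the backward pass, and modifying them in place invalidates the saved tensors, so such updates have to be rewritten out-of-place:

```python
import torch

x = torch.randn(4, requires_grad=True)
h = torch.sigmoid(x)  # sigmoid saves its output for the backward pass

# h *= 2.0            # in-place: raises "a variable needed for gradient
#                     # computation has been modified by an inplace
#                     # operation" at backward time
h = h * 2.0           # out-of-place: safe, also under checkpointing

h.sum().backward()
```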
@lenglaender marked this pull request as ready for review on January 14, 2025
@lenglaender changed the title from "WIP: Add Support for Gradient Checkpointing" to "Add Support for Gradient Checkpointing" on Jan 14, 2025
@calpt (Member) left a comment


Thanks a lot for digging into this and enabling compatibility with adapters, this looks great! Just a couple of small comments before we're good to merge.

Review threads (resolved):

- tests/methods/base.py
- src/adapters/model_mixin.py
@lenglaender (Member, Author) commented Jan 21, 2025

Hey @calpt, can you quickly review my replies to your comments and approve the PR if everything is alright so I can merge? We need to merge this PR first, then add all the new tests to the test refactoring (#740), and then we can merge #763 (because Julian has already merged the test refactoring into his PR). So this PR is currently blocking everything we want to have in the next release.

@lenglaender merged commit adef6dc into adapter-hub:main on Jan 26, 2025 (4 checks passed)
Development

Successfully merging this pull request may close these issues:

ForwardContext is None with gradient checkpointing enabled