
1014 learning rate finder #1454

Merged: 35 commits into Project-MONAI:master on Jan 26, 2021
Conversation

@rijobro (Contributor) commented Jan 15, 2021

Fixes #1014.

Description

Implements calculation of the optimal learning rate, based on https://github.com/davidtvs/pytorch-lr-finder.

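(For context, a minimal sketch of the LR range test idea used by pytorch-lr-finder, with hypothetical helper names; this is not the API added in this PR. The learning rate is swept exponentially over a fixed number of iterations, the loss is recorded at each step, and the sweep stops early once the loss diverges.)

```python
import math
import torch


def lr_range_test(model, optimizer, loss_fn, loader,
                  start_lr=1e-7, end_lr=1.0, num_iter=100, diverge_factor=5.0):
    """Exponentially sweep the learning rate and record the loss at each step."""
    gamma = (end_lr / start_lr) ** (1.0 / num_iter)  # multiplicative LR step
    for group in optimizer.param_groups:
        group["lr"] = start_lr

    lrs, losses, best_loss = [], [], math.inf
    data_iter = iter(loader)
    for _ in range(num_iter):
        try:
            inputs, targets = next(data_iter)
        except StopIteration:  # restart the loader if it runs out
            data_iter = iter(loader)
            inputs, targets = next(data_iter)

        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()

        lrs.append(optimizer.param_groups[0]["lr"])
        losses.append(loss.item())
        best_loss = min(best_loss, loss.item())
        if loss.item() > diverge_factor * best_loss:  # stop once the loss blows up
            break
        for group in optimizer.param_groups:
            group["lr"] *= gamma

    return lrs, losses  # plot loss vs. lr and pick a value on the steep descent
```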

Status

Ready

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • New tests added to cover the changes.
  • Integration tests passed locally by running ./runtests.sh --codeformat --coverage.
  • Quick tests passed locally by running ./runtests.sh --quick.
  • In-line docstrings updated.
  • Documentation updated, tested make html command in the docs/ folder.

@Nic-Ma Nic-Ma changed the title learning rate finder 1014 learning rate finder Jan 18, 2021
@Nic-Ma Nic-Ma changed the title 1014 learning rate finder [WIP] 1014 learning rate finder Jan 18, 2021
@rijobro rijobro marked this pull request as ready for review January 18, 2021 17:20
@rijobro rijobro changed the title [WIP] 1014 learning rate finder 1014 learning rate finder Jan 20, 2021
@rijobro (Contributor, Author) commented Jan 21, 2021

@wyli @Nic-Ma @ericspod this part of the PR saves the state of the network and optimiser to disk or memory, such that they can be restored at the end.

Does this functionality live somewhere else in MONAI?
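(For reference, the usual PyTorch pattern for snapshotting and restoring model/optimizer state, either in memory or on disk; a sketch only, with hypothetical helper names, and the PR's own implementation may differ.)

```python
import copy
import tempfile

import torch


def snapshot(model, optimizer, to_disk=False):
    """Capture model/optimizer state so training can be restored later."""
    state = {"model": model.state_dict(), "optimizer": optimizer.state_dict()}
    if to_disk:
        path = tempfile.NamedTemporaryFile(suffix=".pt", delete=False).name
        torch.save(state, path)
        return path
    return copy.deepcopy(state)  # keep an in-memory copy instead


def restore(model, optimizer, snapshot_ref):
    """Restore the previously captured state (from a file path or an in-memory dict)."""
    state = torch.load(snapshot_ref) if isinstance(snapshot_ref, str) else snapshot_ref
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
```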

@ericspod (Member) commented:
We had that kind of functionality as part of an Ignite handler for saving checkpoints, but that's not quite how you're doing things here.

@wyli (Contributor) left a review:


thanks @rijobro this is very useful!

Perhaps the way the training loss is computed/accumulated (self._train_batch) should be decoupled from the lr_finder. Intuitively, the LR finder should work fine as long as the user provides a black box that takes an lr/step as input and returns a total_loss (e.g. self._train_batch and self._validate). Do you want to refactor this PR to decouple those? Otherwise we could file a ticket and have another iteration.
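(A minimal sketch of that decoupled shape, with hypothetical names; illustrative only, not the interface merged in this PR. The finder consumes a user-supplied step callable and only records the losses it returns.)

```python
from typing import Callable, Iterable, List, Tuple

# Hypothetical decoupled interface: the finder never touches the model or the
# data loader directly; it only calls a user-supplied step function at each
# learning rate and records the scalar loss that comes back.
StepFn = Callable[[float], float]  # takes the current lr, returns a total loss


def find_lr(step_fn: StepFn, lrs: Iterable[float]) -> List[Tuple[float, float]]:
    """Run one training step per learning rate and collect (lr, loss) pairs."""
    history = []
    for lr in lrs:
        loss = step_fn(lr)  # black box: batch handling / loss accumulation live here
        history.append((lr, loss))
    return history
```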

Please also see some minor suggestions inline; they are mostly optional.

@Can-Zhao would be great to have your comments as well!

(Inline review threads on monai/optimizers/lr_finder.py, all resolved.)
@rijobro (Contributor, Author) commented Jan 21, 2021

@wyli thanks, I'll get to it!

@rijobro rijobro mentioned this pull request Jan 22, 2021
@rijobro (Contributor, Author) commented Jan 25, 2021

@wyli this is ready if you want to review again, thanks!

@wyli (Contributor) left a review:


Thanks, it looks good, except for some minor warnings about explicit (object) inheritance (https://deepsource.io/gh/Project-MONAI/MONAI/run/a53a8313-ab56-4aea-8a8b-e6fd4941f978/python/PYL-R0205). This new feature needs another iteration to decouple the actual training/validation logic (_train_batch and _validate) from the LearningRateFinder class.
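(The referenced DeepSource rule, PYL-R0205, flags explicit inheritance from object, which is redundant in Python 3. A generic before/after sketch with hypothetical class names, not the actual diff:)

```python
# Flagged by PYL-R0205: explicit inheritance from `object` is redundant in Python 3.
class LearningRateFinderFlagged(object):
    pass


# Equivalent, and what the linter prefers:
class LearningRateFinderClean:
    pass
```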

@rijobro rijobro merged commit a2da8a1 into Project-MONAI:master Jan 26, 2021
@rijobro rijobro deleted the learning_rate branch January 26, 2021 11:31
@rijobro rijobro mentioned this pull request Jan 26, 2021
@ShadowTwin41 (Contributor) commented:

I am having trouble using this class for GANs. Is there any way to do something similar to fastai's GANLearner?

Labels: none yet. Projects: none yet.

Successfully merging this pull request may close: learning rate finder (#1014).
4 participants