From 74e029151d3fb50c6418ff0993ddc111913415f2 Mon Sep 17 00:00:00 2001
From: scap3yvt <149599669+scap3yvt@users.noreply.github.com>
Date: Thu, 11 Jan 2024 08:49:54 -0500
Subject: [PATCH 1/3] added documentation for extending optimizer submodule

---
 GANDLF/optimizers/README.md | 13 ++++++++-----
 1 file changed, 8 insertions(+), 5 deletions(-)

diff --git a/GANDLF/optimizers/README.md b/GANDLF/optimizers/README.md
index df0f02639..e8291637c 100644
--- a/GANDLF/optimizers/README.md
+++ b/GANDLF/optimizers/README.md
@@ -2,8 +2,11 @@

 ## Adding a new algorithm

-- Define a new submodule under `GANDLF.optimizers`.
-- Ensure that the new algorithm is wrapped in a function which returns a scheduler, by following one of the examples in `GANDLF.optimizers.sgd`.
-  - If the new function is from a pre-defined package, put it under `GANDLF.optimizers.wrap_${package_name}.py`.
-- Add the algorithm's identifier to `GANDLF.optimizers.__init__.global_optimizer_dict` as appropriate.
-- Call the new algorithm from the config using the `optimizer` key.
\ No newline at end of file
+- For an optimizer defined in PyTorch [[ref](https://pytorch.org/docs/stable/optim.html#algorithms)], update the `GANDLF.optimizers.wrap_torch.py` submodule.
+- For a custom optimizer, create a new submodule called `GANDLF.optimizers.${awesome_optimizer}.py`. Ensure that it inherits from PyTorch's base optimizer class [[ref](https://pytorch.org/docs/stable/optim.html#base-class)].
+- If a new dependency needs to be used, update GaNDLF's [`setup.py`](https://github.com/mlcommons/GaNDLF/blob/master/setup.py) with the new requirement.
+  - Define a new submodule under `GANDLF.optimizers` as `GANDLF.optimizers.wrap_${package_name}.py`.
+  - Ensure that the new algorithm is wrapped in a function which returns an object of the PyTorch optimizer type. Use any of the optimizers in `GANDLF.optimizers.wrap_torch.py` as an example.
+- Add the algorithm's identifier to `GANDLF.optimizers.__init__.global_optimizer_dict` with an appropriate key.
+- Call the new algorithm from the config using the `optimizer` key.
+- [Update the tests!](https://mlcommons.github.io/GaNDLF/extending/#update-tests)

From 7ee7409ea7ad670fe5cab16fb1a2d45b31780628 Mon Sep 17 00:00:00 2001
From: scap3yvt <149599669+scap3yvt@users.noreply.github.com>
Date: Thu, 11 Jan 2024 08:53:56 -0500
Subject: [PATCH 2/3] Update README.md

---
 GANDLF/optimizers/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/GANDLF/optimizers/README.md b/GANDLF/optimizers/README.md
index e8291637c..b12a61f12 100644
--- a/GANDLF/optimizers/README.md
+++ b/GANDLF/optimizers/README.md
@@ -1,4 +1,4 @@
-# GANDLF Preprocessing
+# GANDLF Optimizers

 ## Adding a new algorithm

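For readers extending the optimizer submodule, a wrapper of the kind described in the patched optimizer README might look roughly like the sketch below. It is not part of the patch: the module name `wrap_mypackage.py`, the function name `my_sgd`, and the keys read from GaNDLF's `parameters` dictionary (`model_parameters`, `learning_rate`, and nested `optimizer` options) are illustrative assumptions; mirror the signatures actually used in `GANDLF.optimizers.wrap_torch.py`.

```python
# Hypothetical GANDLF/optimizers/wrap_mypackage.py -- illustrative sketch only.
# Assumption: the wrapper receives GaNDLF's `parameters` dictionary and the
# trainable model parameters are exposed under parameters["model_parameters"].
from torch.optim import SGD


def my_sgd(parameters):
    """Return a torch.optim.Optimizer instance configured from `parameters`."""
    return SGD(
        parameters["model_parameters"],
        lr=parameters.get("learning_rate", 0.01),
        momentum=parameters.get("optimizer", {}).get("momentum", 0.9),
    )
```

The new identifier would then be registered along the lines of `global_optimizer_dict["my_sgd"] = my_sgd` in `GANDLF.optimizers.__init__`, after which it can be selected from the config via the `optimizer` key.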
From c65c80a47390a11659ef2d0dd4aa8a5b611e66b9 Mon Sep 17 00:00:00 2001
From: scap3yvt <149599669+scap3yvt@users.noreply.github.com>
Date: Thu, 11 Jan 2024 08:57:38 -0500
Subject: [PATCH 3/3] added documentation for scheduler

---
 GANDLF/schedulers/README.md | 13 ++++++++-----
 1 file changed, 8 insertions(+), 5 deletions(-)

diff --git a/GANDLF/schedulers/README.md b/GANDLF/schedulers/README.md
index c27dda744..30d5e3fc0 100644
--- a/GANDLF/schedulers/README.md
+++ b/GANDLF/schedulers/README.md
@@ -2,8 +2,11 @@

 ## Adding a new algorithm

-- Define a new submodule under `GANDLF.schedulers`.
-- Ensure that the new algorithm is wrapped in a function which returns a schedulers, by following one of the examples in `GANDLF.schedulers.triangle`.
-  - If the new function is from a pre-defined package, put it under `GANDLF.schedulers.wrap_${package_name}.py`.
-- Add the algorithm's identifier to `GANDLF.schedulers.__init__.global_schedulerss_dict` as appropriate.
-- Call the new algorithm from the config using the `scheduler` key.
\ No newline at end of file
+- For a scheduler defined in PyTorch [[ref](https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate)], update the `GANDLF.schedulers.wrap_torch.py` submodule.
+- For a custom scheduler, create a new submodule called `GANDLF.schedulers.${awesome_scheduler}.py`. Ensure that it inherits from PyTorch's learning-rate scheduler base class, following the example of [`torch.optim.lr_scheduler.LinearLR`](https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html#LinearLR).
+- If a new dependency needs to be used, update GaNDLF's [`setup.py`](https://github.com/mlcommons/GaNDLF/blob/master/setup.py) with the new requirement.
+  - Define a new submodule under `GANDLF.schedulers` as `GANDLF.schedulers.wrap_${package_name}.py`.
+  - Ensure that the new algorithm is wrapped in a function which returns an object of the PyTorch scheduler type. Use any of the schedulers in `GANDLF.schedulers.wrap_torch.py` as an example.
+- Add the algorithm's identifier to `GANDLF.schedulers.__init__.global_schedulers_dict` with an appropriate key.
+- Call the new algorithm from the config using the `scheduler` key.
+- [Update the tests!](https://mlcommons.github.io/GaNDLF/extending/#update-tests)
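Analogously, a scheduler wrapper following the patched scheduler README might look like the sketch below. Again, this is not part of the patch: the module name, the function name `my_step_lr`, and the assumption that the already-constructed optimizer is exposed as `parameters["optimizer_object"]` are for illustration only; mirror the signatures actually used in `GANDLF.schedulers.wrap_torch.py`.

```python
# Hypothetical GANDLF/schedulers/wrap_mypackage.py -- illustrative sketch only.
# Assumption: the wrapper receives GaNDLF's `parameters` dictionary and the
# configured optimizer instance is available under parameters["optimizer_object"].
from torch.optim.lr_scheduler import StepLR


def my_step_lr(parameters):
    """Return a PyTorch learning-rate scheduler attached to the configured optimizer."""
    scheduler_options = parameters.get("scheduler", {})
    return StepLR(
        parameters["optimizer_object"],
        step_size=scheduler_options.get("step_size", 10),
        gamma=scheduler_options.get("gamma", 0.1),
    )
```

The identifier would then be added to `GANDLF.schedulers.__init__.global_schedulers_dict` (e.g. `global_schedulers_dict["my_step_lr"] = my_step_lr`) and selected from the config via the `scheduler` key.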