Promote the CLI out of utilities #13767

Merged · 13 commits · Jul 23, 2022
2 changes: 1 addition & 1 deletion docs/source-pytorch/cli/lightning_cli_advanced_2.rst
@@ -15,7 +15,7 @@
pass


-class LightningCLI(pl.utilities.cli.LightningCLI):
+class LightningCLI(pl.cli.LightningCLI):
def __init__(self, *args, trainer_class=NoFitTrainer, run=False, **kwargs):
super().__init__(*args, trainer_class=trainer_class, run=run, **kwargs)

10 changes: 5 additions & 5 deletions docs/source-pytorch/cli/lightning_cli_advanced_3.rst
@@ -15,7 +15,7 @@
pass


-class LightningCLI(pl.utilities.cli.LightningCLI):
+class LightningCLI(pl.cli.LightningCLI):
def __init__(self, *args, trainer_class=NoFitTrainer, run=False, **kwargs):
super().__init__(*args, trainer_class=trainer_class, run=run, **kwargs)

@@ -88,7 +88,7 @@ Similar to the callbacks, any parameter in :class:`~pytorch_lightning.trainer.tr
:class:`~pytorch_lightning.core.module.LightningModule` and
:class:`~pytorch_lightning.core.datamodule.LightningDataModule` classes that have as type hint a class, can be
configured the same way using :code:`class_path` and :code:`init_args`. If the package that defines a subclass is
-imported before the :class:`~pytorch_lightning.utilities.cli.LightningCLI` class is run, the name can be used instead of
+imported before the :class:`~pytorch_lightning.cli.LightningCLI` class is run, the name can be used instead of
the full import path.

From command line the syntax is the following:
@@ -117,7 +117,7 @@ callback appended. Here is an example:

.. note::

-    Serialized config files (e.g. ``--print_config`` or :class:`~pytorch_lightning.utilities.cli.SaveConfigCallback`)
+    Serialized config files (e.g. ``--print_config`` or :class:`~pytorch_lightning.cli.SaveConfigCallback`)
always have the full ``class_path``'s, even when class name shorthand notation is used in command line or in input
config files.
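To make the shorthand concrete, here is a hypothetical config fragment (the optimizer key and arguments are illustrative, not taken from this PR). The full form is what serialized configs always contain; the shorthand works once the defining package has been imported:

```yaml
# Full form: always written out in serialized configs
optimizer:
  class_path: torch.optim.Adam
  init_args:
    lr: 0.001
```

```yaml
# Shorthand: accepted on the command line or in input config files when the
# package defining the subclass was imported before the CLI runs
optimizer: Adam
```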

@@ -306,7 +306,7 @@ example can be when one wants to add support for multiple optimizers:

.. code-block:: python

-    from pytorch_lightning.utilities.cli import instantiate_class
+    from pytorch_lightning.cli import instantiate_class


class MyModel(LightningModule):
@@ -330,7 +330,7 @@ example can be when one wants to add support for multiple optimizers:
cli = MyLightningCLI(MyModel)

The value given to :code:`optimizer*_init` will always be a dictionary including :code:`class_path` and
-:code:`init_args` entries. The function :func:`~pytorch_lightning.utilities.cli.instantiate_class`
+:code:`init_args` entries. The function :func:`~pytorch_lightning.cli.instantiate_class`
takes care of importing the class defined in :code:`class_path` and instantiating it using some positional arguments,
in this case :code:`self.parameters()`, and the :code:`init_args`.
Any number of optimizers and learning rate schedulers can be added when using :code:`link_to`.
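As a rough sketch of what :func:`instantiate_class` does (a simplified stand-in written here for illustration, not the library's actual implementation), it resolves ``class_path`` with ``importlib`` and calls the class with the positional arguments plus ``init_args``:

```python
import importlib


def instantiate_class(args, init):
    """Simplified sketch of pytorch_lightning.cli.instantiate_class: import
    the class named by ``class_path`` and instantiate it with the given
    positional arguments plus ``init_args``."""
    kwargs = init.get("init_args", {})
    if not isinstance(args, tuple):
        args = (args,)
    class_module, class_name = init["class_path"].rsplit(".", 1)
    module = importlib.import_module(class_module)
    cls = getattr(module, class_name)
    return cls(*args, **kwargs)


# Stdlib example so the sketch runs anywhere; in the CLI the positional
# argument would be self.parameters() and class_path an optimizer class.
dq = instantiate_class(([1, 2, 3],), {"class_path": "collections.deque", "init_args": {"maxlen": 2}})
```

In the CLI the ``init`` dictionary is exactly the parsed ``optimizer*_init`` value, so the model code never needs to import the configured optimizer class itself.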
18 changes: 9 additions & 9 deletions docs/source-pytorch/cli/lightning_cli_expert.rst
@@ -15,7 +15,7 @@
pass


-class LightningCLI(pl.utilities.cli.LightningCLI):
+class LightningCLI(pl.cli.LightningCLI):
def __init__(self, *args, trainer_class=NoFitTrainer, run=False, **kwargs):
super().__init__(*args, trainer_class=trainer_class, run=run, **kwargs)

@@ -62,23 +62,23 @@ Eliminate config boilerplate (Advanced)
Customize the LightningCLI
**************************

-The init parameters of the :class:`~pytorch_lightning.utilities.cli.LightningCLI` class can be used to customize some
+The init parameters of the :class:`~pytorch_lightning.cli.LightningCLI` class can be used to customize some
things, namely: the description of the tool, enabling parsing of environment variables and additional arguments to
instantiate the trainer and configuration parser.

Nevertheless, the init arguments are not enough for many use cases. For this reason the class is designed so that it can
be extended to customize different parts of the command line tool. The argument parser class used by
-:class:`~pytorch_lightning.utilities.cli.LightningCLI` is
-:class:`~pytorch_lightning.utilities.cli.LightningArgumentParser` which is an extension of python's argparse, thus
+:class:`~pytorch_lightning.cli.LightningCLI` is
+:class:`~pytorch_lightning.cli.LightningArgumentParser` which is an extension of python's argparse, thus
adding arguments can be done using the :func:`add_argument` method. In contrast to argparse it has additional methods to
add arguments, for example :func:`add_class_arguments` adds all arguments from the init of a class, though requiring
parameters to have type hints. For more details about this please refer to the `respective documentation
<https://jsonargparse.readthedocs.io/en/stable/#classes-methods-and-functions>`_.
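The requirement that init parameters carry type hints can be sketched with the standard library alone (``Cosine`` is a hypothetical class invented here; the real argument building happens inside jsonargparse):

```python
import inspect


class Cosine:
    # Hypothetical scheduler-like class; the type hints on __init__ are what
    # add_class_arguments needs in order to build typed CLI options.
    def __init__(self, max_epochs: int, eta_min: float = 0.0):
        self.max_epochs = max_epochs
        self.eta_min = eta_min


# A parser can only derive argument types from the signature if hints exist:
sig = inspect.signature(Cosine.__init__)
hints = {name: p.annotation for name, p in sig.parameters.items() if name != "self"}
```

A parameter without a hint would show up here as ``inspect.Parameter.empty``, which is why ``add_class_arguments`` requires annotated signatures.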

-The :class:`~pytorch_lightning.utilities.cli.LightningCLI` class has the
-:meth:`~pytorch_lightning.utilities.cli.LightningCLI.add_arguments_to_parser` method which can be implemented to include
+The :class:`~pytorch_lightning.cli.LightningCLI` class has the
+:meth:`~pytorch_lightning.cli.LightningCLI.add_arguments_to_parser` method which can be implemented to include
more arguments. After parsing, the configuration is stored in the :code:`config` attribute of the class instance. The
-:class:`~pytorch_lightning.utilities.cli.LightningCLI` class also has two methods that can be used to run code before
+:class:`~pytorch_lightning.cli.LightningCLI` class also has two methods that can be used to run code before
and after the trainer runs: :code:`before_<subcommand>` and :code:`after_<subcommand>`.
A realistic example for these would be to send an email before and after the execution.
The code for the :code:`fit` subcommand would be something like:
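The actual snippet is collapsed in this diff view. As a dependency-free stand-in (written here for illustration; the real base class is ``pytorch_lightning.cli.LightningCLI`` and the email helpers are hypothetical), the hook pattern the text describes looks like this:

```python
class MiniCLI:
    """Stand-in illustrating the before_<subcommand>/after_<subcommand> hooks."""

    def __init__(self):
        self.events = []

    def run_subcommand(self, name):
        # Call before_<name> if defined, then the subcommand, then after_<name>
        before = getattr(self, f"before_{name}", None)
        if before is not None:
            before()
        getattr(self, name)()
        after = getattr(self, f"after_{name}", None)
        if after is not None:
            after()

    def fit(self):
        self.events.append("fit")


class EmailCLI(MiniCLI):
    # Hypothetical notification hooks: record events instead of sending mail
    def before_fit(self):
        self.events.append("email: fit starting")

    def after_fit(self):
        self.events.append("email: fit finished")


cli = EmailCLI()
cli.run_subcommand("fit")
```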
@@ -104,7 +104,7 @@ instantiating the trainer class can be found in :code:`self.config['fit']['train

.. tip::

-    Have a look at the :class:`~pytorch_lightning.utilities.cli.LightningCLI` class API reference to learn about other
+    Have a look at the :class:`~pytorch_lightning.cli.LightningCLI` class API reference to learn about other
methods that can be extended to customize a CLI.

----
@@ -211,7 +211,7 @@ A more compact version that avoids writing a dictionary would be:
************************
Connect two config files
************************
-Another case in which it might be desired to extend :class:`~pytorch_lightning.utilities.cli.LightningCLI` is that the
+Another case in which it might be desired to extend :class:`~pytorch_lightning.cli.LightningCLI` is that the
model and data module depend on a common parameter. For example, in some cases both classes need to know the
:code:`batch_size`. Giving the same value twice in a config file is a burden and error prone. To avoid this, the
parser can be configured so that a value is only given once and then propagated accordingly. With a tool implemented
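The linking described above can be sketched as a config transformation (the keys below are hypothetical, chosen only to illustrate the idea). Without a link, the value must be kept in sync by hand:

```yaml
model:
  batch_size: 32
data:
  batch_size: 32
```

With a link configured in ``add_arguments_to_parser``, the value is given once and the parser fills in the other consumer at instantiation time:

```yaml
data:
  batch_size: 32   # propagated to the model by the configured link
```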
6 changes: 3 additions & 3 deletions docs/source-pytorch/cli/lightning_cli_faq.rst
@@ -15,7 +15,7 @@
pass


-class LightningCLI(pl.utilities.cli.LightningCLI):
+class LightningCLI(pl.cli.LightningCLI):
def __init__(self, *args, trainer_class=NoFitTrainer, run=False, **kwargs):
super().__init__(*args, trainer_class=trainer_class, run=run, **kwargs)

@@ -65,7 +65,7 @@ there is a failure an exception is raised and the full stack trace printed.
Reproducibility with the LightningCLI
*************************************
The topic of reproducibility is complex and it is impossible to guarantee reproducibility by just providing a class that
-people can use in unexpected ways. Nevertheless, the :class:`~pytorch_lightning.utilities.cli.LightningCLI` tries to
+people can use in unexpected ways. Nevertheless, the :class:`~pytorch_lightning.cli.LightningCLI` tries to
give a framework and recommendations to make reproducibility simpler.

When an experiment is run, it is good practice to use a stable version of the source code, either being a released
@@ -85,7 +85,7 @@ For every CLI implemented, users are encouraged to learn how to run it by readin
:code:`--help` option and use the :code:`--print_config` option to guide the writing of config files. A few more details
that might not be clear by only reading the help are the following.

-:class:`~pytorch_lightning.utilities.cli.LightningCLI` is based on argparse and as such follows the same arguments style
+:class:`~pytorch_lightning.cli.LightningCLI` is based on argparse and as such follows the same arguments style
as many POSIX command line tools. Long options are prefixed with two dashes and their corresponding values should be
provided with an empty space or an equal sign, as :code:`--option value` or :code:`--option=value`. Command line options
are parsed from left to right, therefore if a setting appears multiple times the rightmost value will override
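The left-to-right override behavior comes straight from argparse, so it can be checked with the standard library alone (the ``--option`` flag is a made-up example, not a real CLI option):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--option")

# Both "--option value" and "--option=value" forms are accepted, and when
# the option is repeated the rightmost occurrence wins:
ns = parser.parse_args(["--option", "first", "--option=second"])
print(ns.option)  # "second"
```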
2 changes: 1 addition & 1 deletion docs/source-pytorch/cli/lightning_cli_intermediate.rst
@@ -82,7 +82,7 @@ The simplest way to control a model with the CLI is to wrap it in the LightningC

# main.py
import torch
-    from pytorch_lightning.utilities.cli import LightningCLI
+    from pytorch_lightning.cli import LightningCLI

# simple demo classes for your convenience
from pytorch_lightning.demos.boring_classes import DemoModel, BoringDataModule
@@ -23,9 +23,9 @@
from torchmetrics import Accuracy

from pytorch_lightning import cli_lightning_logo, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.boring_classes import Net
from pytorch_lightning.demos.mnist_datamodule import MNIST
-from pytorch_lightning.utilities.cli import LightningCLI

DATASETS_PATH = path.join(path.dirname(__file__), "..", "..", "Datasets")

@@ -23,9 +23,9 @@
from torchmetrics import Accuracy

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.boring_classes import Net
from pytorch_lightning.demos.mnist_datamodule import MNIST
-from pytorch_lightning.utilities.cli import LightningCLI

DATASETS_PATH = path.join(path.dirname(__file__), "..", "..", "Datasets")

2 changes: 1 addition & 1 deletion examples/pl_basics/autoencoder.py
@@ -24,8 +24,8 @@
from torch.utils.data import DataLoader, random_split

from pytorch_lightning import callbacks, cli_lightning_logo, LightningDataModule, LightningModule, Trainer
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.mnist_datamodule import MNIST
-from pytorch_lightning.utilities.cli import LightningCLI
from pytorch_lightning.utilities.imports import _TORCHVISION_AVAILABLE
from pytorch_lightning.utilities.rank_zero import rank_zero_only

2 changes: 1 addition & 1 deletion examples/pl_basics/backbone_image_classifier.py
@@ -23,8 +23,8 @@
from torch.utils.data import DataLoader, random_split

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.mnist_datamodule import MNIST
-from pytorch_lightning.utilities.cli import LightningCLI
from pytorch_lightning.utilities.imports import _TORCHVISION_AVAILABLE

if _TORCHVISION_AVAILABLE:
2 changes: 1 addition & 1 deletion examples/pl_basics/profiler_example.py
@@ -31,8 +31,8 @@
import torchvision.transforms as T

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.profilers.pytorch import PyTorchProfiler
-from pytorch_lightning.utilities.cli import LightningCLI

DEFAULT_CMD_LINE = (
"fit",
@@ -56,7 +56,7 @@

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
from pytorch_lightning.callbacks.finetuning import BaseFinetuning
-from pytorch_lightning.utilities.cli import LightningCLI
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.utilities.rank_zero import rank_zero_info

log = logging.getLogger(__name__)
2 changes: 1 addition & 1 deletion examples/pl_domain_templates/imagenet.py
@@ -47,8 +47,8 @@

from pytorch_lightning import LightningModule
from pytorch_lightning.callbacks import ModelCheckpoint, TQDMProgressBar
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.strategies import ParallelStrategy
-from pytorch_lightning.utilities.cli import LightningCLI


class ImageNetLightningModel(LightningModule):
2 changes: 1 addition & 1 deletion examples/pl_hpu/mnist_sample.py
@@ -16,9 +16,9 @@
from torch.nn import functional as F

from pytorch_lightning import LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.mnist_datamodule import MNISTDataModule
from pytorch_lightning.plugins import HPUPrecisionPlugin
-from pytorch_lightning.utilities.cli import LightningCLI


class LitClassifier(LightningModule):
2 changes: 1 addition & 1 deletion examples/pl_integrations/dali_image_classifier.py
@@ -23,8 +23,8 @@
from torch.utils.data import random_split

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.demos.mnist_datamodule import MNIST
-from pytorch_lightning.utilities.cli import LightningCLI
from pytorch_lightning.utilities.imports import _DALI_AVAILABLE, _TORCHVISION_AVAILABLE

if _TORCHVISION_AVAILABLE:
2 changes: 1 addition & 1 deletion examples/pl_servable_module/production.py
@@ -12,8 +12,8 @@
from PIL import Image as PILImage

from pytorch_lightning import cli_lightning_logo, LightningDataModule, LightningModule
+from pytorch_lightning.cli import LightningCLI
from pytorch_lightning.serve import ServableModule, ServableModuleValidator
-from pytorch_lightning.utilities.cli import LightningCLI

DATASETS_PATH = path.join(path.dirname(__file__), "..", "..", "Datasets")

2 changes: 2 additions & 0 deletions src/pytorch_lightning/CHANGELOG.md
@@ -173,6 +173,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Deprecated LightningCLI's registries in favor of importing the respective package ([#13221](https://github.com/PyTorchLightning/pytorch-lightning/pull/13221))


+- Deprecated public utilities in `pytorch_lightning.utilities.cli.LightningCLI` in favor of equivalent copies in `pytorch_lightning.cli.LightningCLI` ([#13767](https://github.com/PyTorchLightning/pytorch-lightning/pull/13767))


- Deprecated `pytorch_lightning.profiler` in favor of `pytorch_lightning.profilers` ([#12308](https://github.com/PyTorchLightning/pytorch-lightning/pull/12308))
