Update plugins doc
kaushikb11 committed Mar 29, 2022
1 parent 83c2def · commit de8ed15
Showing 2 changed files with 11 additions and 32 deletions.
docs/source/common/checkpointing.rst (2 changes: 1 addition & 1 deletion)
@@ -392,7 +392,7 @@ Custom Checkpoint IO Plugin
 .. note::

-    Some ``TrainingTypePlugins`` like ``DeepSpeedStrategy`` do not support custom ``CheckpointIO`` as checkpointing logic is not modifiable.
+    Some strategies like :class:`~pytorch_lightning.strategies.deepspeed.DeepSpeedStrategy` do not support custom :class:`~pytorch_lightning.plugins.io.checkpoint_plugin.CheckpointIO` as checkpointing logic is not modifiable.

 -----------
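
For readers unfamiliar with the ``CheckpointIO`` interface referenced in the note above, a minimal sketch of a custom plugin follows. It assumes the ``CheckpointIO`` base class from ``pytorch_lightning.plugins.io.checkpoint_plugin`` (the class the updated note points at) and the ``Trainer(plugins=...)`` hook; the class name and file handling are illustrative only, and exact method signatures can differ slightly across Lightning versions.

.. code-block:: python

    import os

    import torch
    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins.io.checkpoint_plugin import CheckpointIO


    class TorchFileCheckpointIO(CheckpointIO):
        """Illustrative plugin that round-trips checkpoints with torch.save/torch.load."""

        def save_checkpoint(self, checkpoint, path, storage_options=None):
            torch.save(checkpoint, path)

        def load_checkpoint(self, path, map_location=None):
            return torch.load(path, map_location=map_location)

        def remove_checkpoint(self, path):
            os.remove(path)


    # Works with strategies that delegate checkpointing; DeepSpeed and similar
    # strategies manage their own checkpoint format and ignore a custom CheckpointIO.
    trainer = Trainer(plugins=[TorchFileCheckpointIO()])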

docs/source/extensions/plugins.rst (41 changes: 10 additions & 31 deletions)
@@ -10,50 +10,27 @@ Plugins allow custom integrations to the internals of the Trainer such as a cust
 cluster environment implementation.

 Under the hood, the Lightning Trainer is using plugins in the training routine, added automatically
-depending on the provided Trainer arguments. For example:
+depending on the provided Trainer arguments.

-.. code-block:: python
-
-    # accelerator: GPUAccelerator
-    # training strategy: DDPStrategy
-    # precision: NativeMixedPrecisionPlugin
-    trainer = Trainer(accelerator="gpu", devices=4, precision=16)
-
-We expose Accelerators and Plugins mainly for expert users that want to extend Lightning for:
+There are three types of Plugins in Lightning with different responsibilities:

-- New hardware (like TPU plugin)
-- Distributed backends (e.g. a backend not yet supported by
-  `PyTorch <https://pytorch.org/docs/stable/distributed.html#backends>`_ itself)
-- Clusters (e.g. customized access to the cluster's environment interface)
+- Precision Plugins
+- CheckpointIO Plugins
+- Cluster Environments (e.g. customized access to the cluster's environment interface)

-There are three types of Plugins in Lightning with different responsibilities:
-
-Precision Plugins
------------------
-
-We expose precision plugins for the users
+We provide precision plugins for the users so that they can benefit from numerical representations with lower precision than
+32-bit floating-point or higher precision, such as 64-bit floating-point.

-.. code-block:: python
-
-    # precision: FP16Plugin
-    trainer = Trainer(precision=16)
-
-- Precision Plugins
-- CheckpointIO Plugins
-- Cluster Environments (e.g. customized access to the cluster's environment interface)


 The full list of built-in plugins is listed below.


 .. warning:: The Plugin API is in beta and subject to change.
     For help setting up custom plugins/accelerators, please reach out to us at **support@pytorchlightning.ai**


 Precision Plugins
 -----------------
+The full list of built-in precision plugins is listed below.

 .. currentmodule:: pytorch_lightning.plugins.precision
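
The removed snippet above showed how a precision plugin is selected implicitly through the ``precision`` Trainer argument. A short sketch of the same idea: ``precision=16`` is taken from the removed example, while ``precision=64`` mirrors the 64-bit case mentioned in the added text and is shown here only as an illustration.

.. code-block:: python

    from pytorch_lightning import Trainer

    # 16-bit mixed precision: Lightning selects the matching precision plugin internally.
    trainer_fp16 = Trainer(accelerator="gpu", devices=1, precision=16)

    # 64-bit (double) precision for numerically sensitive models.
    trainer_fp64 = Trainer(precision=64)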

@@ -97,6 +74,8 @@ You could learn more about custom checkpointing with Lightning :ref:`here <../co
 Cluster Environments
 --------------------

+Clusters (e.g. customized access to the cluster's environment interface)
+
 .. currentmodule:: pytorch_lightning.plugins.environments

 .. autosummary::
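
As a rough illustration of the cluster environment hook described above, the sketch below passes a built-in environment to the Trainer explicitly. ``LightningEnvironment`` stands in for whatever environment a given cluster actually needs; a site-specific scheduler would instead subclass :class:`~pytorch_lightning.plugins.environments.ClusterEnvironment` and pass that subclass the same way.

.. code-block:: python

    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins.environments import LightningEnvironment

    # Explicitly choose the cluster environment instead of relying on auto-detection;
    # a custom scheduler would supply its own ClusterEnvironment subclass here.
    trainer = Trainer(accelerator="cpu", devices=2, strategy="ddp", plugins=[LightningEnvironment()])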