Commit

add tag
rohitgr7 committed Mar 29, 2022
1 parent 4aa05f3 commit 6ce3a1e
Showing 4 changed files with 5 additions and 10 deletions.
10 changes: 2 additions & 8 deletions docs/source/advanced/model_parallel.rst
@@ -296,7 +296,6 @@ Below we show an example of running `ZeRO-Offload <https://www.deepspeed.ai/tuto
.. code-block:: python
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
model = MyModel()
trainer = Trainer(accelerator="gpu", devices=4, strategy="deepspeed_stage_2_offload", precision=16)
@@ -341,7 +340,6 @@ For even more speed benefit, DeepSpeed offers an optimized CPU version of ADAM c
import pytorch_lightning
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
from deepspeed.ops.adam import DeepSpeedCPUAdam
@@ -385,7 +383,6 @@ Also please have a look at our :ref:`deepspeed-zero-stage-3-tips` which contains
.. code-block:: python
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
from deepspeed.ops.adam import FusedAdam
@@ -409,7 +406,6 @@ You can also use the Lightning Trainer to run predict or evaluate with DeepSpeed
.. code-block:: python
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
class MyModel(pl.LightningModule):
@@ -435,7 +431,6 @@ This reduces the time taken to initialize very large models, as well as ensure w
import torch.nn as nn
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
from deepspeed.ops.adam import FusedAdam
@@ -549,7 +544,6 @@ This saves memory when training larger models, however requires using a checkpoi
.. code-block:: python
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy
import deepspeed
@@ -686,7 +680,7 @@ In some cases you may want to define your own DeepSpeed Config, to access all pa
 }
 model = MyModel()
-trainer = Trainer(accelerator="gpu", devices=4, strategy=DeepSpeedStrategy(deepspeed_config), precision=16)
+trainer = Trainer(accelerator="gpu", devices=4, strategy=DeepSpeedStrategy(config=deepspeed_config), precision=16)
 trainer.fit(model)
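The hunk above switches the docs to passing the user-defined dict through the explicit ``config`` keyword of ``DeepSpeedStrategy``. As a rough illustration of what such a dict contains, here is a minimal sketch; the keys mirror the DeepSpeed JSON schema (``zero_optimization``, ``train_micro_batch_size_per_gpu``), but the specific values are illustrative assumptions, not taken from this commit:

```python
# Illustrative DeepSpeed-style config dict. Key names follow the DeepSpeed
# JSON schema; the values below are example choices, not the docs' own.
deepspeed_config = {
    "zero_optimization": {
        "stage": 2,  # ZeRO stage 2: partition optimizer state and gradients
        "offload_optimizer": {"device": "cpu"},  # keep optimizer state in CPU memory
    },
    "train_micro_batch_size_per_gpu": 8,
}

# In the docs' example this dict would then be handed to the strategy:
#   trainer = Trainer(strategy=DeepSpeedStrategy(config=deepspeed_config), ...)
print(deepspeed_config["zero_optimization"]["stage"])
```

The explicit ``config=`` keyword (rather than a positional argument) is what the diff changes, and it keeps the call unambiguous as the strategy grows other parameters.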
@@ -699,7 +693,7 @@ We support taking the config as a json formatted file:
 model = MyModel()
 trainer = Trainer(
-    accelerator="gpu", devices=4, strategy=DeepSpeedStrategy("/path/to/deepspeed_config.json"), precision=16
+    accelerator="gpu", devices=4, strategy=DeepSpeedStrategy(config="/path/to/deepspeed_config.json"), precision=16
 )
 trainer.fit(model)
1 change: 1 addition & 0 deletions docs/source/common/checkpointing.rst
@@ -315,6 +315,7 @@ and the Lightning Team will be happy to integrate/help integrate it.

 -----------
 
+.. _customize_checkpointing:
 
 ***********************
 Customize Checkpointing
2 changes: 1 addition & 1 deletion docs/source/common/lightning_module.rst
@@ -1056,7 +1056,7 @@ automatic_optimization
 When set to ``False``, Lightning does not automate the optimization process. This means you are responsible for handling
 your optimizers. However, we do take care of precision and any accelerators used.
 
-See :ref:`manual optimization<common/optimization:Manual optimization>` for details.
+See :ref:`manual optimization <common/optimization:Manual optimization>` for details.
 
 .. code-block:: python
2 changes: 1 addition & 1 deletion docs/source/extensions/plugins.rst
@@ -71,7 +71,7 @@ Below is a list of built-in plugins for checkpointing.
     TorchCheckpointIO
     XLACheckpointIO
 
-You could learn more about custom checkpointing with Lightning :ref:`here <../common/checkpointing:Customize Checkpointing>`.
+You could learn more about custom checkpointing with Lightning :ref:`here <customize_checkpointing>`.
 
 Cluster Environments
 --------------------
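The plugins hunk above points readers at custom checkpointing. As a rough, standalone illustration of the checkpoint-IO contract those plugins implement (save / load / remove by path), here is a sketch that uses ``pickle`` in place of framework serialization. The class name and the use of ``pickle`` are assumptions for illustration only — it deliberately does not subclass or depend on pytorch_lightning's actual ``CheckpointIO`` base class:

```python
import os
import pickle
import tempfile


class PickleCheckpointIO:
    """Toy stand-in for a checkpoint-IO plugin; not Lightning's real class."""

    def save_checkpoint(self, checkpoint: dict, path: str) -> None:
        # Persist the checkpoint dict to disk.
        with open(path, "wb") as f:
            pickle.dump(checkpoint, f)

    def load_checkpoint(self, path: str) -> dict:
        # Restore the checkpoint dict from disk.
        with open(path, "rb") as f:
            return pickle.load(f)

    def remove_checkpoint(self, path: str) -> None:
        # Delete a checkpoint that is no longer needed.
        os.remove(path)


# Round-trip a fake checkpoint through the plugin.
io = PickleCheckpointIO()
ckpt_path = os.path.join(tempfile.mkdtemp(), "ckpt.pkl")
io.save_checkpoint({"epoch": 3, "state_dict": {}}, ckpt_path)
restored = io.load_checkpoint(ckpt_path)
print(restored["epoch"])  # -> 3
io.remove_checkpoint(ckpt_path)
```

The point of the plugin abstraction is exactly this separation: the trainer decides *when* to checkpoint, while the IO object decides *how* bytes reach storage.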
