Docs new section #2236

Merged · 8 commits · Jun 18, 2020

Changes from 6 commits
19 changes: 15 additions & 4 deletions CHANGELOG.md
@@ -4,6 +4,17 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

+## [unreleased] - YYYY-MM-DD
+
+### Added
+
+### Changed
+
+### Deprecated
+
+### Removed
+
+### Fixed

## [0.8.0] - 2020-06-18

@@ -25,11 +36,11 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added a model hook `transfer_batch_to_device` that enables moving custom data structures to the target device ([1756](https://github.com/PyTorchLightning/pytorch-lightning/pull/1756))
- Added [black](https://black.readthedocs.io/en/stable/) formatter for the code with code-checker on pull ([1610](https://github.com/PyTorchLightning/pytorch-lightning/pull/1610))
- Added back the slow spawn ddp implementation as `ddp_spawn` ([#2115](https://github.com/PyTorchLightning/pytorch-lightning/pull/2115))
-- Added loading checkpoints from URLs ([#1667](https://github.com/PyTorchLightning/pytorch-lightning/issues/1667))
+- Added loading checkpoints from URLs ([#1667](https://github.com/PyTorchLightning/pytorch-lightning/pull/1667))
- Added a callback method `on_keyboard_interrupt` for handling KeyboardInterrupt events during training ([#2134](https://github.com/PyTorchLightning/pytorch-lightning/pull/2134))
- Added a decorator `auto_move_data` that moves data to the correct device when using the LightningModule for inference ([#1905](https://github.com/PyTorchLightning/pytorch-lightning/pull/1905))
-- Added `ckpt_path` option to `LightningModule.test(...)` to load particular checkpoint ([#2190](https://github.com/PyTorchLightning/pytorch-lightning/issues/2190))
-- Added `setup` and `teardown` hooks for model ([#2229](https://github.com/PyTorchLightning/pytorch-lightning/issues/2229))
+- Added `ckpt_path` option to `LightningModule.test(...)` to load particular checkpoint ([#2190](https://github.com/PyTorchLightning/pytorch-lightning/pull/2190))
+- Added `setup` and `teardown` hooks for model ([#2229](https://github.com/PyTorchLightning/pytorch-lightning/pull/2229))

### Changed

@@ -67,7 +78,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Run graceful training teardown on interpreter exit ([#1631](https://github.com/PyTorchLightning/pytorch-lightning/pull/1631))
- Fixed user warning when apex was used together with learning rate schedulers ([#1873](https://github.com/PyTorchLightning/pytorch-lightning/pull/1873))
-- Fixed multiple calls of `EarlyStopping` callback ([#1751](https://github.com/PyTorchLightning/pytorch-lightning/issues/1751))
+- Fixed multiple calls of `EarlyStopping` callback ([#1863](https://github.com/PyTorchLightning/pytorch-lightning/pull/1863))
- Fixed an issue with `Trainer.from_argparse_args` when passing in unknown Trainer args ([#1932](https://github.com/PyTorchLightning/pytorch-lightning/pull/1932))
- Fixed bug related to logger not being reset correctly for model after tuner algorithms ([#1933](https://github.com/PyTorchLightning/pytorch-lightning/pull/1933))
- Fixed root node resolution for SLURM cluster with dash in host name ([#1954](https://github.com/PyTorchLightning/pytorch-lightning/pull/1954))
25 changes: 15 additions & 10 deletions README.md
@@ -21,12 +21,8 @@
-->
</div>

----
-## Trending contributors
+----------------

-[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/0)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/0)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/1)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/1)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/2)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/2)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/3)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/3)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/4)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/4)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/5)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/5)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/6)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/6)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/7)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/7)
-
----
## Continuous Integration
<center>

@@ -188,7 +184,7 @@ Although your research/production project might start simple, once you add thing

Lightning is also part of the [PyTorch ecosystem](https://pytorch.org/ecosystem/) which requires projects to have solid testing, documentation and support.

----
+----------------

## README Table of Contents
- [How do I use it](https://github.com/PytorchLightning/pytorch-lightning#how-do-i-do-use-it)
@@ -204,7 +200,7 @@ Lightning is also part of the [PyTorch ecosystem](https://pytorch.org/ecosystem/
- [Lightning team](https://github.com/PytorchLightning/pytorch-lightning#lightning-team)
- [FAQ](https://github.com/PytorchLightning/pytorch-lightning#faq)

----
+----------------

## Realistic example
Here's how you would organize a realistic PyTorch project into Lightning.
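
The full example is collapsed in this diff view; as a hedged sketch of the shape such a project takes under the 0.8.x API (the class name and hyperparameters below are hypothetical):

```python
import torch
from torch.nn import functional as F
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.l1 = torch.nn.Linear(28 * 28, 10)

    def forward(self, x):
        # flatten the image and apply a single linear layer
        return torch.relu(self.l1(x.view(x.size(0), -1)))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        return {'loss': loss}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# train_loader is a regular PyTorch DataLoader over your dataset
# trainer = pl.Trainer(max_epochs=5)
# trainer.fit(LitClassifier(), train_loader)
```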
@@ -369,7 +365,7 @@ Check out this awesome list of research papers and implementations done with Lig
Check out our [introduction guide](https://pytorch-lightning.readthedocs.io/en/latest/introduction_guide.html) to get started.
Or jump straight into [our tutorials](https://pytorch-lightning.readthedocs.io/en/latest/#tutorials).

----
+----------------

## Asking for help
Welcome to the Lightning community!
@@ -380,7 +376,8 @@ If you have any questions, feel free to:
3. [Ask on stackoverflow](https://stackoverflow.com/questions/ask?guided=false) with the tag pytorch-lightning.
4. [Join our slack](https://join.slack.com/t/pytorch-lightning/shared_invite/zt-f6bl2l0l-JYMK3tbAgAmGRrlNr00f1A).

----
+----------------

## FAQ
**How do I use Lightning for rapid research?**
[Here's a walk-through](https://pytorch-lightning.readthedocs.io/en/latest/introduction_guide.html)
@@ -447,6 +444,14 @@ pip install https://github.com/PytorchLightning/pytorch-lightning/archive/0.X.Y.
- Adrian Wälchli [(awaelchli)](https://github.com/awaelchli)
- Nicki Skafte [(skaftenicki)](https://github.com/SkafteNicki)

+----------------
+
+### Trending contributors
+
+[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/0)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/0)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/1)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/1)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/2)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/2)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/3)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/3)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/4)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/4)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/5)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/5)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/6)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/6)[![](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/images/7)](https://sourcerer.io/fame/williamFalcon/pytorchlightning/pytorch-lightning/links/7)
+
+----------------

#### Funding
Building open-source software with only a few part-time people is hard! We've secured funding to make sure we can
hire a full-time staff, attend conferences, and move faster through implementing features you request.
@@ -463,7 +468,7 @@ If you want to cite the framework feel free to use this (but only if you loved i
@article{falcon2019pytorch,
title={PyTorch Lightning},
author={Falcon, WA},
-journal={GitHub. Note: https://github. com/williamFalcon/pytorch-lightning Cited by},
+journal={GitHub. Note: https://github.com/PyTorchLightning/pytorch-lightning Cited by},
volume={3},
year={2019}
}
2 changes: 1 addition & 1 deletion docs/source/apex.rst
@@ -8,7 +8,7 @@
Lightning offers 16-bit training for CPUs, GPUs and TPUs.

GPU 16-bit
------------
+----------
16 bit precision can cut your memory footprint by half.
If using volta architecture GPUs it can give a dramatic training speed-up as well.
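
A minimal sketch of turning this on, assuming the `precision` Trainer flag from this release (GPU 16-bit at this point in the project's history relied on an NVIDIA apex installation):

```python
from pytorch_lightning import Trainer

# 16-bit (mixed) precision on one GPU; requires apex in this release
trainer = Trainer(gpus=1, precision=16)
```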

12 changes: 6 additions & 6 deletions docs/source/callbacks.rst
@@ -46,7 +46,7 @@ Example:
We successfully extended functionality without polluting our super clean
:class:`~pytorch_lightning.core.LightningModule` research code.
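
For readers skimming the diff, a callback along these lines is a minimal sketch against the 0.8.x `Callback` API (the class name and printed messages are made up):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Callback

class PrintingCallback(Callback):
    def on_train_start(self, trainer, pl_module):
        print('Training is starting')

    def on_train_end(self, trainer, pl_module):
        print('Training is ending')

# the callback is attached to the Trainer, keeping the LightningModule clean
trainer = Trainer(callbacks=[PrintingCallback()])
```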

----
+----------------

.. automodule:: pytorch_lightning.callbacks.base
:noindex:
@@ -56,7 +56,7 @@
_abc_impl,
check_monitor_top_k,

----
+----------------

.. automodule:: pytorch_lightning.callbacks.early_stopping
:noindex:
@@ -66,7 +66,7 @@
_abc_impl,
check_monitor_top_k,

----
+----------------

.. automodule:: pytorch_lightning.callbacks.gradient_accumulation_scheduler
:noindex:
@@ -76,15 +76,15 @@
_abc_impl,
check_monitor_top_k,

----
+----------------

.. automodule:: pytorch_lightning.callbacks.lr_logger
:noindex:
:exclude-members:
_extract_lr,
_find_names

----
+----------------

.. automodule:: pytorch_lightning.callbacks.model_checkpoint
:noindex:
@@ -94,7 +94,7 @@
_abc_impl,
check_monitor_top_k,

----
+----------------

.. automodule:: pytorch_lightning.callbacks.progress
:noindex:
14 changes: 6 additions & 8 deletions docs/source/experiment_logging.rst
@@ -7,8 +7,6 @@
Experiment Logging
==================

----
-
Comet.ml
^^^^^^^^

@@ -49,7 +47,7 @@ The :class:`~pytorch_lightning.loggers.CometLogger` is available anywhere except
.. seealso::
:class:`~pytorch_lightning.loggers.CometLogger` docs.

----
+----------------

MLflow
^^^^^^
@@ -76,7 +74,7 @@ Then configure the logger and pass it to the :class:`~pytorch_lightning.trainer.
.. seealso::
:class:`~pytorch_lightning.loggers.MLFlowLogger` docs.
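
The configuration lines themselves are collapsed above; a hedged sketch of what they look like (the `experiment_name` and `tracking_uri` values here are placeholders, not the docs' own):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger

# log runs to a local mlruns directory; swap the URI for a tracking server
mlf_logger = MLFlowLogger(experiment_name='default', tracking_uri='file:./ml-runs')
trainer = Trainer(logger=mlf_logger)
```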

----
+----------------

Neptune.ai
^^^^^^^^^^
@@ -116,7 +114,7 @@ The :class:`~pytorch_lightning.loggers.NeptuneLogger` is available anywhere exce
.. seealso::
:class:`~pytorch_lightning.loggers.NeptuneLogger` docs.

----
+----------------

allegro.ai TRAINS
^^^^^^^^^^^^^^^^^
@@ -160,7 +158,7 @@ The :class:`~pytorch_lightning.loggers.TrainsLogger` is available anywhere in yo
.. seealso::
:class:`~pytorch_lightning.loggers.TrainsLogger` docs.

----
+----------------

Tensorboard
^^^^^^^^^^^
@@ -186,7 +184,7 @@ The :class:`~pytorch_lightning.loggers.TensorBoardLogger` is available anywhere
.. seealso::
:class:`~pytorch_lightning.loggers.TensorBoardLogger` docs.
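
Again as a sketch, with placeholder `save_dir` and `name` values rather than the docs' own:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger

# event files land under lightning_logs/my_model/version_<n>
logger = TensorBoardLogger(save_dir='lightning_logs', name='my_model')
trainer = Trainer(logger=logger)
```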

----
+----------------

Test Tube
^^^^^^^^^
@@ -221,7 +219,7 @@ The :class:`~pytorch_lightning.loggers.TestTubeLogger` is available anywhere exc
.. seealso::
:class:`~pytorch_lightning.loggers.TestTubeLogger` docs.

----
+----------------

Weights and Biases
^^^^^^^^^^^^^^^^^^
4 changes: 2 additions & 2 deletions docs/source/fast_training.rst
@@ -8,7 +8,7 @@ Fast Training
There are multiple options to speed up different parts of the training by choosing to train
on a subset of data. This could be done for speed or debugging purposes.

----
+----------------

Check validation every n epochs
-------------------------------
@@ -19,7 +19,7 @@ If you have a small dataset you might want to check validation every n epochs
# DEFAULT
trainer = Trainer(check_val_every_n_epoch=1)

----
+----------------

Force training for min or max epochs
------------------------------------
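
The body of this section is collapsed in the diff view; as a hedged sketch, the two Trainer flags involved, with what I believe are the 0.8.x defaults:

```python
from pytorch_lightning import Trainer

# train for at least min_epochs, and stop after at most max_epochs
trainer = Trainer(min_epochs=1, max_epochs=1000)
```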
4 changes: 2 additions & 2 deletions docs/source/hooks.rst
@@ -12,7 +12,7 @@ To enable a hook, simply override the method in your LightningModule and the tra

3. Add it in the correct place in :mod:`pytorch_lightning.trainer` where it should be called.
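
As a minimal sketch of the override step (the hook body here is made up; `on_epoch_start` is one of the existing model hooks):

```python
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def on_epoch_start(self):
        # runs at the start of every epoch once overridden
        print('epoch is starting')
```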

----
+----------------

Hooks lifecycle
---------------
@@ -72,7 +72,7 @@ Test loop
- ``torch.set_grad_enabled(True)``
- :meth:`~pytorch_lightning.core.hooks.ModelHooks.on_post_performance_check`

----
+----------------

General hooks
-------------
6 changes: 3 additions & 3 deletions docs/source/tpu.rst
@@ -5,13 +5,13 @@ Lightning supports running on TPUs. At this moment, TPUs are available
on Google Cloud (GCP), Google Colab and Kaggle Environments. For more information on TPUs
`watch this video <https://www.youtube.com/watch?v=kPMpmcl_Pyw>`_.

----
+----------------

Live demo
----------
Check out this `Google Colab <https://colab.research.google.com/drive/1-_LKx4HwAxl5M6xPJmqAAu444LTDQoa3>`_ to see how to train MNIST on TPUs.

----
+----------------

TPU Terminology
---------------
@@ -23,7 +23,7 @@ A TPU pod hosts many TPUs on it. Currently, TPU pod v2 has 2048 cores!
You can request a full pod from Google cloud or a "slice" which gives you
some subset of those 2048 cores.
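
As a hedged sketch, requesting cores looks like this with the `tpu_cores` Trainer flag from this release:

```python
from pytorch_lightning import Trainer

# train on 8 TPU cores (a single TPU device)
trainer = Trainer(tpu_cores=8)
```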

----
+----------------

How to access TPUs
------------------
2 changes: 1 addition & 1 deletion pytorch_lightning/__init__.py
@@ -1,6 +1,6 @@
"""Root package info."""

-__version__ = '0.8.0'
+__version__ = '0.8.1-dev'
__author__ = 'William Falcon et al.'
__author_email__ = '[email protected]'
__license__ = 'Apache-2.0'