Improve Optimizer docs, update quickstart to use Optimizer #416

Conversation

justheuristic commented Nov 29, 2021
- deduplicate docs, fix typos, improve formatting
- switch quickstart.md to the new optimizer
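
For context, here is a minimal sketch of the new-style `hivemind.Optimizer` setup that the quickstart switches to, assembled from the snippet reviewed below; the toy `Linear` model, the fresh single-node DHT, and the argument values are illustrative placeholders, not the quickstart's actual values.

```python
import torch
import hivemind

# toy stand-in for whatever model the quickstart trains
model = torch.nn.Linear(16, 2)

# bootstrap a fresh DHT node; joining an existing run would pass initial_peers
dht = hivemind.DHT(start=True)

# new-style optimizer: wraps a regular torch optimizer factory and
# accumulates gradients across peers until target_batch_size is reached
opt = hivemind.Optimizer(
    dht=dht, run_id="run_42", params=model.parameters(),
    optimizer=lambda params: torch.optim.Adam(params),
    target_batch_size=4096, batch_size_per_step=4,
)
```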
Codecov Report

```
@@           Coverage Diff            @@
##           master     #416    +/-  ##
=========================================
+ Coverage   83.57%   83.67%   +0.09%
=========================================
  Files          77       77
  Lines        7788     7790       +2
=========================================
+ Hits         6509     6518       +9
+ Misses       1279     1272       -7
```
docs/modules/optim.rst (outdated)

```diff
@@ -26,9 +29,9 @@
 .. raw:: html

-   CollaborativeOptimizer is a legacy version of hivemind.Optimizer. **For new projects, please use hivemind.Optimizer.**
+   CollaborativeOptimizer is a legacy version of hivemind.Optimizer. <b>For new projects please use hivemind.Optimizer</b>.
    Currently, hivemind.Optimizer supports all the features of CollaborativeOptimizer and then some.
```
> supports all the features of CollaborativeOptimizer and then some.

This sentence seems unfinished.
I respectfully disagree [1]
[1] - https://www.urbandictionary.com/define.php?term=and%20then%20some
I think this phrase is not widely known enough to be used in the docs. Most readers just won't understand it :(
Co-authored-by: Alexander Borzunov <[email protected]>
…mizer-docs-review
```
>>> # alternative: opt = hivemind.Optimizer(dht, run_id="run_42", optimizer=torch.optim.Adam(model.parameters()))
>>> opt = hivemind.Optimizer(dht=dht, run_id="run_42", params=model.parameters(),
>>>                          optimizer=lambda params: torch.optim.Adam(params, **other_options),
>>>                          target_batch_size=4096, batch_size_per_step=4)
```
Let's use the same order of parameters in the hivemind.Optimizer definition, in both examples in the API docs, and in the quickstart. This makes it easier to compare the values of the same arguments across these examples, as well as to go through them and check whether you've specified everything you need in your own code.
fixed
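
To make the agreed-upon consistency concrete, here is a hedged sketch of one plausible shared argument order; the keyword set comes from the snippet above, while `other_options` and the surrounding setup are placeholders rather than the docs' actual values.

```python
import torch
import hivemind

model = torch.nn.Linear(16, 2)
dht = hivemind.DHT(start=True)
other_options = dict(lr=1e-3)  # placeholder for any extra Adam kwargs

# one fixed argument order, reused verbatim in the API docs and the quickstart,
# so readers can diff the two examples line by line
opt = hivemind.Optimizer(
    dht=dht,                    # a running hivemind DHT instance
    run_id="run_42",            # unique identifier of this training run
    params=model.parameters(),  # parameters to optimize
    optimizer=lambda params: torch.optim.Adam(params, **other_options),
    target_batch_size=4096,     # global batch size across all peers
    batch_size_per_step=4,      # samples each local opt.step() contributes
)
```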
quickstart.md (excerpt under review):

```diff
 Congrats, you've just started a pocket-sized experiment with decentralized deep learning!

-However, this is just the bare minimum of what hivemind can do. In [this example](https://github.com/learning-at-home/hivemind/tree/master/examples/albert),
+However, this is only the basics of what hivemind can do. In [this example](https://github.com/learning-at-home/hivemind/tree/master/examples/albert),
 we show how to use a more advanced version of DecentralizedOptimizer to collaboratively train a large Transformer over the internet.
```
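
As a usage note, the "pocket-sized experiment" above amounts to wrapping this optimizer in an ordinary training loop; the sketch below assumes a toy model and random data in place of the quickstart's actual dataset.

```python
import torch
import torch.nn.functional as F
import hivemind

model = torch.nn.Linear(16, 2)
dht = hivemind.DHT(start=True)
opt = hivemind.Optimizer(
    dht=dht, run_id="run_42", params=model.parameters(),
    optimizer=lambda params: torch.optim.Adam(params),
    target_batch_size=4096, batch_size_per_step=4,
)

# ordinary torch training loop; hivemind averages with peers behind the scenes
for step in range(100):
    x, y = torch.randn(4, 16), torch.randint(0, 2, (4,))
    opt.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    opt.step()  # accumulates toward target_batch_size before a global update
```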
Please replace DecentralizedOptimizer -> hivemind.Optimizer here and in L186.
fixed
quickstart.md (excerpt under review, continuing the same passage):

```
If you want to learn more about each individual component,
```
Please replace:
- (Li et al. 2020) -> Li et al. (2020)
- (Ryabinin et al. 2021) -> Ryabinin et al. (2021)