Add links to "Example Use Cases" (#497)
I think some people are interested in the "Example Use Cases" section because they'd like to know what has already been built with hivemind, while others would like to take a look at the code if they've already started using hivemind and want some code examples.

Currently, the sahajBERT link leads to the sahajBERT repo, which doesn't say much about the project itself. Conversely, the CALM and "Training Transformers Together" links lead to project pages from which it's hard to find the repos with the code.

This PR adds more useful links to each of the projects.

(cherry picked from commit 7a7c93a)
borzunov authored and mryab committed Sep 13, 2022
1 parent af6320a commit 6334a7b
Showing 1 changed file with 5 additions and 5 deletions.
README.md: 5 additions & 5 deletions
@@ -26,16 +26,16 @@ large model on hundreds of computers from different universities, companies, and
 To learn more about the ideas behind this library,
 see the [full list](https://github.com/learning-at-home/hivemind/tree/refer-to-discord-in-docs#citation) of our papers below.
 
-## Example Applications and Use Cases
+## Example Use Cases
 
 This section lists projects that leverage hivemind for decentralized training.
 If you have successfully trained a model or created a downstream repository with the help of our library,
 feel free to submit a pull request that adds your project to this list.
 
-* [sahajBERT](https://github.com/tanmoyio/sahajbert) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
-* [CALM](https://github.com/NCAI-Research/CALM) (Collaborative Arabic Language Model) — a masked language model trained on a combination of Arabic datasets.
-* [Training Transformers Together](https://training-transformers-together.github.io/) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
-* [HivemindStrategy](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html) in PyTorch Lightning allows adapting your existing pipelines to training over a slow network with unreliable peers.
+* **sahajBERT** ([blog post](https://huggingface.co/blog/collaborative-training), [code](https://github.com/tanmoyio/sahajbert)) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
+* **CALM** ([webpage](https://huggingface.co/CALM), [code](https://github.com/NCAI-Research/CALM)) — a masked language model trained on a combination of Arabic datasets.
+* **Training Transformers Together** ([webpage](https://training-transformers-together.github.io/), [code](https://github.com/learning-at-home/dalle-hivemind)) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
+* **HivemindStrategy** ([docs](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html)) in PyTorch Lightning allows adapting your existing pipelines to training over a slow network with unreliable peers.
 
 ## Installation
 
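For readers following the HivemindStrategy link above, a minimal sketch of what "adapting your existing pipelines" looks like may help. It follows the linked PyTorch Lightning docs; the tiny model and the `target_batch_size=8192` value are illustrative assumptions, not part of this commit:

```python
# Minimal sketch (not from this commit): wrapping an ordinary
# PyTorch Lightning pipeline with HivemindStrategy, per the docs
# linked above. The model and target_batch_size are placeholders.
import torch
import pytorch_lightning as pl
from pytorch_lightning.strategies import HivemindStrategy


class TinyModule(pl.LightningModule):
    """Illustrative placeholder model."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-3)


# Peers accumulate gradients until the swarm collectively reaches
# target_batch_size samples, then average and step together, which
# tolerates slow networks and peers joining or leaving mid-run.
trainer = pl.Trainer(
    strategy=HivemindStrategy(target_batch_size=8192),
    accelerator="gpu",
    devices=1,
)
```

Per the same docs, the first peer logs an `initial_peers` address on startup; other machines pass it to `HivemindStrategy(target_batch_size=8192, initial_peers=...)` to join the same collaborative run.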
