This repository has been archived by the owner on Nov 21, 2022. It is now read-only.

PL has changed the parameters of Trainer #238

Closed
ElderWanng opened this issue Apr 10, 2022 · 4 comments · Fixed by #239
Labels
bug / fix Something isn't working help wanted Extra attention is needed

Comments

@ElderWanng

🐛 Bug

In the latest pytorch-lightning (1.6.x), the argument for the number of GPUs has changed to `devices`, but this project's requirement is pytorch-lightning>=1.4.0. So pip automatically installs 1.6.x, which conflicts with the config file.

@ElderWanng ElderWanng added bug / fix Something isn't working help wanted Extra attention is needed labels Apr 10, 2022
@Borda
Member

Borda commented Apr 10, 2022

@ElderWanng mind sending a PR? 🐰

@tanmoyio
Contributor

@ElderWanng if you are not working on it, I would like to make a PR.
cc @Borda

@Borda
Member

Borda commented Apr 14, 2022

if you are not working on it, I want to make a PR

that would be great!

@tanmoyio
Contributor

@Borda I am preparing the PR, but I have a few questions:

  1. `pl.Trainer` still accepts the `gpus` argument, so should I keep it alongside the `devices` argument? I have also seen some benchmark config files that contain `gpus`; should I change those?

  2. In the main README.md there are several commands that could be changed, e.g. from
    `python train.py task=nlp/language_modeling dataset=nlp/language_modeling/wikitext trainer.gpus=1 training.batch_size=8` to `python train.py task=nlp/language_modeling dataset=nlp/language_modeling/wikitext trainer.accelerator=gpu training.batch_size=8`. We don't need to manually set `devices` if we set the accelerator; correct me if I am wrong.

Let me know your opinion.
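
As an illustration of the mapping being discussed, here is a minimal sketch of a config shim that translates the legacy `gpus` key into the 1.6-style `accelerator`/`devices` pair. The function name and the plain-dict representation are my own for illustration; this is not part of the project, and the actual PR may handle the migration differently.

```python
def migrate_trainer_config(cfg: dict) -> dict:
    """Map a legacy ``gpus`` key to 1.6-style ``accelerator``/``devices``.

    Hypothetical helper for illustration only; not part of this project.
    """
    cfg = dict(cfg)  # shallow copy so the caller's config is untouched
    gpus = cfg.pop("gpus", None)
    if gpus:  # gpus=0 or gpus=None means CPU training, so add nothing
        cfg.setdefault("accelerator", "gpu")
        cfg.setdefault("devices", gpus)
    return cfg

# The old CLI override trainer.gpus=1 corresponds to:
migrate_trainer_config({"gpus": 1})  # -> {"accelerator": "gpu", "devices": 1}
```

Under this sketch, a config that only sets `accelerator=gpu` and leaves `devices` unset would fall back to PyTorch Lightning's own default device selection.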
