
[WIP] Hydra Configuration PL #1

Closed
wants to merge 91 commits into from

Conversation

@anthonytec2 (Owner) commented Jun 20, 2020

PyTorch Lightning Hydra Configuration

TODO:
1) Finish top level application config for PL (done; a rough sketch of such a config is included below)
2) Fix Union types to Any (done)
3) Add logic in the example to pass the config into the respective parts (done)
4) Test out various configs to ensure everything works

Extras:
1) Fix callbacks input (done)
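
For item 1, a rough, hypothetical sketch of what a top-level structured config for the Trainer could look like; the class names, fields, and defaults here are illustrative assumptions, not the exact schema added by this PR:

```python
from dataclasses import dataclass, field
from typing import Any, List

import hydra
from hydra.core.config_store import ConfigStore


@dataclass
class TrainerConf:
    # Trainer arguments typed as Union in PL (e.g. gpus) are exposed as Any,
    # since Hydra/OmegaConf structured configs do not support Union types.
    gpus: Any = None
    max_epochs: int = 1000
    precision: int = 32
    callbacks: List[Any] = field(default_factory=list)


@dataclass
class Config:
    trainer: TrainerConf = field(default_factory=TrainerConf)


# Register the schema so `config` can be selected by name from hydra.main.
cs = ConfigStore.instance()
cs.store(name="config", node=Config)


@hydra.main(config_name="config")
def main(cfg: Config) -> None:
    # The example would read cfg.trainer here and pass its values into the
    # respective parts (e.g. the pl.Trainer constructor).
    print(cfg)


if __name__ == "__main__":
    main()
```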

williamFalcon and others added 30 commits June 17, 2020 22:45
* add tpu view

* add tpu view

* add tpu view

* add tpu view

* add tpu view
* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

* init the port using a seed that matches process id for ddp

Co-authored-by: Zhaofeng Wu <[email protected]>
* final clean for v0.8.0

* chlog

* chlog

* date

* rename stage

* date

* missing
* update docs

* update docs

* update docs

* update docs

* update docs

* update docs
* add iou function

* update stat scores

* add iou class

* add iou tests

* chlog

* Apply suggestions from code review

* tests

* docs

* Apply suggestions from code review

* docs

Co-authored-by: Jirka <[email protected]>
Co-authored-by: Jirka Borovec <[email protected]>
* Change PR template

* Update .github/PULL_REQUEST_TEMPLATE.md

* Apply suggestions from code review

Co-authored-by: Adrian Wälchli <[email protected]>

Co-authored-by: Jirka Borovec <[email protected]>
Co-authored-by: Adrian Wälchli <[email protected]>
…#2244)

* Fixed the load_from_checkpoint path detected as URL bug

* Fixed the load_from_checkpoint path detected as URL bug

* fixed Caps lock typo

* Added .absolute() to checkpoint path to force hard drive prefix in string
There was a typo in metrics description: RMSE titled metric was actually RMSLE and vice-versa. So I've fixed it, changing two characters.
* chlog

* docs

* ver++

* docs

* url

* docs

* readme

* docs ---
* added barrier

* blank line

* added barrier

* added barrier

* made fx public

Co-authored-by: Jirka Borovec <[email protected]>
* made fx public

* made fx public

* made fx public
* remove frame inspection on self.hparams

* remove frame inspection on self.hparams

* remove frame inspection on self.hparams

* remove frame inspection on self.hparams

* remove frame inspection on self.hparams

* remove frame inspection on self.hparams
* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers

* remove barriers
* miss

* miss

* chlog

* chlog
* fix missing arg

* fix missing arg

* fix missing arg

* fix missing arg

* fix missing arg

* fix missing arg

* fix missing arg
* cleaning

* docs

* docs

* types

* mixins

* mixins

* docs

* typo
…AI#2272)

* Attempt to add broken test

* use wandb logger

* Update test_amp.py

Co-authored-by: William Falcon <[email protected]>
* move backward

* refactor backward to remove 16 bit from user override

* refactor backward to remove 16 bit from user override

* Update pytorch_lightning/core/hooks.py

Co-authored-by: Jirka Borovec <[email protected]>

Co-authored-by: Jirka Borovec <[email protected]>
@anthonytec2 (Owner, Author) commented:

I just rebased on master and did not realize it would make a mess here... closing and reopening.

@anthonytec2 (Owner, Author) commented:

Reopened as #2.

@omry commented Jun 25, 2020

Next time:
You should be able to fast-forward, since you didn't change anything inside PL itself.
Don't merge, rebase.
Also, back up your branch first:

git co -b backup  # while on the feature branch (`co` being an alias for `checkout`)
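
A minimal sketch of the full workflow being suggested, assuming the main repository is configured as a remote named `upstream` and the feature branch is called `hydra-config` (both names are illustrative, not taken from this PR):

```bash
git checkout hydra-config        # start on the feature branch
git branch backup                # keep a backup ref of the current state
git fetch upstream               # fetch the latest upstream history
git rebase upstream/master       # replay the feature commits on top of master (rebase, don't merge)
git push --force-with-lease origin hydra-config   # a rebase rewrites history, so the PR branch needs a force push
```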

@anthonytec2 (Owner, Author) commented:

I fetched the upstream master and rebased on top of that. The branch history shows my commits on top of master, but this PR shows them as integrated into master.
