
Symmetry-aware flip augmentation #455

Merged: 9 commits into develop from talmo/symmetric_flip on Feb 3, 2021

Conversation

talmo (Collaborator)
@talmo talmo commented Jan 10, 2021

This PR implements horizontal flipping augmentation. This is tricky in our setting because, in addition to mirroring the coordinates, we have to swap left/right symmetric nodes (e.g., a left and a right limb) so that instances remain anatomically correct after the flip.

Addresses the long-standing #228.

  • Implement base op
  • Transformer for pipelining
  • Expose to config and GUI
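
The core idea above (mirror x-coordinates, then swap symmetric node pairs) can be sketched as follows. This is a minimal illustration, not SLEAP's actual implementation; the function name, argument names, and `(n_instances, n_nodes, 2)` layout are assumptions for the example.

```python
import numpy as np

def flip_instances_lr(points, img_width, symmetric_pairs):
    """Horizontally flip keypoints and swap left/right symmetric nodes.

    points: (n_instances, n_nodes, 2) array of (x, y) coordinates.
    img_width: width of the image in pixels.
    symmetric_pairs: list of (i, j) node index pairs to swap after flipping.
    """
    flipped = points.copy()
    # Mirror x-coordinates about the vertical midline of the image.
    flipped[..., 0] = (img_width - 1) - flipped[..., 0]
    # Swap each symmetric pair so that, e.g., the "left" node still refers
    # to the anatomically left part after the mirror.
    for i, j in symmetric_pairs:
        flipped[:, [i, j], :] = flipped[:, [j, i], :]
    return flipped
```

Without the pair swap, a flipped image would have its "left" keypoints sitting on the animal's right side, which is exactly the subtlety this PR addresses.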

@talmo talmo requested a review from arie-matsliah January 22, 2021 19:16
@talmo talmo marked this pull request as ready for review January 22, 2021 19:16
@talmo talmo merged commit bd31807 into develop Feb 3, 2021
@talmo talmo mentioned this pull request Feb 4, 2021
@talmo talmo deleted the talmo/symmetric_flip branch February 4, 2021 07:35
talmo added a commit that referenced this pull request Mar 24, 2021
* Add track indices to instance cropper

* Add class vector generator

* Split class vectors correctly in instance cropper

* Move head output layer construction to heads module
- Heads now subclass a base Head class
- Naming doesn't include _0 anymore since we don't have any multi-output
  models for now.
- Better input validation in Model.from_config constructor
- Add loss weight to all heads in config
- Test coverage for heads and (minimally for) model

* Add topdown config, head and model
- Rename multiclass to multiclass_bottomup

* Add trainer

* Data pipeline

* Apply black to 'sleap' and 'tests' (#465)

Co-authored-by: Arie Matsliah <[email protected]>

* Fix model creation and add pooling param to head

* Symmetry-aware flip augmentation (#455)

* Implement symmetry-aware instance reflection

* Fix symmetries sometimes not being returned uniquely

* Add fancier indexing to instances

* Add random flipping transformer

* Fix failing linux test
- Make sure indices are all cast to int32

* Add vertical flip

* Add flip augmentation to config, GUI and pipeline builders

* Update profiles with default fields

Co-authored-by: ariematsliah-princeton <[email protected]>

* Multi-size videos in data pipelines (#440)

Add support for variable size videos within the same dataset by matching their size with padding or resizing

Co-authored-by: Arie Matsliah <[email protected]>
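
The multi-size video support described in that commit (matching frame sizes by padding or resizing) can be sketched roughly as below. This is an illustrative example only, not SLEAP's implementation; the function name, the bottom/right zero-padding convention, and the nearest-neighbor resize are all assumptions made for the sketch.

```python
import numpy as np

def match_size(image, target_h, target_w, mode="pad"):
    """Bring a frame to a common size by zero-padding or nearest-neighbor resizing."""
    h, w = image.shape[:2]
    if mode == "pad":
        # Pad on the bottom/right with zeros so coordinates are unchanged.
        out = np.zeros((target_h, target_w) + image.shape[2:], dtype=image.dtype)
        out[:h, :w] = image
        return out
    # Nearest-neighbor resize via integer index mapping (no external deps).
    rows = np.arange(target_h) * h // target_h
    cols = np.arange(target_w) * w // target_w
    return image[rows][:, cols]
```

Padding preserves keypoint coordinates as-is, while resizing requires scaling them by the same factors, which is why a pipeline typically exposes both options.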

* Type check + Lint in CI (#470)

* Try lint and typecheck in CI workflow

* update

* nit

* continue on MyPy errors

* test

* correct

* correct

* correct

Co-authored-by: Arie Matsliah <[email protected]>

* Rename predictors for consistency with inference layers
- TopdownPredictor -> TopDownPredictor
- BottomupPredictor -> BottomUpPredictor

* Create PULL_REQUEST_TEMPLATE.md

* Update authors list (#471)

Co-authored-by: Arie Matsliah <[email protected]>

* Add CLA (#473)

* Add CLA

* update links

Co-authored-by: Arie Matsliah <[email protected]>

* Update PULL_REQUEST_TEMPLATE.md

* Miscellaneous QOL (#467)

Pre-1.1.0 update features (changelist in #467)

* Bump pre-release version

* Add back load_model that got lost in the merge
- Add detection of bottomup and topdown multi-class model loading

* Fix more missing things post-merge

* Fix lint

* Fix training from config

* Add inference

* Tweak describe tensor to accept nested tuples/dicts

* Lint

* Fix test

* Lint

* Fix load video dataset arg

* Fix inference

* Fix evals

* Add BU MC to evals

* Remove batch norm from TD MC head

* Add option to disable batch norm in pretrained models

* Add track matching when merging labels

* Don't error when training finishes with no inference target

Co-authored-by: ariematsliah-princeton <[email protected]>
Co-authored-by: Arie Matsliah <[email protected]>