
Add CLA #473

Merged

merged 2 commits into develop from arie/cla on Feb 5, 2021

Conversation

arie-matsliah
Contributor

@arie-matsliah arie-matsliah commented Feb 5, 2021

Your checklist for this pull request

  • Review the guidelines for contributing to this repository
  • Read and sign the CLA and add yourself to the authors list
  • Make sure you are making a pull request against the develop branch (not master), and that your branch was started off develop
  • Add tests that prove your fix is effective or that your feature works
  • Add necessary documentation (if appropriate)

Types of changes

  • Bugfix
  • New feature
  • Refactor / Code style update (no logical changes)
  • Build / CI changes
  • Documentation Update
  • Other (explain)

Description

Adds the CLA document.

Does this address any currently open issues?

No

Thank you for contributing to SLEAP!

❤️

@arie-matsliah arie-matsliah merged commit 2082384 into develop Feb 5, 2021
@arie-matsliah arie-matsliah deleted the arie/cla branch February 10, 2021 20:57
talmo added a commit that referenced this pull request Mar 24, 2021
* Add track indices to instance cropper

* Add class vector generator

* Split class vectors correctly in instance cropper
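A "class vector" here can be pictured as a one-hot track-identity vector attached to each cropped instance, for training multi-class (ID) models. A minimal sketch, assuming one-hot encoding; the function name and shapes are illustrative, not SLEAP's actual API:

```python
import numpy as np

def make_class_vectors(track_inds, n_tracks):
    """Build one-hot track-identity vectors for cropped instances.

    track_inds: (n_instances,) integer track indices.
    Returns an (n_instances, n_tracks) float32 one-hot matrix.
    """
    vectors = np.zeros((len(track_inds), n_tracks), dtype=np.float32)
    # Set a 1 at each instance's track index column.
    vectors[np.arange(len(track_inds)), track_inds] = 1.0
    return vectors
```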

* Move head output layer construction to heads module
- Heads now subclass a base Head class
- Naming doesn't include _0 anymore since we don't have any multi-output
  models for now.
- Better input validation in Model.from_config constructor
- Add loss weight to all heads in config
- Test coverage for heads and (minimally for) model

* Add topdown config, head and model
- Rename multiclass to multiclass_bottomup

* Add trainer

* Data pipeline

* Apply black to 'sleap' and 'tests' (#465)

Co-authored-by: Arie Matsliah <[email protected]>

* Fix model creation and add pooling param to head

* Symmetry-aware flip augmentation (#455)

* Implement symmetry-aware instance reflection

* Fix symmetries sometimes not being returned uniquely

* Add fancier indexing to instances

* Add random flipping transformer

* Fix failing linux test
- Make sure indices are all cast to int32

* Add vertical flip

* Add flip augmentation to config, GUI and pipeline builders

* Update profiles with default fields

Co-authored-by: ariematsliah-princeton <[email protected]>
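The core idea of symmetry-aware flipping is that simply mirroring keypoint coordinates is not enough: symmetric node pairs (e.g. left and right limbs) must also be swapped so labels stay semantically correct. A minimal sketch of the horizontal case, assuming a NumPy points array; names and argument conventions are illustrative, not SLEAP's actual API:

```python
import numpy as np

def flip_instances_horizontally(points, image_width, symmetric_pairs):
    """Mirror keypoints around the vertical image midline, swapping
    symmetric node pairs so left/right semantics are preserved.

    points: (n_instances, n_nodes, 2) array of (x, y) coordinates.
    symmetric_pairs: list of (i, j) node index pairs that mirror each other.
    """
    flipped = points.copy()
    # Mirror the x coordinate across the image width.
    flipped[..., 0] = (image_width - 1) - flipped[..., 0]
    # Swap each symmetric pair of nodes (e.g. left wing <-> right wing).
    for i, j in symmetric_pairs:
        flipped[:, [i, j], :] = flipped[:, [j, i], :]
    return flipped
```

Vertical flipping is analogous, mirroring y and using the image height.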

* Multi-size videos in data pipelines (#440)

Add support for variable size videos within the same dataset by matching their size with padding or resizing

Co-authored-by: Arie Matsliah <[email protected]>
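The padding strategy mentioned above amounts to zero-padding each frame up to one common size so frames from differently sized videos can be batched together (resizing to a common size is the alternative). A minimal sketch, assuming top-left-aligned zero padding; the function name is an assumption, not SLEAP's actual API:

```python
import numpy as np

def pad_to_common_size(frames, target_height, target_width):
    """Zero-pad variable-size frames to one common size for batching.

    frames: list of (H, W, C) arrays with H <= target_height and
    W <= target_width. Returns a (N, target_height, target_width, C) batch.
    """
    channels = frames[0].shape[-1]
    batch = np.zeros(
        (len(frames), target_height, target_width, channels),
        dtype=frames[0].dtype,
    )
    for k, frame in enumerate(frames):
        h, w = frame.shape[:2]
        # Place the frame in the top-left corner; the rest stays zero.
        batch[k, :h, :w] = frame
    return batch
```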

* Type check + Lint in CI (#470)

* Try lint and typecheck in CI workflow

* update

* nit

* continue on MyPy errors

* test

* correct

* correct

* correct

Co-authored-by: Arie Matsliah <[email protected]>

* Rename predictors for consistency with inference layers
- TopdownPredictor -> TopDownPredictor
- BottomupPredictor -> BottomUpPredictor

* Create PULL_REQUEST_TEMPLATE.md

* Update authors list (#471)

Co-authored-by: Arie Matsliah <[email protected]>

* Add CLA (#473)

* Add CLA

* update links

Co-authored-by: Arie Matsliah <[email protected]>

* Update PULL_REQUEST_TEMPLATE.md

* Miscellaneous QOL (#467)

Pre-1.1.0 update features (changelist in #467)

* Bump pre-release version

* Add back load_model that got lost in the merge
- Add detection of bottomup and topdown multi-class model loading

* Fix more missing things post-merge

* Fix lint

* Fix training from config

* Add inference

* Tweak describe tensor to accept nested tuples/dicts

* Lint

* Fix test

* Lint

* Fix load video dataset arg

* Fix inference

* Fix evals

* Add BU MC to evals

* Remove batch norm from TD MC head

* Add option to disable batch norm in pretrained models

* Add track matching when merging labels

* Don't error when training finishes with no inference target

Co-authored-by: ariematsliah-princeton <[email protected]>
Co-authored-by: Arie Matsliah <[email protected]>