
Add missed loaders for the ReLU and ELU activation layers #78

Closed
5 tasks
zaleslaw opened this issue May 31, 2021 · 8 comments · Fixed by #128
Assignees
Labels
good first issue Good for newcomers
Milestone

Comments

@zaleslaw
Collaborator

Each layer should have an implementation for export to / import from a JSON configuration (Keras-compatible format); see ModelLoader.kt and ModelSaver.kt.

The saving functions for the ReLU and ELU activation layers are missing.

As an integration example, add a convolutional neural network to the examples package (in the CNN package, for example). It could be based on an improved LeNet model; train it on MNIST or Fashion-MNIST and add it as an integration test to the "examples" module tests.

  • add saving function for ReLU activation layer
  • add saving function for ELU activation layer
  • add an example
  • convert this example to the integration test
  • train the network and tune it to achieve accuracy above 70%
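For newcomers picking this up, the Keras config shape these savers need to emit is small. Below is a self-contained sketch of what a saver/loader pair for these layers might look like; `LayerConfig`, `saveElu`, `saveRelu`, and `loadElu` are illustrative names, not KotlinDL's actual API (the real implementations belong in ModelSaver.kt / ModelLoader.kt). The field names `max_value`, `negative_slope`, `threshold`, and `alpha` follow Keras's serialized layer configs.

```kotlin
// Illustrative stand-in for the per-layer config entry in a Keras JSON model file.
data class LayerConfig(
    val className: String,
    val name: String,
    val params: Map<String, Double>
)

// Serialize an ELU layer; Keras stores a single "alpha" parameter.
fun saveElu(name: String, alpha: Double): LayerConfig =
    LayerConfig(className = "ELU", name = name, params = mapOf("alpha" to alpha))

// Serialize a ReLU layer; Keras stores max_value / negative_slope / threshold,
// omitting max_value when it is unset (null).
fun saveRelu(
    name: String,
    maxValue: Double?,
    negativeSlope: Double,
    threshold: Double
): LayerConfig {
    val params = mutableMapOf(
        "negative_slope" to negativeSlope,
        "threshold" to threshold
    )
    if (maxValue != null) params["max_value"] = maxValue
    return LayerConfig(className = "ReLU", name = name, params = params)
}

// Deserialize an ELU config, falling back to the Keras default alpha = 1.0.
fun loadElu(config: LayerConfig): Double =
    config.params["alpha"] ?: 1.0
```

The round trip is the important property: whatever the saver writes, the loader must read back with the same Keras defaults applied for any omitted fields.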
@zaleslaw zaleslaw added the good first issue Good for newcomers label May 31, 2021
@zaleslaw zaleslaw added this to the 0.3 milestone May 31, 2021
@zaleslaw zaleslaw changed the title Add missed loader for the ReLU and ELU activation layers Add missed loaders for the ReLU and ELU activation layers May 31, 2021
@avan1235
Contributor

I can take this if there are no volunteers for this issue. I added the ELU layer, I don't see any other work for me right now, and I'd like to try exporting a trained model in KotlinDL to see how it works 😉

@zaleslaw
Collaborator Author

I suppose you are mature enough for second-level tasks; let's leave this one for newcomers. It's easy to add, but it requires understanding some important things.
If it is not closed by the end of July, I'll notify you or fix it myself.

kokorins added a commit to kokorins/KotlinDL that referenced this issue Jun 14, 2021
- missing activation layers loaders added
- two distinct examples with save/load added to examples folder (trying to reach 0.7 accuracy)
kokorins added a commit to kokorins/KotlinDL that referenced this issue Jun 14, 2021
- missing activation layers loaders added
- two distinct examples with save/load added to examples folder (trying to reach 0.7 accuracy)
@dosier
Contributor

dosier commented Jun 14, 2021

Nevermind my commits, we happened to work on it at the same time :D

@mkaze
Contributor

mkaze commented Jun 15, 2021

@kokorins @dosier Just a tip: it's much better to first leave a comment on the issue page mentioning that you are working on a PR for an existing issue (or, if the issue does not already exist, create it first and let the maintainers know that you are interested in working on it). That way you can prevent duplicated effort and avoid working on an issue that another person is tackling at the same time. Further, for some issues you may need to confirm with the maintainers before starting; this prevents working on something that is not really an issue, or at least is not a priority in the near future. Currently, the maintainer of this project is @zaleslaw.

@zaleslaw
Collaborator Author

Yeah, as @dosier said, we don't have contribution guidelines yet, but it looks like it's time to write something about that.
Sorry for the situation, @dosier

@zaleslaw
Collaborator Author

@kokorins could you please write a comment here so I can assign the ticket to you?

@zaleslaw zaleslaw linked a pull request Jun 15, 2021 that will close this issue
@kokorins
Contributor

I've just read the code a bit and decided to contribute. As there was no activity until the pull request, I wasn't expecting any parallel work.

Can someone share the code style for the project (an IntelliJ XML file or something along those lines)?

kokorins added a commit to kokorins/KotlinDL that referenced this issue Jun 15, 2021
- missing activation layers loaders added
- two distinct examples with save/load added to examples folder (trying to reach 0.7 accuracy)
@zaleslaw
Collaborator Author

zaleslaw commented Jun 16, 2021

It's OK @kokorins, there is no code style/contribution guideline in the project yet (we are working on it).

A few days ago I committed a gradle.properties file with the setting kotlin.code.style=official, so for now it's enough to apply Ctrl+Alt+L to the file and remove double blank lines.
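For reference, that setting is a single entry in the project's gradle.properties, which tells IntelliJ IDEA and the Kotlin tooling to use the official Kotlin coding conventions:

```properties
# gradle.properties
kotlin.code.style=official
```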

kokorins added a commit to kokorins/KotlinDL that referenced this issue Jun 17, 2021
- missing activation layers loaders added
- two distinct examples with save/load added to examples folder (trying to reach 0.7 accuracy)
@zaleslaw zaleslaw linked a pull request Jun 21, 2021 that will close this issue
zaleslaw referenced this issue Jun 21, 2021
* Added missing saving functions for ReLU and ELU activation layers (JetBrains#78)

* Reverted changes to the imports

* Added "Model name: $name" line (if `name` is not null) to summary method #120

* Added model name validation in SequentialCompilationTest.summary #120

* Added "type" part to summary line for Sequential model #120

* Wrote simple test for Functional model summary (#120)
- based on toyresnet
- also offers a base for other Functional model tests
zaleslaw pushed a commit that referenced this issue Jun 22, 2021
* Add loaders for ELU, RELU activation layers (#78)

- missing activation layers loaders added
- two distinct examples with save/load added to examples folder (trying to reach 0.7 accuracy)

* Add parameter trainability to ELU, ReLU savers
zaleslaw referenced this issue Jun 24, 2021
* Added missing saving functions for ReLU and ELU activation layers (JetBrains#78)

* Reverted changes to the imports

* Added RepeatVector layer #123

* Added serialisation support for RepeatVector layer #123

* Wrote test for RepeatVector #123

* Made changes requested by avan (see desc.)
- added missing require check in init block of RepeatVector
- updated docs
- reformatted code
- housekeeping

* Removed redundant Obs.repeat ext fun

* Made changes requested by avan (see desc.)

- change require message in computeOutputShape
- used inputShape.size(...) for creating shape
- removed author tag

* Used `=` instead of `return` block, added TODO

* Implemented changes requested by zaleslaw

- save trainability status
- renamed tests

* Added test for negative `n` #123

* Added missing newline