WIP: adds ltc layer and its example #536
Conversation
Looks great, but I won't be able to review until tomorrow.
A couple of points; I am still going through it and will get back with more informed feedback later.
where the user gives you the NN as the
I think we can use this as the test, but for an example we would want to recreate one of the experiments from the paper, probably MNIST, since we have some existing code for it that should help get it up and running.
Agree 100% with @Vaibhavdixit02's points.
@manyfeatures just checking if this is something you are still working on?
@Vaibhavdixit02 Yep, I'll try to rework the PR in a couple of days.
Great 👍
I've run into a problem: I can't limit the hidden state amplitude if I transform the task into a NeuralODE layer. I'll describe the case on Discourse.
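For context, one common way to keep the hidden state bounded in a NeuralODE-style right-hand side is to use leaky dynamics with a saturating drive, so no hard clipping of the state is needed. A minimal sketch, assuming this approach; the network, sizes, and `dudt` are illustrative, not code from this PR:

```julia
using OrdinaryDiffEq, Flux

# Leaky dynamics with a saturating drive: du/dt = -u + tanh.(model(u)).
# On the box [-1, 1]^n the flow points inward (at u_i = 1 we get
# du_i/dt = -1 + tanh(...) <= 0), so the hidden state stays bounded.
model = Chain(Dense(2, 16, tanh), Dense(16, 2))  # illustrative network
dudt(u, p, t) = -u .+ tanh.(model(u))

prob = ODEProblem(dudt, Float32[0.5, -0.5], (0.0f0, 1.0f0))
sol = solve(prob, Tsit5())
```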
@manyfeatures you would also want to take a look at https://github.com/lungd/LTC.jl btw
I created the post with the code and will also examine it later.
This PR adds a Liquid Time-Constant (LTC) layer, addressing #509.
The original repo1 and repo2 contain code consistent with the biologically inspired NCP (Neural Circuit Policies) architecture.
The current implementation differs from them and follows the plain description in the paper.
To ensure stability, the output of the network is clipped to the range [-1, 1].
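For reference, a minimal sketch of the LTC dynamics as plainly described in the paper, together with the output clipping mentioned above. The names `f`, `τ`, and `A` and all sizes are illustrative assumptions, not the PR's actual API:

```julia
using OrdinaryDiffEq, Flux

# LTC dynamics from the paper:
#   dx/dt = -(1/τ + f([x; I])) .* x + f([x; I]) .* A
# where f is a small neural network and I is a (here constant) input.
n_hidden, n_in = 4, 2
f = Chain(Dense(n_hidden + n_in, n_hidden, tanh))  # illustrative network
τ = ones(Float32, n_hidden)  # base time constants (illustrative values)
A = ones(Float32, n_hidden)  # bias vector (illustrative values)

function ltc!(dx, x, I, t)
    fx = f(vcat(x, I))
    @. dx = -(1.0f0 / τ + fx) * x + fx * A
end

x0 = zeros(Float32, n_hidden)
I = rand(Float32, n_in)
sol = solve(ODEProblem(ltc!, x0, (0.0f0, 1.0f0), I), Tsit5())

# Clip the network output to [-1, 1] for stability, as the PR does
y = clamp.(sol[end], -1.0f0, 1.0f0)
```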