
WIP: adds ltc layer and its example #536

Open
wants to merge 1 commit into base: master
Conversation

manyfeatures

This PR adds a Liquid Time-Constant (LTC) layer #509.

The original repo1 and repo2 contain code consistent with the biologically inspired NCP architecture.


The current implementation differs from them and corresponds to the plain description from the paper.

To ensure stability, the output of the network is clipped to the range [-1, 1].
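
For reference, a minimal sketch of what a single clipped update could look like, assuming an explicit Euler step and a network f acting on the concatenated state and input (the function name and layout are illustrative assumptions, not this PR's code):

# One LTC state update following the paper's dynamics
# dx/dt = -(1/τ + f([x; I], θ)) .* x .+ f([x; I], θ) .* A, with the output clipped.
function ltc_step(f, x, I, τ, A, Δt)
    fx = f(vcat(x, I))                        # small neural network on state and input
    dx = -(1f0 ./ τ .+ fx) .* x .+ fx .* A    # LTC dynamics
    clamp.(x .+ Δt .* dx, -1f0, 1f0)          # explicit Euler step, clipped to [-1, 1] for stability
end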

@ChrisRackauckas
Member

Looks great, but I won't be able to review until tomorrow.

@Vaibhavdixit02
Member

A couple of points; I am still going through it and will get back with more informed feedback later.

  1. I can see that in the Python repo they do the evolution of the ODE with a loop. I am wondering if we can instead use solve for it by defining the diff eq mentioned in eq. 1 directly. My interpretation is that the LTC should look similar to the NeuralODE and be something like
struct LTC{M,P,RE,T,TA,AB,A,K} <: NeuralDELayer
    model::M
    p::P
    re::RE
    tspan::T
    τ::TA
    A::AB
    args::A
    kwargs::K
end

where the user gives us the NN as the model and we internally create the ODEProblem with it as (n is the LTC layer below; a fuller sketch follows after these points)

dudt_(u,p,t) = -(1/n.τ + n.re(p)(u)) * u +  n.re(p)(u)*n.A
ff = ODEFunction{false}(dudt_,tgrad=basic_tgrad)
prob = ODEProblem{false}(ff,x,getfield(n,:tspan),p)
  2. The current example you added doesn't seem to show any benefit of using the LTC and actually gives a much higher loss for the same number of epochs (the 400 you used) compared to just a dense layer:
## LTC
epoch = 400
loss_(data_x, data_y) = 0.0020240898579002888

## changing m to m = Chain(Dense(2,32, tanh), Dense(32,1,x->x)) in the example
epoch = 400
loss_(data_x, data_y) = 1.0341962f-5
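
Returning to point 1, a rough sketch of how the forward pass could then be wired up, modeled on DiffEqFlux's NeuralODE; the broadcasting and the sensealg choice here are assumptions, not a settled interface:

# NeuralODE-style forward pass for the LTC layer sketched above; basic_tgrad is the
# zero time-gradient helper used by the other layers, and the adjoint setup mirrors NeuralODE.
function (n::LTC)(x, p = n.p)
    dudt_(u, p, t) = -(1 ./ n.τ .+ n.re(p)(u)) .* u .+ n.re(p)(u) .* n.A   # eq. 1 dynamics
    ff = ODEFunction{false}(dudt_, tgrad = basic_tgrad)
    prob = ODEProblem{false}(ff, x, getfield(n, :tspan), p)
    solve(prob, n.args...; sensealg = InterpolatingAdjoint(autojacvec = ZygoteVJP()), n.kwargs...)
end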

I think we can use this loss comparison as a test, but for an example we would want to recreate one of the experiments from the paper, probably MNIST, since we have some existing code for it that should help get it up and running.
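
For the test part, something along these lines might work (a rough sketch; the threshold is a placeholder picked from the run reported above):

using Test

# Sketch of a regression test around the example: train for 400 epochs as above,
# then check the final loss against a loose placeholder bound.
final_loss = loss_(data_x, data_y)   # loss_, data_x, data_y as defined in the example
@test final_loss < 5e-3              # placeholder; the LTC run above reported ≈ 2.0e-3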

@ChrisRackauckas
Member

Agree 100% with @Vaibhavdixit02's points.

@Vaibhavdixit02
Member

@manyfeatures just checking if this is something you are still working on?

@manyfeatures
Author

@Vaibhavdixit02 Yep, I'll try to rework the PR in a couple of days.

@Vaibhavdixit02
Member

Great 👍

@manyfeatures
Author

manyfeatures commented May 21, 2021

I've got a problem: I can't limit the hidden state amplitude if I transform the task into a NeuralODE layer. I'll describe the case on Discourse.

@Vaibhavdixit02
Member

@manyfeatures you would also want to take a look at https://github.com/lungd/LTC.jl btw

@manyfeatures
Author

manyfeatures commented May 21, 2021

I created the post with the code and will also examine it later.

@manyfeatures manyfeatures changed the title adds ltc layer and its example WIP:adds ltc layer and its example May 31, 2021
@manyfeatures manyfeatures changed the title WIP:adds ltc layer and its example WIP: adds ltc layer and its example May 31, 2021