
Add support for KerasSpiking layers #182

Merged: 2 commits into master on Oct 4, 2020

Conversation

drasmuss (Member)

Also added support for TimeDistributed layers, since those appear in a lot of KerasSpiking models, and added some pointers to KerasSpiking in various places throughout the documentation.
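
For illustration, a minimal sketch of the kind of model this enables converting (the model is hypothetical, not taken from this PR; `temporal_model` is the new Converter option discussed in the review below):

```python
# Hypothetical example, not code from this PR: a fully temporal model mixing
# KerasSpiking and TimeDistributed layers, every layer having shape
# (batch_size, n_steps, ...).
import keras_spiking
import nengo_dl
import tensorflow as tf

inp = tf.keras.Input((10, 8))
x = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(32))(inp)
x = keras_spiking.SpikingActivation("relu")(x)
x = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(4))(x)
model = tf.keras.Model(inp, x)

# temporal_model=True tells the Converter that the model has a time dimension
converter = nengo_dl.Converter(model, temporal_model=True)
```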

@drasmuss force-pushed the keras-spiking-converter branch 2 times, most recently from 3e12381 to 140a158 on September 28, 2020 13:31
@drasmuss marked this pull request as ready for review on September 28, 2020 14:24
@drasmuss force-pushed the keras-spiking-converter branch from 6a93a68 to fe75a52 on September 29, 2020 14:01


```python
def test_time_distributed():
    inp = x = tf.keras.Input((10, 8))
```
Collaborator:

How much work would it be to also support None for the number of timesteps?

Collaborator:

Never mind, I realized this (seems to) work; I was just getting an exception in the second part of this test when I switched it.
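
(For reference, the variant being discussed just replaces the fixed number of timesteps with None in the time axis:)

```python
# variable number of timesteps: None in the first (non-batch) axis
inp = x = tf.keras.Input((None, 8))
```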

```
Set to True if the Keras model contains a temporal
dimension (e.g. the inputs and outputs of each layer have shape
``(batch_size, n_steps, ...)``). Note that all layers must be temporal; the
Converter cannot handle models with a mix of temporal and non-temporal layers.
```
Collaborator:

I'm trying to think of ways we could get around this for at least some of the simpler/more common cases. It's pretty common for models to collapse over time in some way near the end.

What if, rather than removing the first two dimensions for all layers, we only did that if rank > 2? Any layer that's already flattened would be treated as not having time. That would at least allow e.g. a dense layer at the end of the model that does the classification.
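
A rough sketch of that heuristic (hypothetical helper, not code from the PR, and retracted in the follow-up below):

```python
def strip_time(shape):
    # hypothetical sketch of the rank > 2 heuristic proposed above
    if len(shape) > 2:
        # temporal layer: drop the batch and time axes
        return shape[2:]
    # already-flattened layer (e.g. a final classification Dense):
    # treat it as non-temporal and drop only the batch axis
    return shape[1:]
```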

Collaborator:

Never mind. While the shape trick is possible, there's no way to pool across time in Nengo, since the whole network has to run each timestep.

Probably the best way to do something equivalent is to keep time in all the Keras layers, but have a filter near the end of the network and a loss function that just uses the last timestep. We could automate this "use the last timestep in the loss function" behaviour if e.g. we detect a filter layer with return_sequences=False, but it's not as trivial as I thought (and probably somewhat fragile, too).
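
A sketch of that pattern (hypothetical, not code from this PR; assumes keras_spiking's Lowpass layer takes a time constant as its first argument):

```python
import keras_spiking
import tensorflow as tf

# every layer keeps the time axis, shape (batch_size, n_steps, ...)
inp = tf.keras.Input((10, 8))
x = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(4))(inp)
x = keras_spiking.SpikingActivation("relu")(x)
# filter near the end of the network to smooth the spikes over time
x = keras_spiking.Lowpass(0.1)(x)
model = tf.keras.Model(inp, x)

def last_timestep_loss(y_true, y_pred):
    # "use the last timestep in the loss function"
    return tf.keras.losses.sparse_categorical_crossentropy(
        y_true, y_pred[:, -1], from_logits=True
    )

model.compile(optimizer="adam", loss=last_timestep_loss)
```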

raise ValueError("Input shapes must be fully specified; got %s" % (shape,))
raise ValueError(
"Input shapes must be fully specified; got %s. For inputs with `None` "
"in the first axis, indicating a variable number of timesteps, "
Member Author:

It doesn't matter whether the shape is None or fully specified; they still need to set temporal_model=True.

Member Author:

Oh, never mind, I was parsing this wrong; this is just helping to give them a hint that they should set temporal_model=True. I might rephrase it to something like "If inputs contain None in the first axis to indicate a variable number of timesteps, ..." (I read the current phrasing as implying that if the first axis is None then it is a temporal model, which isn't necessarily true).
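
Something along those lines (a sketch of the suggested rewording; the tail about setting temporal_model=True is filled in from the surrounding discussion):

```python
raise ValueError(
    "Input shapes must be fully specified; got %s. If inputs contain `None` "
    "in the first axis to indicate a variable number of timesteps, set "
    "`temporal_model=True`." % (shape,)
)
```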

```python
with pytest.raises(
    TypeError, match="can only be converted when temporal_model=True"
):
    converter.Converter(tf.keras.Model(inp, x), temporal_model=False)
if timesteps is not None:
```
Member Author:

This error should be raised in both cases, I think.

Collaborator:

No, I think if timesteps is None, then it's the ValueError in ConvertInput that we were discussing above. That's why I added that comment about setting temporal_model=True there, since if timesteps is None and temporal_model=False, that's the first error the user sees. We could test that other error here, too, though.

Member Author:

Ah yeah, that's true; I guess there isn't an obvious way to hit this in the timesteps=None case. I might catch the other error here, just to make it clear that it's an error in both cases (so no one reading this test thinks it should work in the other case, like my mistake).
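
A sketch of what covering both cases might look like (hypothetical, adapted from the test excerpt above; inp, x, and timesteps come from the surrounding test):

```python
import pytest
import tensorflow as tf
from nengo_dl import converter

if timesteps is None:
    # with a variable number of timesteps, the input-shape check in
    # ConvertInput fires first
    with pytest.raises(ValueError, match="Input shapes must be fully specified"):
        converter.Converter(tf.keras.Model(inp, x), temporal_model=False)
else:
    with pytest.raises(
        TypeError, match="can only be converted when temporal_model=True"
    ):
        converter.Converter(tf.keras.Model(inp, x), temporal_model=False)
```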

@hunse (Collaborator) commented Oct 2, 2020

Ok, made your suggested changes @drasmuss

@drasmuss (Member Author) commented Oct 2, 2020

Fixups look good

Also added support for TimeDistributed layers, and added some pointers to KerasSpiking in the documentation.
@hunse force-pushed the keras-spiking-converter branch from 22479f9 to d09e1cb on October 2, 2020 14:32
@hunse merged commit d09e1cb into master on Oct 4, 2020
@hunse deleted the keras-spiking-converter branch on October 4, 2020 18:55