Not all thnets-supported layers have been implemented
Is the LSTM layer supported? I used thexport on my model, and it seems the LSTM with an embedding layer gets decomposed into an extremely complicated structure with more than 7000 functions, which even exceeds Python's default maximum recursion depth (the length of the input series is 400).
Some of them:
<torch.autograd._functions.basic_ops.Add object at 0x10fe1c960>
<torch.autograd._functions.basic_ops.Mul object at 0x10fe1e5c0>
<torch.autograd._functions.pointwise.Sigmoid object at 0x10fe1e138>
<torch.autograd._functions.tensor.Chunk object at 0x10fe1e050>
<torch.autograd._functions.pointwise.Tanh object at 0x10fe1e308>
<torch.autograd._functions.tensor.Chunk object at 0x10fe1e050>
<torch.autograd._functions.tensor.Concat object at 0x10ee39308>
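For context, this kind of decomposition is expected when exporting through the autograd graph: the LSTM is recorded as its elementary operations rather than as a single node. The sketch below (hypothetical names, standard LSTM equations, not thexport's actual code) shows where the Add/Mul/Sigmoid/Tanh/Chunk nodes come from.

```python
import torch

def lstm_cell(x, h_prev, c_prev, w_ih, w_hh, b):
    # One step of a standard LSTM written with the same elementary ops
    # that appear in the exported graph: Add, Chunk, Sigmoid, Tanh, Mul.
    gates = x @ w_ih.t() + h_prev @ w_hh.t() + b      # Add nodes
    i, f, g, o = gates.chunk(4, dim=1)                # Chunk node
    i, f, o = i.sigmoid(), f.sigmoid(), o.sigmoid()   # Sigmoid nodes
    g = g.tanh()                                      # Tanh node
    c = f * c_prev + i * g                            # Mul + Add nodes
    h = o * c.tanh()                                  # Tanh + Mul nodes
    return h, c

# Unrolled over a 400-step input sequence, this block repeats 400 times
# (the Concat node typically comes from stacking the per-step hidden
# states), so a graph with thousands of functions is expected.
```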
Is such a graph manageable for thnets? Can I use the exported model file directly in production?
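Regarding the RecursionError mentioned above: a minimal, generic Python workaround (an assumption, not a thnets feature) is to raise the interpreter's recursion limit before running the export, provided the error really comes from Python-level recursion while walking the unrolled graph.

```python
import sys

# Assumption: the exporter walks the unrolled autograd graph with
# ordinary Python recursion. CPython's default limit is 1000 frames,
# so a 400-step LSTM can exceed it.
sys.setrecursionlimit(20000)  # raise before running the thexport script

# Note: a very high limit can still crash the interpreter if the C
# stack overflows, so increase it only as far as needed.
```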