
[Relay][Frontend] Add MXNet converter for RNN layer ops #3125

Merged (1 commit) May 2, 2019

Conversation

@icemelon (Member) commented May 1, 2019

Add an MXNet frontend converter for the RNN/GRU/LSTM layer. Currently we only support inputs with a fixed sequence length and manually unroll the RNN layer by that many steps. Once Relay supports the `Any` dimension, we can change the converter to handle dynamic sequence lengths.
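The unrolling idea the PR describes can be sketched outside of TVM. The snippet below is an illustrative NumPy sketch (not the actual converter code): a vanilla RNN applied step by step over a statically known sequence length, which is what "manually unroll by sequence length steps" amounts to. All names here (`rnn_unrolled`, the weight shapes) are assumptions for illustration.

```python
import numpy as np

def rnn_unrolled(inputs, h0, w_ih, w_hh, b):
    """Unroll a vanilla RNN over a fixed sequence length.

    inputs: (seq_len, batch, in_dim); h0: (batch, hidden).
    Returns (stacked per-step outputs, final hidden state).
    """
    h = h0
    outputs = []
    for x_t in inputs:  # one explicit step per time slice: seq_len is static
        h = np.tanh(x_t @ w_ih + h @ w_hh + b)
        outputs.append(h)
    return np.stack(outputs), h

seq_len, batch, in_dim, hidden = 5, 2, 3, 4
rng = np.random.default_rng(0)
outs, h_final = rnn_unrolled(
    rng.normal(size=(seq_len, batch, in_dim)),
    np.zeros((batch, hidden)),
    rng.normal(size=(in_dim, hidden)),
    rng.normal(size=(hidden, hidden)),
    np.zeros(hidden),
)
print(outs.shape)  # (5, 2, 4)
```

Because the loop trip count is fixed at conversion time, each step can be emitted as ordinary Relay ops; a dynamic `seq_len` would instead require a shape that is unknown until runtime, hence the dependency on `Any` dimension support.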

@icemelon icemelon requested review from yidawang and vinx13 May 1, 2019 06:57
@yidawang (Contributor) left a comment
LGTM

@tqchen tqchen merged commit d39a4ea into apache:master May 2, 2019
@tqchen (Member) commented May 2, 2019

Thanks @yidawang @vinx13, this is now merged.

@icemelon icemelon deleted the fused_rnn branch May 6, 2019 22:40
wweic pushed a commit to wweic/tvm that referenced this pull request May 13, 2019
wweic pushed a commit to neo-ai/tvm that referenced this pull request May 13, 2019