
[relay][frontend] Return module from frontend parsers #3353

Merged
9 commits merged into apache:master on Jun 17, 2019

Conversation

@zhiics (Member) commented Jun 12, 2019

#3346

This PR returns a Module instead of an Expr from the frontend converters because:

  1. As we move to the pass manager, optimizations should usually be Module-to-Module.
  2. If a converter needs to work on a module, i.e. one containing more than one function, returning an Expr could be error-prone; the alternative would be to return a function with closures. Returning a Module is more convenient.

The necessary changes are made in the various executors so that they accept either a module or an expression (see the usage sketch after the converter list below).

Covered converters:

  • TensorFlow

  • TFLite

  • Keras

  • Caffe2

  • MXNet

  • Darknet

  • CoreML

  • ONNX
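
For illustration only, a minimal sketch of the new contract, assuming the ONNX converter and the 2019-era Relay Python API; onnx_model, shape_dict, and input_data are hypothetical placeholders supplied by the caller, not part of this PR:

```python
import tvm
from tvm import relay

# The converter now returns a relay.Module (plus the parameter dict)
# instead of a bare expression.
mod, params = relay.frontend.from_onnx(onnx_model, shape=shape_dict)

# The executors were updated to accept either form; here the module is
# passed in directly.
executor = relay.create_executor(kind="graph", mod=mod, ctx=tvm.cpu(0), target="llvm")
result = executor.evaluate()(input_data, **params)
```

The same pattern applies to the other converters listed above.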

@zhiics marked this pull request as ready for review June 12, 2019 20:50

@zhiics (Member Author) commented Jun 12, 2019

cc @icemelon9 @jroesch @wweic @tqchen @yongwww @yzhliu, please review.

@tqchen (Member) commented Jun 12, 2019

please rebase against the current master

Reviewed code (executor setup):

    assert isinstance(expr, Expr)
    self.mod[self.mod.entry_func] = expr

    def _make_executor(self, expr=None):
        expr = expr if expr else self.mod
Review comment (Member):

I think we should try to factor all this normalization code into one spot.

Review comment (Member):

IMO we should only take a module and remove the expr-related code.

Reply (Member Author):

I also think we should probably just take a module, but it looks like we need to refactor the interpreter and the graph_runtime somehow.
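
As an illustration of the normalization the reviewers discuss, here is a minimal sketch of a single helper that the executors could share, rather than repeating the isinstance checks above. The helper name _as_module is hypothetical, and it assumes relay.Module.from_expr is available in the Relay API of this era:

```python
from tvm import relay


def _as_module(obj):
    """Normalize executor input: pass modules through, wrap bare expressions."""
    if isinstance(obj, relay.Module):
        return obj
    if isinstance(obj, relay.Expr):
        # A bare expression (e.g. a Function) becomes the entry of a new module.
        return relay.Module.from_expr(obj)
    raise TypeError("expected relay.Expr or relay.Module, got %s" % type(obj))
```

Each executor could then call this once at construction time instead of special-casing expr and mod separately.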

@tqchen (Member) commented Jun 13, 2019

@zhiics please rebase against the latest master

@tqchen added the "status: need update" label Jun 13, 2019
@tqchen merged commit fa35104 into apache:master Jun 17, 2019
@tqchen added the "status: accepted" label and removed the "status: need update" label Jun 17, 2019
@tqchen (Member) commented Jun 17, 2019

Thanks, @jroesch @zhiics @icemelon9 , this PR is now merged
