[NNVM] Support argmax/argmin in tensorflow frontend #1514
Conversation
f2ea7ee to 0f3d7c4 (Compare)
ArgMax & ArgMin are good to go.
A few suggestions on the main function changes to look into.
try:
    from tensorflow.python.framework import tensor_util
except ImportError as e:
    raise ImportError(
        "Unable to import tensorflow which is required {}".format(e))

# Parse the nodes briefly to determine the preconditions:
Brief parsing is a good addition, thanks. It helps to list all missing ops in one place.
Please make it a separate function instead of inlining it.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
OK, I'll move it to a separate function.
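As a rough sketch of what such a helper could look like (the function name, the convert_map parameter, and the special-casing of Placeholder/Const are assumptions for illustration, not the PR's actual code):

```python
def parse_import_prerequisites(graph_def, convert_map):
    """Briefly scan a frozen GraphDef and collect op names the frontend
    cannot convert, so they can all be reported in one error message."""
    missing_operators = set()
    for node in graph_def.node:
        # Placeholder and Const are handled specially by the importer,
        # not through the op-conversion table.
        if node.op in ("Placeholder", "Const"):
            continue
        if node.op not in convert_map:
            missing_operators.add(node.op)
    return missing_operators
```

Calling this before the main conversion loop and raising a single NotImplementedError with the full set would give the user every unsupported op at once.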
@@ -940,7 +976,7 @@ def from_tensorflow(self, graph):
                    "Please freeze the graph with add_shapes=True")
            elif node.op == "Const":
                # Assuming first Const node as Graph Input node
-               if self._input_node == '':
+               if self._input_node == '' and not has_placeholders:
Technically nothing is done with _input_node, so _input_node has no significance here.
Since you touched this area, I suggest changing the below (or ignore this part in this PR).
A graph can be executed as ConstA + ConstB = C, where there are no placeholders and we don't need to convert a Const to a placeholder and expect input from the user (for example, the test cases).
Hence I suggest simplifying this to "all Placeholders are inputs" and "all Constants are params".
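As an illustration of that rule (a self-contained sketch, not this PR's code), every Placeholder would be collected as a graph input and every Const turned into a param whose value and shape come from its embedded tensor:

```python
from tensorflow.python.framework import tensor_util

def split_inputs_and_params(graph_def):
    """Sketch: treat all Placeholders as inputs and all Consts as params."""
    inputs, params = [], {}
    for node in graph_def.node:
        if node.op == "Placeholder":
            inputs.append(node.name)
        elif node.op == "Const":
            # MakeNdarray converts the embedded TensorProto to a numpy array,
            # so a constant's shape is always recoverable from the tensor itself.
            params[node.name] = tensor_util.MakeNdarray(node.attr["value"].tensor)
    return inputs, params
```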
Treating constants as inputs seemed strange to me too, so I am glad to see your suggestion.
I attempted to make this simplification, and it seems that at least some operations implicitly depend on Consts being inputs. For example, the _conv operation uses _input_shapes here. Input shape, in turn, is set only for input nodes (i.e. nodes which aren't listed in the params attribute, which is the key point). Thus, removing constants from inputs unfortunately breaks some tests involving convolution, since _input_shapes is no longer available for input[0].
I think this is a bug in _conv, since any operation should, AFAIK, work with any type of input node. Let me suggest that I fix it in a separate commit. As for this commit, I'd better remove the placeholder tweak completely to let the other changes go.
Yes, you could hold the placeholder change from this PR and let this go first.
We could open a topic at discuss.tvm.ai and discuss it further.
BTW, _input_shapes is the conclusion of a long discussion around the SAME padding implementation of tensorflow convolution. It exists primarily because the NNVM compilation process expects to build against fixed input shapes (which differs from tensorflow and other frameworks that accept variable shapes at runtime).
_input_shape for a constant is not a problem, as it's a constant and its shape is available from the Tensor.
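To illustrate the fixed-shape requirement, a usage sketch (the input name and shape are made up, and graph_def is assumed to be a frozen GraphDef; this is not code from the PR): the importer records _input_shapes because NNVM must know every shape at build time.

```python
import nnvm
import nnvm.compiler

# graph_def: a frozen TensorFlow GraphDef, e.g. exported with add_shapes=True.
sym, params = nnvm.frontend.from_tensorflow(graph_def)

# NNVM compiles for one concrete shape per input; variable-shaped TensorFlow
# inputs have to be pinned down here.
shape_dict = {"input": (1, 224, 224, 3)}
graph, lib, params = nnvm.compiler.build(
    sym, target="llvm", shape=shape_dict, dtype="float32", params=params)
```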
Done, postponing the placeholders part. I've started the topic https://discuss.tvm.ai/t/importing-tensorflow-models-placeholder-vs-const-issues/561 for further discussion.
d50bf27 to fa69b93 (Compare)
Thanks @grwlf. LGTM.
Thanks @grwlf @srkreddy1238, this is merged.
This PR aims to improve the Tensorflow frontend by:
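As a rough illustration of the added argmax/argmin support (a sketch with assumed input names and shapes, not the PR's actual test code), a small tf.argmax graph can be built and imported like this:

```python
import numpy as np
import tensorflow as tf
import nnvm

data = np.random.uniform(size=(8, 4)).astype("float32")

with tf.Graph().as_default():
    inp = tf.placeholder(tf.float32, shape=data.shape, name="in_data")
    tf.argmax(inp, axis=1, name="argmax")
    # add_shapes=True attaches output shapes, which the frontend relies on.
    graph_def = tf.get_default_graph().as_graph_def(add_shapes=True)

sym, params = nnvm.frontend.from_tensorflow(graph_def)
```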