
[Relay, Topi, TF Frontend] Isfinite operator #4981

Merged: 25 commits merged into apache:master on Mar 23, 2020

Conversation

@maheshambule (Contributor) commented Mar 3, 2020

Added support for the isfinite operator.

@yongwww, @zhiics, @kevinthesun, @FrozenGene
Could you please review the PR?

@maheshambule (Contributor, Author) commented:

Please help in reviewing this: @jwfromm, @huajsj, @yzhliu, @zxy844288792

@@ -38,6 +38,8 @@
from tvm import te
from tvm import relay
import tvm.relay.testing.tf as tf_testing
from tensorflow.python.framework import dtypes

Reviewer (Contributor):

We don't need to import this, just use tf.float32 and tf.float64 instead.
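
For reference, a minimal sketch of what the suggested change looks like in a TF frontend test. This is illustrative only; the helper name _test_forward_isfinite and the comparison step are stand-ins for the test suite's actual helpers, not exact code from this PR.

```python
import tensorflow as tf

def _test_forward_isfinite(shape, dtype):
    # Build a tiny TF graph that applies tf.math.is_finite to a placeholder.
    with tf.Graph().as_default():
        in_data = tf.compat.v1.placeholder(dtype, shape, name="in_data")
        tf.math.is_finite(in_data, name="isfinite")
        # The real test would compare the TVM frontend's output against
        # TensorFlow's result here (e.g. via the suite's compare helper).

# tf.float32 / tf.float64 are used directly; no dtypes import is needed.
for dtype in (tf.float32, tf.float64):
    _test_forward_isfinite((8, 8), dtype)
```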

@maheshambule (Author) replied:

removed

@@ -58,6 +58,13 @@ TVM_DLL PrimExpr max_value(const DataType& dtype);
*/
TVM_DLL PrimExpr min_value(const DataType& dtype);

/*!
* Get the infinity.

Reviewer (Contributor):

"Get the value of infinity."

@maheshambule (Author) replied:

corrected

@@ -829,6 +829,7 @@ class CallNode : public PrimExprNode {
static constexpr const char* glsl_texture_store = "glsl_texture_store";
static constexpr const char* prefetch = "prefetch";
static constexpr const char* isnan = "isnan";

Reviewer (Contributor):

Is it worth adding an isinf operator to this PR? That way we can check for both halves of isfinite separately if needed. I'm not sure if you'd ever need to check for just infinity rather than both nan and infinity though.
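
For reference, the relationship behind this question, sketched with NumPy rather than TVM: isfinite is exactly the complement of "isinf or isnan", so having separate ops lets you test either half on its own.

```python
import numpy as np

x = np.array([1.0, np.inf, -np.inf, np.nan])

# Rebuild isfinite from its two "halves" and check the equivalence.
finite_from_parts = ~(np.isinf(x) | np.isnan(x))
assert np.array_equal(finite_from_parts, np.isfinite(x))
print(np.isfinite(x))  # [ True False False False]
```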

@maheshambule (Author) replied:

Added support for isinf as well.

@maheshambule (Contributor, Author) commented:

@jwfromm Thanks for the review and comments. I have added support for isinf op as well.
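
For anyone trying the new ops, a minimal Relay-level usage sketch. It assumes a TVM build that includes this PR; executor API details vary slightly between TVM versions.

```python
import numpy as np
import tvm
from tvm import relay

# A function returning both masks so isfinite and isinf can be compared.
x = relay.var("x", shape=(4,), dtype="float32")
func = relay.Function([x], relay.Tuple([relay.isfinite(x), relay.isinf(x)]))
mod = tvm.IRModule.from_expr(func)

data = np.array([1.0, np.inf, -np.inf, np.nan], dtype="float32")
finite_mask, inf_mask = (
    relay.create_executor("graph", mod=mod, target="llvm").evaluate()(data)
)
print(finite_mask.numpy())  # expected: [ True False False False]
print(inf_mask.numpy())     # expected: [False  True  True False]
```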

@jwfromm (Contributor) left a comment:

Thanks for the changes! LGTM

@maheshambule (Contributor, Author) commented:

@jwfromm could you help in merging this?

@jwfromm (Contributor) commented Mar 19, 2020

@masahi, @tqchen, @FrozenGene, I think this PR is ready to merge. Can one of you take a quick look and pull the trigger?

@masahi (Member) commented Mar 20, 2020

What is the use case of this op in TensorFlow? Having to check finiteness inside DL models seems very weird to me. If there is indeed a possibility of infinity showing up, something (the model or the data) is broken anyway, so it is not a problem for us.

Error handling should be done in earlier layers before TVM.

@maheshambule (Contributor, Author) commented:

@masahi, yes, generally it does not make sense to use a finiteness check inside a DL model; it is typically used in preprocessing of input data, not in the model itself. We received a model from a customer that uses this operator. We do not know the exact use case, but it shows there is a possibility of this operator appearing in DL models. Also, would it be useful considering training support in TVM?

@masahi (Member) commented Mar 20, 2020

If it were only a TF frontend change, it would be totally fine. But you also added a new Relay/TOPI op, and even new TIR intrinsics, just to support that customer's model. That makes me concerned.

TVM is supposed to support other frameworks, not just TF, and if we start adding these framework-specific corner-case ops, it quickly leads to code bloat. I prefer adding new ops only if they could be useful across different frameworks.

People use TVM because they want to go fast. Having a finiteness check doesn't help with that. You should ask your customer whether they can remove it, or whether it is fine for us to ignore the Isfinite op in our frontend.

> Also, will it be useful considering training support in TVM?

I don't know if it is useful for training. Maybe @MarisaKirisame can comment.

@masahi (Member) commented Mar 20, 2020

@maheshambule also check out the proposal by @soiferj below. If you need to support TF models, this should be a good solution.
https://discuss.tvm.ai/t/discuss-running-graphs-when-ops-arent-supported/4482

@FrozenGene (Member) commented:
Generally speaking, I think isfinite / isnan should be provided in our TIR, as they are in Halide. @tqchen could comment more.
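
A minimal TE/TIR-level sketch of the intrinsic in use, assuming the tir.isfinite Python binding this PR adds; the te.create_schedule style shown here matches the API of that era and may differ in newer TVM releases.

```python
import numpy as np
import tvm
from tvm import te

n = 4
A = te.placeholder((n,), dtype="float32", name="A")
# Elementwise finiteness mask computed via the TIR intrinsic.
B = te.compute(A.shape, lambda i: tvm.tir.isfinite(A[i]), name="B")
s = te.create_schedule(B.op)
f = tvm.build(s, [A, B], target="llvm")

a = tvm.nd.array(np.array([1.0, np.inf, -np.inf, np.nan], dtype="float32"))
b = tvm.nd.array(np.zeros(n, dtype="bool"))
f(a, b)
print(b.numpy())  # expected: [ True False False False]
```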

@masahi (Member) commented Mar 21, 2020

I also found that IsInf and IsNaN are on the ONNX operator list: https://github.com/onnx/onnx/blob/master/docs/Operators.md#IsInf

Still not sure what the use case is. We have to use the VM runtime, which has some overhead, if these ops are used together with if/else.
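
To illustrate the control-flow point: once a finiteness check feeds an if/else, the plain graph executor no longer suffices and the Relay VM (or interpreter) has to run the model. A hedged sketch, assuming this PR's relay.isfinite:

```python
import numpy as np
import tvm
from tvm import relay

x = relay.var("x", shape=(), dtype="float32")
# Data-dependent branch: double x if it is finite, otherwise return 0.
body = relay.If(relay.isfinite(x), x * relay.const(2.0), relay.const(0.0))
mod = tvm.IRModule.from_expr(relay.Function([x], body))

# The "vm" executor handles relay.If; the static graph executor does not.
run = relay.create_executor("vm", mod=mod, target="llvm").evaluate()
print(run(np.array(3.0, dtype="float32")))     # expected: 6.0
print(run(np.array(np.inf, dtype="float32")))  # expected: 0.0
```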

@masahi (Member) commented Mar 21, 2020

@FrozenGene I grepped through the Halide code base, isinf/isnan don't seem to be in Halide's IR, but they use std::isnan and std::isfinite during codegen. They also appear inside generated code for C source based backend (OpenCL etc.)

@FrozenGene (Member) commented Mar 21, 2020

> @FrozenGene I grepped through the Halide code base, isinf/isnan don't seem to be in Halide's IR, but they use std::isnan and std::isfinite during codegen. They also appear inside generated code for C source based backend (OpenCL etc.)

halide/Halide@697b4be: this commit adds nan / inf support in Halide.

@masahi (Member) commented Mar 23, 2020

@maheshambule There is a conflict, please rebase

@masahi (Member) commented Mar 23, 2020

Merging it for now, thanks @maheshambule @jwfromm @FrozenGene

@masahi merged commit 9037f4e into apache:master on Mar 23, 2020

@maheshambule (Contributor, Author) commented:

Thanks @masahi.
I will try to find out the exact use case and will let you know. Also, the proposal to add support for falling back to the frontend runtime looks interesting; I will look into it once I get some time.

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Apr 16, 2020
* isfinite doc update

* isfinit expr

* isfinit expr

* isfinite schedule reg

* isfinite python binding

* isfinite python binding

* relay register isfinite

* isfinite type relation

* intrin isfinite

* topi isfinite

* testcase topi isfinite

* tf frontend isfinite

* tf frontend isfinite testcase

* test case relay isfinite

* small fixes

* test forward tf isfinite

* test cases injective for cuda

* remove float16 test case

* add support for isinf

* remove unwanted import

* fix conflict
zhiics pushed a commit to neo-ai/tvm that referenced this pull request Apr 17, 2020 (same squashed commit list as above)