[Relay][Op] Add unbiased variance op and corresponding support in pytorch frontend #6232

Merged 1 commit into apache:master on Aug 10, 2020

Conversation

shiwenloong (Contributor)

Unbiased variance uses N-1 as the divisor in the calculation, where N is the number of elements being reduced. torch.std and torch.var are unbiased by default in PyTorch, but the frontend currently cannot convert these unbiased ops.
This PR adds an unbiased variance op and the corresponding support in the PyTorch frontend.
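
For illustration, a small PyTorch snippet showing the divisor difference (not part of this PR, just the semantics being discussed):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0, 4.0])  # N = 4, mean = 2.5

# PyTorch defaults to the unbiased estimator: divisor N - 1.
print(torch.var(x))                  # 1.6667 = sum((x - mean)**2) / 3

# unbiased=False selects the biased estimator: divisor N.
print(torch.var(x, unbiased=False))  # 1.25   = sum((x - mean)**2) / 4
```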

@masahi @junrushao1994 Please help review this PR. Thanks.

(Resolved review threads on python/tvm/relay/frontend/pytorch.py and python/tvm/relay/op/reduce.py)

masahi commented Aug 7, 2020

Do we need to introduce new Relay ops? Can we just add an unbiased argument to the existing ops, like PyTorch?


shiwenloong commented Aug 8, 2020

> Do we need to introduce new Relay ops? Can we just add an unbiased argument to the existing ops, like PyTorch?

The existing variance op is a reduce op, and all reduce ops take axis, keepdims, and exclude as attributes. I think adding an unbiased argument to the variance op would break that consistency. It would also take extra work to separate the variance op from the other reduce ops.
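
To make the consistency point concrete, here is a simplified sketch of the pattern every wrapper in python/tvm/relay/op/reduce.py follows (paraphrased, not the exact source; `_make` is the module's generated builder and `mean` is defined alongside):

```python
# Paraphrased sketch of python/tvm/relay/op/reduce.py: every reduce
# wrapper takes exactly (data, axis, keepdims, exclude) and forwards
# to a generated builder in the module-level `_make` namespace.

def sum(data, axis=None, keepdims=False, exclude=False):
    axis = [axis] if isinstance(axis, int) else axis
    return _make.sum(data, axis, keepdims, exclude)

def variance(data, axis=None, keepdims=False, exclude=False):
    axis = [axis] if isinstance(axis, int) else axis
    m = mean(data, axis, True, exclude)
    return _make._variance(data, m, axis, keepdims, exclude)

# Giving variance an extra `unbiased` flag would make it the only
# function here whose signature deviates from this pattern.
```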


masahi commented Aug 8, 2020

> I think adding an unbiased argument to the variance op would break that consistency. It would also take extra work to separate the variance op from the other reduce ops.

I'm not completely sure what you mean by the above, but how about adding a VarianceAttrs with default unbiased=false? I think that should fix the coupling between the variance op and the reduce ops you are referring to.

Anyway, adding two new ops just for the sake of adding unbiased support sounds a bit overkill to me.

shiwenloong (Contributor, Author)

> I think adding an unbiased argument to the variance op would break that consistency. It would also take extra work to separate the variance op from the other reduce ops.

> I'm not completely sure what you mean by the above, but how about adding a VarianceAttrs with default unbiased=false? I think that should fix the coupling between the variance op and the reduce ops you are referring to.

> Anyway, adding two new ops just for the sake of adding unbiased support sounds a bit overkill to me.

If we follow this approach, we need to add more than just VarianceAttrs. A lot of code and tests related to the variance operator would need to change. For example, we would also need to add a variance_shape_func in python/tvm/relay/op/_reduce.py, and std and variance would have to be separated from the other reduce ops in python/tvm/relay/op/reduce.py because their parameters differ.

In my opinion, although adding two unbiased operators seems a bit redundant, it has little impact on the code structure and is less likely to cause bugs.


masahi commented Aug 8, 2020

I mean, the new VarianceAttrs should also have all the attributes of ReduceAttrs. The underlying implementation of the variance op can continue to be the reduce op.

Isn't updating VarianceCompute (like you did) and MakeVariance (to add an unbiased param with default False), with corresponding changes in reduce.py, enough? I don't see why we'd need to register a new shape func for the variance op. I don't think it would break anything either.

Sorry I didn't look into the details, so I may be missing something.
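
For reference, the arithmetic the updated compute would need is just a different divisor after the reduction. A NumPy sketch of the intended semantics (variance_ref is a hypothetical reference function, not TVM code):

```python
import numpy as np

def variance_ref(data, axis=None, unbiased=False):
    # NumPy reference for the intended op semantics (not TVM code):
    # reduce the squared deviations, then choose the divisor.
    m = np.mean(data, axis=axis, keepdims=True)
    sq_sum = np.sum((data - m) ** 2, axis=axis)
    count = data.size // np.asarray(sq_sum).size  # elements reduced per output
    return sq_sum / (count - 1 if unbiased else count)

x = np.arange(12, dtype="float32").reshape(3, 4)
assert np.allclose(variance_ref(x, axis=1, unbiased=True), np.var(x, axis=1, ddof=1))
assert np.allclose(variance_ref(x, axis=1), np.var(x, axis=1))
```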

shiwenloong (Contributor, Author)

> I mean, the new VarianceAttrs should also have all the attributes of ReduceAttrs. The underlying implementation of the variance op can continue to be the reduce op.

> Isn't updating VarianceCompute (like you did) and MakeVariance (to add an unbiased param with default False), with corresponding changes in reduce.py, enough? I don't see why we'd need to register a new shape func for the variance op. I don't think it would break anything either.

> Sorry I didn't look into the details, so I may be missing something.

I have tried the VarianceAttrs approach in the current implementation. There were fewer problems than I expected. But the inconsistency in reduce.py still exists, because std and variance take one more parameter than all the other functions in the file. I am not sure whether this is a noteworthy issue.
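
For reference, the user-facing shape of the API under this approach (a sketch consistent with the discussion; the exact merged signatures may differ slightly):

```python
from tvm import relay

x = relay.var("x", shape=(3, 4), dtype="float32")

# std and variance take one keyword argument that the other
# reduce ops in reduce.py do not have:
v_biased = relay.variance(x, axis=1)                   # divisor N
v_unbiased = relay.variance(x, axis=1, unbiased=True)  # divisor N - 1
s_unbiased = relay.std(x, axis=1, unbiased=True)
```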


masahi commented Aug 10, 2020

> But the inconsistency in reduce.py still exists, because std and variance take one more parameter than all the other functions in the file. I am not sure whether this is a noteworthy issue.

They look fine to me. I don't think a small API change like this is a problem at all. Being in the same file doesn't mean they have to have the same signature.

masahi merged commit 7926a5d into apache:master on Aug 10, 2020

masahi commented Aug 10, 2020

Thanks @shiwenloong @leandron
