[Relay][Op] Add unbiased variance op and corresponding support in pytorch frontend #6232
Conversation
Do we need to introduce new Relay ops? Can we just add …
The existing variance op is a reduce op, and all reduce ops take …
I'm not completely sure what you mean by the above, but how about adding …? Anyway, adding two new ops just for the sake of adding …
If we follow this modification method, we need to add more than just …. In my opinion, although adding two unbiased operators seems a bit redundant, it has little impact on the code structure and has a lower probability of causing bugs.
I mean, the new …. Isn't updating …? Sorry, I didn't look into the details, so I may be missing something.
I tried the way of adding …
They look fine to me. I don't think a small API change like this is a problem at all. Being in the same file doesn't mean they have to have the same signature.
Thanks @shiwenloong @leandron |
Unbiased variance uses `N - 1` as the divisor in the calculation, where `N` is the number of elements. `torch.std` and `torch.var` are unbiased by default in PyTorch, and these unbiased ops currently can't be converted. This PR adds an unbiased variance op and corresponding support in the PyTorch frontend.
@masahi @junrushao1994 Please help to review this PR. Thanks.