matmul and matmul_v2 refactor #42732
Conversation
Your PR has been submitted successfully. Thank you for contributing to the open source project!
Great work
```diff
@@ -519,43 +519,6 @@ def init_data_type(self):
         self.data_type_ = np.int8


-class TestMatMulOpTransposeReshapeTransposeAxisNotSupportedException(
```
Why are these tests deleted?
These tests only checked that the test would fail after setting incorrect values for `fuse_reshape_out` and `fuse_transpose_out`. I deleted them, because the only way of setting these parameters is via the fuse pass, which has PADDLE_ENFORCEs to prevent such a case. Also, we have a separate unit test dedicated to the matmul+transpose+reshape fuse pass.
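To illustrate the point above, here is a minimal, generic sketch of an enforce-style precondition check performed once in the fuse pass, so the operator no longer needs to duplicate the same condition. The `Enforce` helper, the function name, and the rank-4 constraint are assumptions for illustration; this is not Paddle's actual `PADDLE_ENFORCE` macro.

```cpp
#include <cassert>
#include <stdexcept>
#include <string>
#include <vector>

// Generic enforce-style check: throw with a message when a precondition fails.
// Stands in for PADDLE_ENFORCE-like validation done in the fuse pass.
inline void Enforce(bool cond, const std::string& msg) {
  if (!cond) throw std::invalid_argument(msg);
}

// Hypothetical fuse-pass validation: the fused kernel is assumed to support
// only rank-4 transpose axis permutations, so reject anything else up front.
void ValidateFusedTransposeAxis(const std::vector<int>& axis) {
  Enforce(axis.size() == 4,
          "fused transpose axis must have rank 4, got " +
              std::to_string(axis.size()));
}
```

Because the attribute can only be set by the pass that runs this check, re-validating it inside the operator would be redundant.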
Really nice that you have spotted that! Thanks!
LGTM! It's good to see another really needed refactoring done
Maybe you need to reassign the license?
I checked the CLA assistant manually and everything looks fine. Unfortunately, despite the renewal of the license, the CI is still not updated. Can this be merged as it is, or should I trigger CI with an empty commit?
@baoachun please review
We are sorry. After repeated discussion, your PR has not yet met the merging standard. Please read the PaddlePaddle native operator development specification. You may submit a new PR; we will close this one for now. Thank you for your contribution.
LGTM
paddle/phi/core/ddim.cc
Outdated
```diff
@@ -171,11 +171,25 @@ DDim stride_numel(const DDim& ddim) {
   return strides;
 }

-DDim DDim::reshape(const std::vector<int>& shape) const {
+DDim DDim::reshape(const std::vector<int>& new_shape) const {
+  std::vector<int> shape = new_shape;
```
Creating a new vector here and doing an if/else branch in core/ddim.cc will influence every CPU/GPU op, I think?
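For context on the copy-then-modify pattern being questioned, here is a self-contained sketch of why a reshape routine might copy its `const` argument: placeholder entries in the target shape (assumed here: `0` keeps the input dimension at that axis, `-1` is inferred from the remaining element count) have to be resolved in a mutable local copy. The function name and placeholder semantics are assumptions for illustration, not a reproduction of Paddle's `ddim.cc`.

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <numeric>
#include <vector>

// Hypothetical reshape helper: copies the caller's target shape so that
// placeholder entries (0 and -1) can be resolved in place.
std::vector<int> Reshape(const std::vector<int>& in_dims,
                         const std::vector<int>& new_shape) {
  std::vector<int> shape = new_shape;  // local copy, safe to modify
  // Total number of elements in the input shape.
  int64_t in_numel = std::accumulate(in_dims.begin(), in_dims.end(),
                                     int64_t{1}, std::multiplies<int64_t>());
  int unknown = -1;          // index of the single -1 entry, if any
  int64_t known_product = 1;  // product of all resolved dimensions
  for (size_t i = 0; i < shape.size(); ++i) {
    if (shape[i] == 0) {
      shape[i] = in_dims[i];  // 0 copies the input dimension at this axis
    }
    if (shape[i] == -1) {
      unknown = static_cast<int>(i);
    } else {
      known_product *= shape[i];
    }
  }
  if (unknown >= 0) {
    // -1 is inferred so the total element count is preserved.
    shape[unknown] = static_cast<int>(in_numel / known_product);
  }
  return shape;
}
```

The extra cost per call is one small vector copy plus a linear scan over the target shape, which is what the comment above is weighing against the convenience of a shared implementation.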
9dbb43c
Your PR has been merged into the Paddle repository. Please follow the subsequent test results.
PR types
Others
PR changes
OPs
Describe
- refactored the `reshape` function, so it can be reused,
- removed duplicated `PADDLE_ENFORCE` checks, which were checked both in the fuse pass and the operator, with the same conditions,
- removed the tests in which `fuse_transpose_out` and `fuse_reshape_out` are intentionally set incorrectly.