
[AMP] add fp16&bf16 support for flatten op #52035

Merged
merged 5 commits into from
Mar 28, 2023

Conversation

@thisjiang (Contributor) commented Mar 23, 2023

PR types

Others

PR changes

Others

Describe

Add float16 & bfloat16 test cases to the unit tests for the flatten op.

@paddle-bot commented Mar 23, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

self.dtype = "float64"

def init_input_data(self):
x = np.random.random(self.in_shape).astype("float32")
@ZzSean (Contributor) commented Mar 27, 2023

Here the data is first generated as float32 for every dtype. It should use self.dtype instead, with only uint16 handled as a special case.

@thisjiang (Author) replied:

Done
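The fix agreed on above can be sketched as follows. This is a minimal, hypothetical skeleton, not the PR's actual test class: `convert_float_to_uint16` here is a stand-in for Paddle's test helper of the same name (bfloat16 keeps the upper 16 bits of a float32's bit pattern), and the class name and attribute values are illustrative only.

```python
import numpy as np

def convert_float_to_uint16(x):
    # bfloat16 is the upper 16 bits of the float32 bit pattern; this is a
    # minimal stand-in for Paddle's test helper of the same name.
    return np.right_shift(x.astype(np.float32).view(np.uint32), 16).astype(np.uint16)

class FlattenTestSketch:
    """Hypothetical skeleton of the suggested init_input_data fix."""
    in_shape = (3, 2, 4)
    dtype = "uint16"  # set by init_test_dtype() in the real test

    def init_input_data(self):
        if self.dtype == "uint16":  # bfloat16 is stored as uint16
            x = convert_float_to_uint16(
                np.random.random(self.in_shape).astype("float32"))
        else:
            # all other dtypes can be generated directly with self.dtype
            x = np.random.random(self.in_shape).astype(self.dtype)
        self.inputs = {"X": x}
```

This keeps the float32-then-convert path only for bfloat16, while float16 and float64 get their native dtype directly.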

@@ -31,7 +32,8 @@ def setUp(self):
         self.stop_axis = -1
         self.skip_cinn()
         self.init_test_case()
-        self.inputs = {"X": np.random.random(self.in_shape).astype("float64")}
+        self.init_test_dtype()
+        self.init_input_data()
         self.init_attrs()
         self.outputs = {
             "Out": self.inputs["X"].reshape(self.new_shape),
Contributor comment:

The output also needs special handling for uint16, via convert_float_to_uint16.

@thisjiang (Author) replied:

self.inputs["X"] is already converted to uint16 when it is generated, so no further conversion is needed here.
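The author's point can be checked directly: since the expected output is built as `self.inputs["X"].reshape(self.new_shape)`, and NumPy's reshape preserves dtype, an input already converted to uint16 yields a uint16 output with no second conversion. A small sketch, using a hand-rolled bfloat16 truncation as a stand-in for Paddle's convert_float_to_uint16 helper:

```python
import numpy as np

# Assumed stand-in for Paddle's convert_float_to_uint16 test helper:
# bfloat16 keeps the upper 16 bits of the float32 bit pattern.
x = np.random.random((3, 2, 4)).astype(np.float32)
x_bf16 = np.right_shift(x.view(np.uint32), 16).astype(np.uint16)

# reshape only reinterprets the buffer's shape, so the flatten output
# keeps the uint16 (bfloat16) payload; no second conversion is needed.
out = x_bf16.reshape(3, 8)
assert out.dtype == np.uint16
```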

@ZzSean (Contributor) left a comment:

LGTM

@thisjiang thisjiang merged commit a33a4d0 into PaddlePaddle:develop Mar 28, 2023
@thisjiang thisjiang deleted the add_flatten_fp16_support branch March 28, 2023 03:12