**Describe the bug**
The `asin_bw` function returns an invalid gradient value.
**To Reproduce**
Steps to reproduce the behavior:

Test file: `/tests/tt_eager/python_api_testing/unit_testing/backward_ops/test_backward_asin.py`
```python
# SPDX-FileCopyrightText: © 2023 Tenstorrent Inc.
# SPDX-License-Identifier: Apache-2.0

import torch
import pytest
import tt_lib
from tests.tt_eager.python_api_testing.unit_testing.backward_ops.utility_funcs import data_gen_pt_tt, compare_results


def data_gen_pt_tt(input_shapes, device, required_grad=False, val=1):
    pt_tensor = (torch.ones(input_shapes, requires_grad=required_grad) * val).bfloat16()
    tt_tensor = (
        tt_lib.tensor.Tensor(pt_tensor, tt_lib.tensor.DataType.BFLOAT16).to(tt_lib.tensor.Layout.TILE).to(device)
    )
    return pt_tensor, tt_tensor


@pytest.mark.parametrize(
    "input_shapes",
    ((torch.Size([1, 1, 32, 32])),),
)
def test_bw_asin(input_shapes, device):
    in_data, input_tensor = data_gen_pt_tt(input_shapes, device, True, val=0)
    grad_data, grad_tensor = data_gen_pt_tt(input_shapes, device, False, val=0.95)
    print("input_tensor", input_tensor)  # 0
    print("grad_tensor", grad_tensor)  # 0.94922

    pyt_y = torch.asin(in_data)

    tt_output_tensor_on_device = tt_lib.tensor.asin_bw(grad_tensor, input_tensor)

    in_data.retain_grad()
    pyt_y.backward(gradient=grad_data)

    golden_tensor = [in_data.grad]

    comp_pass = compare_results(tt_output_tensor_on_device, golden_tensor)
    print("tt_output_tensor_on_device", tt_output_tensor_on_device)  # 2.96875
    print("golden_tensor", golden_tensor)  # 0.9492

    assert comp_pass
```
Run:

```bash
pytest ./tests/tt_eager/python_api_testing/unit_testing/backward_ops/test_backward_asin.py
```
Observed output:

```
input_tensor ttnn.Tensor([[[[ 0.00000, 0.00000, ..., 0.00000, 0.00000],
                            [ 0.00000, 0.00000, ..., 0.00000, 0.00000],
                            ...,
                            [ 0.00000, 0.00000, ..., 0.00000, 0.00000],
                            [ 0.00000, 0.00000, ..., 0.00000, 0.00000]]]],
                          shape=Shape([1, 1, 32, 32]), dtype=DataType::BFLOAT16, layout=Layout::TILE)
grad_tensor ttnn.Tensor([[[[ 0.94922, 0.94922, ..., 0.94922, 0.94922],
                           [ 0.94922, 0.94922, ..., 0.94922, 0.94922],
                           ...,
                           [ 0.94922, 0.94922, ..., 0.94922, 0.94922],
                           [ 0.94922, 0.94922, ..., 0.94922, 0.94922]]]],
                         shape=Shape([1, 1, 32, 32]), dtype=DataType::BFLOAT16, layout=Layout::TILE)
tt_output_tensor_on_device [ttnn.Tensor([[[[ 2.96875, 2.96875, ..., 2.96875, 2.96875],
                                           [ 2.96875, 2.96875, ..., 2.96875, 2.96875],
                                           ...,
                                           [ 2.96875, 2.96875, ..., 2.96875, 2.96875],
                                           [ 2.96875, 2.96875, ..., 2.96875, 2.96875]]]],
                                         shape=Shape([1, 1, 32, 32]), dtype=DataType::BFLOAT16, layout=Layout::TILE)]
golden_tensor [tensor([[[[0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492],
                         [0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492],
                         [0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492],
                         ...,
                         [0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492],
                         [0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492],
                         [0.9492, 0.9492, 0.9492, ..., 0.9492, 0.9492, 0.9492]]]], dtype=torch.bfloat16)]
```
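For context, the analytic gradient of asin is d/dx asin(x) = 1/sqrt(1 - x²), so with input x = 0 the backward pass should pass the incoming gradient through unchanged (≈ 0.949 in bfloat16), which matches the golden tensor but not the device result of 2.96875. The following is a minimal PyTorch-only sketch of that check (no device tensors involved); the `expected` variable is just for illustration and is not part of the test suite:

```python
import torch

# Analytic check: d/dx asin(x) = 1 / sqrt(1 - x^2).
# At x = 0 the factor is exactly 1, so the backward output should equal the incoming gradient.
x = torch.zeros(1, 1, 32, 32, requires_grad=True)
grad = torch.full((1, 1, 32, 32), 0.95)

torch.asin(x).backward(gradient=grad)

expected = grad / torch.sqrt(1 - x.detach() ** 2)  # == 0.95 everywhere
print(torch.allclose(x.grad, expected))            # True; the device value 2.96875 does not match
```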
**Expected behavior**
`asin_bw` should return the correct gradient value.
Fix: #6536: Fix asin backward op
Commits: 2c776db, 0da45af, 11fb243, 88ab23f, 903fb49
Merged to main.