[Bug Report] invalid asin result #6722

Closed
Tracked by #6445 ...
hschoi4448 opened this issue Mar 25, 2024 · 10 comments
Assignees
Labels
bug (Something isn't working), forward, moreh (moreh contribution), op_cat: eltwise, P1

Comments

@hschoi4448
Contributor

Describe the bug

The asin function returns an invalid value: for an input of 90 (outside asin's domain of [-1, 1]), the TT op returns a huge finite number while PyTorch returns NaN.

To Reproduce
Steps to reproduce the behavior:

  1. Copy and paste the code below:
# SPDX-FileCopyrightText: © 2023 Tenstorrent Inc.
# SPDX-License-Identifier: Apache-2.0

import torch
import pytest
import tt_lib
from tests.tt_eager.python_api_testing.sweep_tests import pytorch_ops

# Local helper: build a constant-filled bfloat16 PyTorch tensor and its
# tiled TT counterpart on the target device.
def data_gen_pt_tt(input_shapes, device, required_grad=False, val=1):
    pt_tensor = (torch.ones(input_shapes, requires_grad=required_grad) * val).bfloat16()
    tt_tensor = (
        tt_lib.tensor.Tensor(pt_tensor, tt_lib.tensor.DataType.BFLOAT16).to(tt_lib.tensor.Layout.TILE).to(device)
    )
    return pt_tensor, tt_tensor

@pytest.mark.parametrize(
    "input_shapes",
    (
        (torch.Size([1, 1, 32, 32])),
    ),
)
def test1(input_shapes, device):
    # 90 is far outside asin's domain of [-1, 1], so the golden output is NaN.
    val = 90
    in_data, input_tensor = data_gen_pt_tt(input_shapes, device, True, val=val)

    print("input_tensor", input_tensor)

    golden_tensor = pytorch_ops.asin(in_data)
    tt_output_tensor_on_device = tt_lib.tensor.asin(input_tensor)

    print("tt_output_tensor_on_device", tt_output_tensor_on_device)
    print("golden_tensor", golden_tensor)
    
  2. Run with pytest:
input_tensor ttnn.Tensor([[[[90.00000, 90.00000,  ..., 90.00000, 90.00000],
               [90.00000, 90.00000,  ..., 90.00000, 90.00000],
               ...,
               [90.00000, 90.00000,  ..., 90.00000, 90.00000],
               [90.00000, 90.00000,  ..., 90.00000, 90.00000]]]], shape=Shape([1, 1, 32, 32]), dtype=DataType::BFLOAT16, layout=Layout::TILE)
tt_output_tensor_on_device ttnn.Tensor([[[[70039981404865953792.00000, 70039981404865953792.00000,  ..., 70039981404865953792.00000, 70039981404865953792.00000],
               [70039981404865953792.00000, 70039981404865953792.00000,  ..., 70039981404865953792.00000, 70039981404865953792.00000],
               ...,
               [70039981404865953792.00000, 70039981404865953792.00000,  ..., 70039981404865953792.00000, 70039981404865953792.00000],
               [70039981404865953792.00000, 70039981404865953792.00000,  ..., 70039981404865953792.00000, 70039981404865953792.00000]]]], shape=Shape([1, 1, 32, 32]), dtype=DataType::BFLOAT16, layout=Layout::TILE)
golden_tensor tensor([[[[nan, nan, nan,  ..., nan, nan, nan],
          [nan, nan, nan,  ..., nan, nan, nan],
          [nan, nan, nan,  ..., nan, nan, nan],
          ...,
          [nan, nan, nan,  ..., nan, nan, nan],
          [nan, nan, nan,  ..., nan, nan, nan],
          [nan, nan, nan,  ..., nan, nan, nan]]]], dtype=torch.bfloat16,
       grad_fn=<AsinBackward0>)

Expected behavior
The TT output should match the PyTorch golden tensor: asin is undefined for inputs outside [-1, 1], so for an input of 90 the result should be NaN (as PyTorch returns), not a large finite value.
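
For reference, a minimal sketch (plain PyTorch, no TT device required) of asin's domain behavior:

import torch

# asin is defined only on [-1, 1]; values outside that interval yield NaN.
x = torch.tensor([0.5, 1.0, 90.0])
print(torch.asin(x))  # tensor([0.5236, 1.5708, nan])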


Please complete the following environment information:

  • OS: [e.g. Ubuntu 20.04]
  • Version of software (e.g. commit 2e5b96f)
  • Device: wormhole_b0


@umadevimcw
Contributor

@hschoi4448 Can you label whether this bug belongs to the forward or backward op?
It will help us categorise it.

@hschoi4448
Contributor Author

> @hschoi4448 Can you label whether this bug belongs to the forward or backward op? It will help us categorise it.

Got it. However, I can't find the 'forward' label.

@umadevimcw
Contributor

umadevimcw commented Mar 25, 2024

@jliangTT Can we create a label named "forward" and use it? Also, it would be helpful if a P0/P1/P2 priority were added to the issue.

@jliangTT

Let's make bug reports from @hschoi4448 P1 by default. We can spend Tuesday morning looking at the overall work priority and load-balancing.

@sangwon-chae added the moreh (moreh contribution) label on Mar 26, 2024
@umadevimcw
Contributor

umadevimcw commented Mar 26, 2024

@hschoi4448 Can you add labels to the other issues you created? (Use the same labels added to this issue, and assign them to @umadevimcw.) Thanks in advance!

@umadevimcw
Contributor

@hschoi4448 @razorback3 Please look at the comments in #8944 and #8945 (comment) regarding this issue.

@umadevimcw
Contributor

We need to update the test files with the supported input range and re-test after the migration; a sketch of what that might look like follows.
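
A minimal sketch of such an update, assuming the supported domain of asin is [-1, 1] (the test name and the input value 0.5 are illustrative; compare_results is assumed to accept lists of TT and golden tensors, as in the repo's backward-op test utilities):

import torch
import pytest
import tt_lib
from tests.tt_eager.python_api_testing.unit_testing.backward_ops.utility_funcs import data_gen_pt_tt, compare_results
from tests.tt_eager.python_api_testing.sweep_tests import pytorch_ops

@pytest.mark.parametrize("input_shapes", ((torch.Size([1, 1, 32, 32])),))
def test_asin_supported_range(input_shapes, device):
    # Stay inside asin's domain [-1, 1] so both golden and device outputs are finite.
    val = 0.5
    in_data, input_tensor = data_gen_pt_tt(input_shapes, device, True, val=val)

    golden_tensor = pytorch_ops.asin(in_data)
    tt_output_tensor_on_device = tt_lib.tensor.asin(input_tensor)

    # Assumed signature: compare_results(list_of_tt_tensors, list_of_golden_tensors)
    assert compare_results([tt_output_tensor_on_device], [golden_tensor])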

@umadevimcw
Contributor

@hschoi4448 A fix for this issue is available in PR #11243 (due to hardware limitations, NaN/Inf are replaced with finite numbers).
Kindly review it.
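
For context, a hypothetical illustration of normalizing the golden tensor before comparison when the device substitutes finite values for NaN/Inf (the sentinel constants below are assumptions, not the numbers the PR actually uses):

import torch

# Hypothetical sentinels standing in for whatever the device substitutes;
# the real replacement values are defined by PR #11243.
NAN_SENTINEL = 0.0
INF_SENTINEL = 3.3895e38  # close to bfloat16's largest finite value

golden = torch.asin(torch.full((1, 1, 32, 32), 90.0))  # all NaN
golden = torch.nan_to_num(golden, nan=NAN_SENTINEL,
                          posinf=INF_SENTINEL, neginf=-INF_SENTINEL)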

@VirdhatchaniKN
Contributor

Hi @hschoi4448,
PR #11243 has been merged. Can we close this issue?

@VirdhatchaniKN
Contributor

The test file passes, so I'm closing this issue.
