
[Bug Report] ttnn.clip interface does not follow PyTorch #14099

Closed
Tracked by #13795
kevinwuTT opened this issue Oct 22, 2024 · 3 comments · Fixed by #14127
@kevinwuTT

Describe the bug
ttnn.clip recently changed so that the min and max arguments are keyword-only. However, the equivalent arguments of torch.clip, torch.clamp, and torch.nn.Hardtanh are positional.

https://pytorch.org/docs/stable/generated/torch.clip.html

torch.clip(input, min=None, max=None, *, out=None) → Tensor

https://pytorch.org/docs/main/generated/torch.clamp.html
https://pytorch.org/docs/stable/generated/torch.nn.Hardtanh.html#torch.nn.Hardtanh

This change was observed as of tags/v0.53.0-rc21 but may have been introduced earlier.

To Reproduce

import ttnn
import torch

tensor = torch.rand((4, 4), dtype=torch.bfloat16)

# torch.clip accepts min and max positionally
torch_clip = torch.clip(tensor, 0.5, 0.5)

with ttnn.manage_device(device_id=0) as device:
    ttnn_from_torch = ttnn.from_torch(tensor, layout=ttnn.TILE_LAYOUT, dtype=ttnn.bfloat16, device=device)
    # ttnn.clip raises a TypeError when min and max are passed positionally
    ttnn_clip = ttnn.clip(ttnn_from_torch, -0.5, 0.5)
    ttnn_to_torch = ttnn.to_torch(ttnn_clip)

Error message:

TypeError: __call__(): incompatible function arguments. The following argument types are supported:
    1. (self: ttnn._ttnn.operations.unary.clip_t, input_tensor: ttnn._ttnn.tensor.Tensor, *, min: Optional[float] = None, max: Optional[float] = None, memory_config: Optional[ttnn._ttnn.tensor.MemoryConfig] = None) -> ttnn._ttnn.tensor.Tensor

Expected behavior
If the min and max arguments were accepted positionally, as in torch.clip, the ttnn.clip call above would not raise an error.
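
As a workaround under the current interface, min and max can be passed as keywords; a minimal sketch based on the signature shown in the error message above (restoring the positional form is what #14127 addresses):

import ttnn
import torch

tensor = torch.rand((4, 4), dtype=torch.bfloat16)

with ttnn.manage_device(device_id=0) as device:
    ttnn_from_torch = ttnn.from_torch(tensor, layout=ttnn.TILE_LAYOUT, dtype=ttnn.bfloat16, device=device)
    # Keyword arguments match the current keyword-only signature, so no TypeError is raised
    ttnn_clip = ttnn.clip(ttnn_from_torch, min=-0.5, max=0.5)
    ttnn_to_torch = ttnn.to_torch(ttnn_clip)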

Please complete the following environment information:

  • OS: Ubuntu 20.04.6 LTS
  • tags/v0.53.0-rc21
@VirdhatchaniKN
Contributor

Hi @kevinwuTT,
#14127 should fix this issue. Can you check?

@kevinwuTT
Author

@VirdhatchaniKN
I can confirm it works with this branch. Thanks!

@VirdhatchaniKN
Contributor

Will close the issue after PR #14127 is merged.
