
[Bugfix][Relay] Fix softplus about the wrong calculation formula in Relay PyTorch frontend #14821

Merged
merged 4 commits into from
May 12, 2023

Conversation

jikechao
Contributor

Because the threshold attribute of Softplus is not taken into account, the inference results in TVM differ from PyTorch's.
See the definition of Softplus in the PyTorch documentation.
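For reference, PyTorch's Softplus computes 1/beta * log(1 + exp(beta * x)), but for numerical stability it reverts to the identity (returns x unchanged) wherever beta * x exceeds threshold. A minimal NumPy sketch of this behavior (the function name softplus_ref is ours, not from TVM or PyTorch), using the same beta=1, threshold=2 as the repro below:

```python
import numpy as np

def softplus_ref(x, beta=1.0, threshold=20.0):
    # PyTorch semantics: 1/beta * log(1 + exp(beta * x)),
    # falling back to the identity where beta * x > threshold.
    x = np.asarray(x, dtype=np.float32)
    return np.where(beta * x > threshold,
                    x,
                    np.log1p(np.exp(beta * x)) / beta)

# With beta=1, threshold=2: x=1 stays on the log branch,
# x=4 crosses the threshold and is returned unchanged.
print(softplus_ref([[1.0, 4.0]], beta=1.0, threshold=2.0))
```

The buggy frontend conversion effectively ignored the threshold branch, which is why the mismatch only shows up for inputs where beta * x is above the threshold.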

Expected Behavior

TVM gives the same inference results as PyTorch.

Actual Behavior

[screenshot: mismatched TVM vs. PyTorch outputs]

Steps to reproduce

import torch
from tvm import relay
import tvm
import numpy as np

m = torch.nn.Softplus(beta=1, threshold=2)
input_data = torch.tensor([[1.0, 4.0]], dtype=torch.float32)

torch_outputs = m(input_data)

trace = torch.jit.trace(m, input_data)
input_shapes = [('input0', torch.Size([1, 2]))]

mod, params = relay.frontend.from_pytorch(trace, input_shapes)

with tvm.transform.PassContext(opt_level=3):
    exe = relay.create_executor('graph', mod=mod, params=params, device=tvm.device('llvm', 0), target='llvm').evaluate()
input_tvm = {'input0': np.array([[1.,  4.]], dtype='float32')}
tvm_outputs = exe(**input_tvm).asnumpy()

np.testing.assert_allclose(torch_outputs, tvm_outputs, rtol=1e-3, atol=1e-3)

cc @Hzfengsy @echuraev

@tvm-bot
Collaborator

tvm-bot commented May 10, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@github-actions github-actions bot requested review from Hzfengsy and echuraev May 10, 2023 18:15
@jikechao jikechao changed the title Fix softplus about the wrong calculation formula in Relay PyTorch frontend [Bugfix][Relay] Fix softplus about the wrong calculation formula in Relay PyTorch frontend May 11, 2023
Contributor

@echuraev echuraev left a comment


LGTM. Thanks

@echuraev
Contributor

@tvm-bot rerun

@jikechao
Contributor Author

The added test cases produce a weird flaky failure; I'll try to figure it out later.

@jikechao
Contributor Author

This problem has been solved. Waiting for CI testing.

@masahi masahi merged commit 483b87d into apache:main May 12, 2023
masahi pushed a commit to masahi/tvm that referenced this pull request May 13, 2023
…elay PyTorch frontend (apache#14821)

* fix softplus operator

* add test cases

* Update pytorch.py

* Update pytorch.py