
[Compatible] Fix batch_matmul from Relay #155

Merged
merged 3 commits into awslabs:main on Feb 23, 2023

Conversation

comaniac (Contributor)
Description

apache/tvm#13927 changes the logic of converting PyTorch batch_matmul to Relay. Specifically, it uses Relay's batch_matmul attribute transpose_b to avoid explicitly adding a transpose op. However, the current logic of converting Relay to RAF doesn't take this attribute into account, which results in incorrect IRs.

This PR thus fixes the Relay-to-RAF conversion logic to dispatch Relay batch_matmul to one of the batch_matmul_xy ops according to its transpose attributes.

Checklist

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

cc @awslabs/raf-reviewer

@comaniac comaniac merged commit 51ec502 into awslabs:main Feb 23, 2023
@comaniac comaniac deleted the fix_batch_matmul branch February 23, 2023 21:01
comaniac added a commit to comaniac/raf that referenced this pull request Feb 24, 2023
comaniac added a commit that referenced this pull request Feb 24, 2023
* Revert "[Compatible] Fix batch_matmul from Relay (#155)"

This reverts commit 51ec502.

* Revert "[TVM] Update Submodule (#154)"

This reverts commit fc48da9.

* Revert "[TVM] Update Submodule (#153)"

This reverts commit b33f2ac.