
[Torch] Upsampling op support and enable registering a user defined op conversion map #4961

Merged
merged 10 commits into apache:master on Mar 1, 2020

Conversation

masahi
Member

@masahi masahi commented Feb 28, 2020

This adds support for torch upsample op. It enables running image segmentation models from torchvision (see the test case).

I also added a simple mechanism for registering a custom op conversion map that can be used alongside the predefined _convert_map we already have. The motivation is that torch has too many ops to cover, and it also allows users to define custom operators. This makes implementing every op conversion in a single map inside the Relay frontend infeasible, and it would be very tedious to modify the TVM source each time a new conversion is needed. Torch also has many esoteric operators that don't justify being added to the "official" conversion map in the frontend.

For an example of how to use the custom conversion map, see the test case test_custom_conversion_map(), where I show how to convert torchvision's custom op torchvision::roi_align using Relay's vision.roi_align op.

Please review @jwfromm @alexwong @vinx13 @anijain2305 @icemelon9

@masahi masahi changed the title [Relay, Torch] Add upsampling op support and enable registering a user defined op conversion map [Torch] Upsampling op support and enable registering a user defined op conversion map Feb 28, 2020
@masahi
Member Author

masahi commented Feb 28, 2020

@tqchen @comaniac I get an error from sphinx that is not related to this PR; can you take a look? It seems to come from the task_sphinx_precheck.sh that you added earlier.
https://ci.tvm.ai/blue/organizations/jenkins/tvm/detail/PR-4961/1/pipeline/150

@comaniac
Contributor

I saw these errors before but have no idea how they were resolved. Could you rebase onto upstream master to see if the errors persist?

@masahi
Member Author

masahi commented Feb 28, 2020

It is already rebased onto the latest commit, and the error is still there.
https://github.com/masahi/tvm/commits/torch-more-operators

@masahi
Member Author

masahi commented Feb 28, 2020

@tqchen @comaniac other PRs are blocked by the same CI error.

@leandron
Contributor

leandron commented Feb 28, 2020

On a quick look, it seems task_sphinx_precheck.sh is behaving as expected; this is not an unexpected CI crash.

task_sphinx_precheck.sh found warnings during the Sphinx run and exits with status 1 (non-zero), which fails the Jenkins pipeline.

The warnings are not introduced by your change, so it seems we need a pre-patch fixing the docs in order to unblock the build for this recently introduced check.

The relevant lines are:
https://github.com/apache/incubator-tvm/blob/7ccb4363eefa76ae355ee263aa5527d43fb699fb/tests/scripts/task_sphinx_precheck.sh#L43-L46
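The failure mode described above can be sketched in a few lines of Python (the function name sphinx_precheck and the log text are illustrative, not the script's actual contents): scan the build log for warnings and return a non-zero exit status if any are found, which is what causes Jenkins to fail the pipeline.

```python
def sphinx_precheck(log_text: str) -> int:
    """Return a shell-style exit status: 1 if warnings were found, else 0."""
    warnings = [line for line in log_text.splitlines() if "WARNING" in line]
    for w in warnings:
        print(w)  # surface the offending lines, as the CI log does
    return 1 if warnings else 0

# A log containing a warning makes the check fail (non-zero status).
status = sphinx_precheck("building docs\nWARNING: undocumented module\ndone\n")
```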

@leandron
Contributor

Discussion about fixing docs is happening in #4908.

@comaniac
Contributor

Yes. It is due to the doc checker introduced yesterday. I'm looking into the problem now.

@jwfromm
Contributor

jwfromm commented Feb 28, 2020

Left minor comments but otherwise looks good. Having a custom convert map is a neat idea that I'd like to see added to other frontends.

@tqchen
Member

tqchen commented Feb 28, 2020

docs error should be fixed by #4967

@masahi masahi force-pushed the torch-more-operators branch from 07c63dd to d610392 Compare February 28, 2020 22:09
@masahi
Member Author

masahi commented Feb 28, 2020

@alexwong @jwfromm I removed the dependency on packaging and dropped support for PT 1.2 and older, per the discussion at https://discuss.tvm.ai/t/pytorch-frontend-no-module-named-packaging/5829.

@masahi
Member Author

masahi commented Feb 29, 2020

@vinx13 @anijain2305 @icemelon9 comments are addressed and tests passed; I think it is ready to merge. The custom convert map could be of interest to other frontends.

Contributor

@anijain2305 anijain2305 left a comment


LGTM, with a minor suggestion.

Review comment on tests/python/frontend/pytorch/test_forward.py (outdated, resolved)
@masahi masahi force-pushed the torch-more-operators branch from 4bd9cdb to e410221 Compare February 29, 2020 22:53
@masahi masahi merged commit 92a2427 into apache:master Mar 1, 2020
@masahi
Member Author

masahi commented Mar 1, 2020

Thanks @jwfromm @alexwong @anijain2305

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Apr 16, 2020
…p conversion map (apache#4961)

* add custom conversion map

* add roi align test using custom convert map

* refactor test

* add support for upsampling op and test on segmentation models

* remove redundant no_grad

* add upsampling test case

* make the default custom map None, instead of empty dict

* updated tests, remove packaging and drop PT 1.2 support

* add better support for aten::to and tests

* add a note on dilation in x86
zhiics pushed a commit to neo-ai/tvm that referenced this pull request Apr 17, 2020
…p conversion map (apache#4961)

(same commit message as above)
7 participants