
[RFC] Support inference with TVM #28

Closed
zhiqwang opened this issue Dec 22, 2020 · 3 comments · Fixed by #50
Labels
deployment Inference acceleration for production enhancement New feature or request help wanted Extra attention is needed

Comments

@zhiqwang (Owner) commented Dec 22, 2020

🚀 Feature

torchvision's Faster R-CNN and Mask R-CNN can now be deployed with TVM's Relay VM. Since Faster R-CNN and yolov5rt share the same operators, yolov5rt should be deployable with only minor changes.

Additional context

https://discuss.tvm.apache.org/t/error-from-compiling-and-running-retinanet-from-torchvision/8678

@zhiqwang zhiqwang added enhancement New feature or request help wanted Extra attention is needed labels Dec 22, 2020
@zhiqwang (Owner, Author) commented:

With recent updates to TVM and a refactor of the backbone (the refactor may not be strictly necessary), the model can now be compiled and run successfully. Wrap the model as follows:

```python
import torch
from torch import nn

def dict_to_tuple(out_dict):
    # Flatten the prediction dict into a tuple of tensors so it can be traced;
    # keys assumed to follow the torchvision detection convention
    return out_dict["boxes"], out_dict["scores"], out_dict["labels"]

class TraceWrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, inp):
        out = self.model(inp)
        return dict_to_tuple(out[0])

model_func = torch.hub.load('zhiqwang/yolov5-rt-stack', 'yolov5s', pretrained=True)
model = TraceWrapper(model_func)
```

I'll add more detailed information soon.
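As a self-contained sketch of the tracing step (not from the thread itself): once wrapped, the model is run through `torch.jit.trace`, and the traced module is what TVM's PyTorch frontend consumes. The `ToyDetector` below is a hypothetical stand-in for `yolov5s` so the example runs without downloading weights.

```python
import torch
from torch import nn

def dict_to_tuple(out_dict):
    # Flatten the prediction dict into a tuple of tensors for tracing
    return out_dict["boxes"], out_dict["scores"], out_dict["labels"]

class ToyDetector(nn.Module):
    # Hypothetical stand-in for yolov5s: emits one prediction dict per batch
    def forward(self, x):
        return [{
            "boxes": torch.zeros(3, 4),
            "scores": torch.zeros(3),
            "labels": torch.zeros(3, dtype=torch.int64),
        }]

class TraceWrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, inp):
        out = self.model(inp)
        return dict_to_tuple(out[0])

model = TraceWrapper(ToyDetector()).eval()
inp = torch.rand(1, 3, 320, 320)
with torch.no_grad():
    traced = torch.jit.trace(model, inp)
boxes, scores, labels = traced(inp)
```

With the real model in place of `ToyDetector`, the resulting `traced` module is the artifact you would hand to TVM's `relay.frontend.from_pytorch`.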

@zhiqwang zhiqwang added the deployment Inference acceleration for production label Feb 17, 2021
@Kyrie-Zhao

`aten::silu` not implemented in relay frontend

@zhiqwang (Owner, Author) commented Jul 12, 2022

> `aten::silu` not implemented in relay frontend

Hi @Kyrie-Zhao, please upgrade TVM to version 0.8 or above, or load the yolort model as follows:

```python
from yolort.models import yolov5s

# export_friendly=True loads a variant whose ops are supported
# by older converter frontends
model = yolov5s(export_friendly=True)
model = model.eval()
```
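For context on why an export-friendly variant helps (a sketch of the idea, not necessarily yolort's exact implementation): older Relay frontends lack a converter for `aten::silu`, but SiLU is just `x * sigmoid(x)`, so the activation can be decomposed into primitives the frontend already supports. The helper below (hypothetical, not part of yolort's API) swaps modules in place:

```python
import torch
from torch import nn

class ExportFriendlySiLU(nn.Module):
    # SiLU decomposed into ops that older converters already handle
    def forward(self, x):
        return x * torch.sigmoid(x)

def replace_silu(module: nn.Module) -> None:
    # Recursively swap every nn.SiLU for the decomposed form, in place
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, ExportFriendlySiLU())
        else:
            replace_silu(child)

# Sanity check: the decomposition is numerically identical to nn.SiLU
x = torch.randn(8)
assert torch.allclose(nn.SiLU()(x), ExportFriendlySiLU()(x))
```

Since the decomposed form is mathematically identical, the swap changes nothing about the model's outputs, only which operators appear in the traced graph.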
