
prioritize batching for torchvision::nms #8593

Open
RishiMalhotra920 opened this issue Aug 14, 2024 · 4 comments

Comments

@RishiMalhotra920

🚀 The feature

Implement batched nms!

Motivation, pitch

My motivation to create this issue was this warning:

[W BatchedFallback.cpp:84] Warning: There is a performance drop because we have not yet implemented the batching rule for torchvision::nms. Please file us an issue on GitHub so that we can prioritize its implementation. (function warnFallback)

Alternatives

No response

Additional context

No response

@RishiMalhotra920 RishiMalhotra920 changed the title prioritize batched nms prioritize batching for torchvision::nms Aug 14, 2024
@NicolasHug
Member

Hi @RishiMalhotra920 , can you please share a minimal reproducing example of the code you tried to run?
I suspect the warning you're observing comes from torch core rather than from torchvision directly.

Note that we do have a batched version of nms here: https://pytorch.org/vision/main/generated/torchvision.ops.batched_nms.html#torchvision.ops.batched_nms

@RishiMalhotra920
Author

Ah, unfortunately I can't find the place where I saw the warning again, but I think I was doing something with

torch.vmap and torchvision.ops.box_iou

Additionally, I saw batched_nms earlier, and it threw me off initially since the docs mention that it does not apply NMS between objects of different categories. However, to make it work for multiple images in a batch, I would just assign each image in the batch a different category index, and it would work as expected. Thanks for pointing me to this!

Once you have noted the suggestion, feel free to close this if you don't have any other questions.

@NicolasHug
Member

Thanks for the reply @RishiMalhotra920

However, to make this work for multiple images in a batch, i would just assign each image in the batch with a different category and it would work as expected

Can you share a reproducible example of this? If all examples in a batch have a different label, then batched_nms should basically be a no-op.

@another-sasha

Thanks to the Team for developing PyTorch!

It would be much more useful to have a true batch NMS rather than only the multiclass NMS available now. It's extremely helpful for those who work with Triton and Jetson and need to torch.jit.trace or torch.jit.script their models to use them with the Triton backend. The Triton Python backend doesn't support GPU, so compiling Torch models is the only way to run them on GPU, and that doesn't seem likely to change because of the poor connection between Python and the Jetson GPU (see triton-inference-server/server#4772 (comment)).

Please, it would be remarkable to have an NMS that returns the bboxes with a batch dimension, padded with fake bboxes to keep the tensor shape fixed.
