[feature request] Support batches (arbitrary number of batch dims) for box_iou / generalized_box_iou / box_area / box_convert / clip_boxes_to_image
#3478 · Open · vadimkantorov opened this issue on Mar 1, 2021 · 2 comments
Currently these ops only support (N, 4), (M, 4) -> (N, M). I propose also supporting (B, N, 4), (B, M, 4) -> (B, N, M). It would also be nice if they supported an arbitrary number of batch dimensions.
This is useful for computing a cost matrix between predicted boxes and ground-truth boxes for a batch of frames. It can probably be done by adjusting the tensor indexing, something like this:
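The snippet originally attached here is not shown; below is a minimal sketch of what a batched IoU could look like, assuming boxes in xyxy format and that the leading batch dimensions of the two inputs broadcast against each other. `batched_box_iou` and `box_area_` are hypothetical names for illustration, not existing torchvision functions.

```python
import torch

def box_area_(boxes: torch.Tensor) -> torch.Tensor:
    # boxes: (..., 4) in xyxy format -> areas of shape (...)
    return (boxes[..., 2] - boxes[..., 0]) * (boxes[..., 3] - boxes[..., 1])

def batched_box_iou(boxes1: torch.Tensor, boxes2: torch.Tensor) -> torch.Tensor:
    # boxes1: (..., N, 4), boxes2: (..., M, 4) -> IoU matrix of shape (..., N, M)
    area1 = box_area_(boxes1)  # (..., N)
    area2 = box_area_(boxes2)  # (..., M)

    # Broadcast box pairs against each other: (..., N, 1, 2) vs (..., 1, M, 2)
    lt = torch.max(boxes1[..., :, None, :2], boxes2[..., None, :, :2])  # top-left of intersection
    rb = torch.min(boxes1[..., :, None, 2:], boxes2[..., None, :, 2:])  # bottom-right of intersection

    wh = (rb - lt).clamp(min=0)      # (..., N, M, 2), zero where boxes do not overlap
    inter = wh[..., 0] * wh[..., 1]  # (..., N, M)
    union = area1[..., :, None] + area2[..., None, :] - inter
    return inter / union

# Usage: a batch of frames, each with N predicted and M ground-truth boxes
pred = torch.rand(8, 5, 4).sort(dim=-1).values   # (B, N, 4), xyxy after sorting
gt = torch.rand(8, 3, 4).sort(dim=-1).values     # (B, M, 4)
cost = batched_box_iou(pred, gt)                 # (B, N, M)
```

Because everything is expressed with `...` and broadcasting, the same function also handles the existing unbatched (N, 4), (M, 4) case and any number of leading batch dimensions.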
But given how frequently this is needed, I propose that this (and maybe some related box ops) be included in core torchvision.
vadimkantorov changed the title from "[feature request] Support batches for box_iou / generalized_box_iou" to "[feature request] Support batches for box_iou / generalized_box_iou / box_area / box_convert" on Nov 1, 2021
vadimkantorov changed the title to "[feature request] Support batches (arbitrary number of batch dims) for box_iou / generalized_box_iou / box_area / box_convert" on Nov 17, 2021
vadimkantorov changed the title to "[feature request] Support batches (arbitrary number of batch dims) for box_iou / generalized_box_iou / box_area / box_convert / clip_boxes_to_image" on Nov 22, 2021