Fix naming of statistics in MeanAveragePrecision with custom max det thresholds #2367
Conversation
Codecov Report
Additional details and impacted files:

```diff
@@            Coverage Diff            @@
##           master   #2367      +/-  ##
=========================================
- Coverage      69%     36%     -33%
=========================================
  Files         305     305
  Lines       17164   17164
=========================================
- Hits        11836    6208    -5628
- Misses       5328   10956    +5628
```
```
    detection per image
- mar_{mdt[1]}: (:class:`~torch.Tensor`), mean average recall for `max_detection_thresholds[1]` (default 10)
    detection per image
- mar_{mdt[1]}: (:class:`~torch.Tensor`), mean average recall for `max_detection_thresholds[2]` (default 100)
```
Is this a typo here? Should

    mar_{mdt[1]}: (:class:`~torch.Tensor`), mean average recall for `max_detection_thresholds[2]` (default 100)

maybe be

    mar_{mdt[2]}: (:class:`~torch.Tensor`), mean average recall for `max_detection_thresholds[2]` (default 100)

instead? I realize this is merged already, but maybe this can be fixed boyscout-style in some future PR.
What does this PR do?
Fixes #2360
Two fixes in this PR:
Beforehand we postulated that `faster-coco-eval` supported more than three detection thresholds, i.e. `len(max_det_thresholds) > 3`. However, looking at the code again, this is not true. Only `len(max_det_thresholds) == 3` is actually supported, which we need to guard against. Here is the relevant code in both backends (note how `self.params.maxDets[0]`, `self.params.maxDets[1]` and `self.params.maxDets[2]` are hardcoded in both):

https://github.com/MiXaiLL76/faster_coco_eval/blob/3475e078ec65e21f2ecf320b8493546357040a34/faster_coco_eval/core/cocoeval.py#L434-L442
https://github.com/cocodataset/cocoapi/blob/8c9bcc3cf640524c4c20a9c40e89cb6a2f2fa0e9/PythonAPI/pycocotools/cocoeval.py#L466-L468

A minimal sketch of such a guard follows below.
Even when the user provides a custom list with length 3 like `[1, 100, 1000]`, the returned scores still said `mar_1`, `mar_10` and `mar_100` (not `mar_1000`). This is just a simple renaming of the returned statistics, as sketched after this paragraph.
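A hypothetical sketch of the renaming: derive the result-key names from the user-provided thresholds instead of hardcoding the COCO defaults (1, 10, 100).

```python
max_detection_thresholds = [1, 100, 1000]

# Before the fix: keys were always "mar_1", "mar_10", "mar_100".
# After the fix: keys follow the actual thresholds.
result_keys = [f"mar_{mdt}" for mdt in sorted(max_detection_thresholds)]
print(result_keys)  # ['mar_1', 'mar_100', 'mar_1000']
```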
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃
📚 Documentation preview 📚: https://torchmetrics--2367.org.readthedocs.build/en/2367/