Fix docstring in MAP (#1751)
SkafteNicki authored May 2, 2023
1 parent 67ddbbc commit 3c0d8a2
Showing 1 changed file with 6 additions and 3 deletions.
src/torchmetrics/detection/mean_ap.py (6 additions, 3 deletions)

@@ -150,9 +150,12 @@ def _segm_iou(det: List[Tuple[np.ndarray, np.ndarray]], gt: List[Tuple[np.ndarra
class MeanAveragePrecision(Metric):
    r"""Compute the `Mean-Average-Precision (mAP) and Mean-Average-Recall (mAR)`_ for object detection predictions.

    Predicted boxes and targets have to be in Pascal VOC format (xmin-top left, ymin-top left, xmax-bottom right,
    ymax-bottom right). The metric can compute the mAP and mAR values either per class or as a global average over
    all classes.

    .. math::
        \text{mAP} = \frac{1}{n} \sum_{i=1}^{n} AP_i

    where :math:`AP_i` is the average precision for class :math:`i` and :math:`n` is the number of classes. The
    average precision is defined as the area under the precision-recall curve. If argument `class_metrics` is set
    to ``True``, the metric will also return the mAP/mAR per class.

    As input to ``forward`` and ``update`` the metric accepts the following input:
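The formula in the docstring can be sketched in plain Python: compute each class's AP as the area under its precision-recall curve, then average the per-class APs. This is an illustrative sketch of the math only, not the torchmetrics implementation; the helper names below are hypothetical.

```python
def average_precision(recall, precision):
    """Area under the precision-recall curve (all-point interpolation).

    `recall` and `precision` are parallel lists of operating points,
    with recall sorted in increasing order.
    """
    # Add sentinel points at recall 0 and 1.
    r = [0.0] + list(recall) + [1.0]
    p = [0.0] + list(precision) + [0.0]
    # Make the precision envelope monotonically non-increasing
    # (standard step before integrating the PR curve).
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangular areas wherever recall increases.
    return sum((r[i + 1] - r[i]) * p[i + 1] for i in range(len(r) - 1))


def mean_average_precision(ap_per_class):
    """mAP = (1/n) * sum of per-class average precisions."""
    return sum(ap_per_class) / len(ap_per_class)
```

For example, a class with PR points (recall 0.5, precision 1.0) and (recall 1.0, precision 0.5) has AP 0.75, and `class_metrics=True` in the real metric corresponds to reporting each `AP_i` individually rather than only their mean.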
