Probabilistic Detectron2

This repository contains the official implementation of Estimating and Evaluating Regression Predictive Uncertainty in Deep Object Detectors.

This code extends the detectron2 framework to estimate bounding box covariance matrices, and is meant to be a starter kit for entering the domain of probabilistic object detection.

Disclaimer

This research code was produced by one person with a single set of eyes; it may contain bugs and errors that went unnoticed at the time of release.

Updates

| Date | Change |
| --- | --- |
| 30-September-2021 | Added pip frozen requirements (requirements_pip_freeze.txt). |
| 10-October-2021 | Added ability to perform inference on images without passing through specific dataset handlers. |

Requirements

Software Support:

| Name | Supported Versions |
| --- | --- |
| Ubuntu | 20.04 |
| Python | 3.8 |
| CUDA | 11.0+ |
| Cudnn | 8.0.1+ |
| PyTorch | 1.8+ |

To install the requirements, either create a Python virtual environment or build a Docker image using the provided Dockerfile.

# Clone repo
git clone https://github.com/asharakeh/probdet.git
cd probdet
git submodule update --init --recursive
  1. Virtual Environment Creation:
# Create python virtual env
mkvirtualenv probdet

# Add library path to virtual env
add2virtualenv src

# Install requirements
cat requirements.txt | xargs -n 1 -L 1 pip install
  2. Docker Image Creation:
# Clone repo
git clone https://github.com/asharakeh/probdet.git
cd probdet/Docker

# Build docker image
sh build.sh 
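
Regardless of which option you choose, a quick sanity check is to confirm that the core dependencies import and that CUDA is visible. This is a minimal sketch; it assumes PyTorch and detectron2 end up importable via the requirements and the src submodules:

# Verify the environment (run inside the virtualenv or the container)
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import detectron2; print(detectron2.__version__)"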

Datasets

COCO Dataset

Download the COCO Object Detection Dataset here. The COCO dataset folder should have the following structure:

 └── COCO_DATASET_ROOT
     |
     ├── annotations
     ├── train2017
     └── val2017
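
If you do not already have COCO 2017 locally, one possible way to download and arrange it is sketched below. The commands assume the standard COCO 2017 download links, and COCO_DATASET_ROOT should be replaced with your own path:

# Download and extract COCO 2017 (hypothetical example; adjust the target path)
mkdir -p COCO_DATASET_ROOT && cd COCO_DATASET_ROOT
wget http://images.cocodataset.org/zips/train2017.zip
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
unzip train2017.zip && unzip val2017.zip && unzip annotations_trainval2017.zip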

To create the corrupted datasets using Imagenet-C corruptions, run the following code:

python src/core/datasets/generate_coco_corrupted_dataset.py --dataset-dir=COCO_DATASET_ROOT

OpenImages Datasets

Download our OpenImages validation splits here. We created a tarball that contains both shifted and out-of-distribution data splits used in our paper to make our repo easier to use. Do not modify or rename the internal folders as those paths are hard coded in the dataset reader. We will refer to the root folder extracted from the tarball as OPENIM_DATASET_ROOT.
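
Extraction could look like the following; the tarball name is a placeholder for whatever the download link provides:

# Extract the OpenImages splits (tarball name is a placeholder)
mkdir -p OPENIM_DATASET_ROOT
tar -xf openimages_splits.tar.gz -C OPENIM_DATASET_ROOT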

Training

To train the models in the paper, use this command:

python src/train_net.py \
--num-gpus xx \
--dataset-dir COCO_DATASET_ROOT \
--config-file COCO-Detection/architecture_name/config_name.yaml \
--random-seed xx \
--resume

For an explanation of all command line arguments, use python src/train_net.py -h
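
As a concrete illustration, a single training run of the RetinaNet NLL model from the tables below might look like the following. The GPU count, random seed, and the architecture_name directory are placeholders; substitute values that match your setup and config layout:

# Hypothetical example: train RetinaNet with the NLL regression loss on 2 GPUs with seed 0
python src/train_net.py \
    --num-gpus 2 \
    --dataset-dir COCO_DATASET_ROOT \
    --config-file COCO-Detection/architecture_name/retinanet_R_50_FPN_3x_reg_var_nll.yaml \
    --random-seed 0 \
    --resume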

Evaluation

To run model inference after training, use this command:

python src/apply_net.py \
--dataset-dir TEST_DATASET_ROOT \
--test-dataset test_dataset_name \
--config-file path/to/config.yaml \
--inference-config /path/to/inference/config.yaml \
--random-seed xx \
--image-corruption-level xx

For an explanation of all command line arguments, use python src/apply_net.py -h

--image-corruption-level can vary between 0 and 5, with 0 being the original COCO dataset with no corruption. It has no effect when used with the OpenImages dataset splits.

--test-dataset can be one of coco_2017_custom_val, openimages_val, or openimages_ood_val. --dataset-dir corresponds to the root directory of the dataset used. The evaluation code runs inference on the test dataset and then generates mAP, Negative Log Likelihood, Brier Score, Energy Score, and Calibration Error results. If only evaluation of the metrics is required, add --eval-only to the above command.
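
For example, evaluating a trained model on the uncorrupted COCO validation split and then on the out-of-distribution OpenImages split could look like the following; the config paths and seed are placeholders:

# Hypothetical example: in-distribution evaluation on COCO (no corruption)
python src/apply_net.py \
    --dataset-dir COCO_DATASET_ROOT \
    --test-dataset coco_2017_custom_val \
    --config-file /path/to/config.yaml \
    --inference-config /path/to/inference/config.yaml \
    --random-seed 0 \
    --image-corruption-level 0

# Hypothetical example: out-of-distribution evaluation on OpenImages
python src/apply_net.py \
    --dataset-dir OPENIM_DATASET_ROOT \
    --test-dataset openimages_ood_val \
    --config-file /path/to/config.yaml \
    --inference-config /path/to/inference/config.yaml \
    --random-seed 0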

Inference on new images

We provide a script to perform inference on new images without passing through dataset handlers.

python single_image_inference.py \
--image-dir /path/to/image/dir \
--output-dir /path/to/output/dir \
--config-file /path/to/config/file \
--inference-config /path/to/inference/config \
--model-ckpt /path/to/model.pth

--image-dir is a folder containing all images to be used for inference. --output-dir is a folder to which the output json file containing probabilistic detections will be written. --model-ckpt is the path to the model checkpoint to be used for inference. See the tables below for model checkpoint download links.
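
Putting it together with a downloaded checkpoint, an invocation might look like the following; all paths are placeholders, and the RetinaNet NLL files are just one choice from the tables below:

# Hypothetical example: probabilistic detections for a folder of images
python single_image_inference.py \
    --image-dir /data/my_images \
    --output-dir /data/my_images_output \
    --config-file /path/to/retinanet_R_50_FPN_3x_reg_var_nll.yaml \
    --inference-config /path/to/standard_nms.yaml \
    --model-ckpt /path/to/retinanet_R_50_FPN_3x_reg_var_nll.pth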

Configurations in the paper

We provide a list of config combinations that generate the architectures used in our paper:

| Method Name | Config File | Inference Config File | Model |
| --- | --- | --- | --- |
| Deterministic RetinaNet | retinanet_R_50_FPN_3x.yaml | standard_nms.yaml | retinanet_R_50_FPN_3x.pth |
| RetinaNet NLL | retinanet_R_50_FPN_3x_reg_var_nll.yaml | standard_nms.yaml | retinanet_R_50_FPN_3x_reg_var_nll.pth |
| RetinaNet DMM | retinanet_R_50_FPN_3x_reg_var_dmm.yaml | standard_nms.yaml | retinanet_R_50_FPN_3x_reg_var_dmm.pth |
| RetinaNet ES | retinanet_R_50_FPN_3x_reg_var_es.yaml | standard_nms.yaml | retinanet_R_50_FPN_3x_reg_var_es.pth |
| --- | --- | --- | --- |
| Deterministic FasterRCNN | faster_rcnn_R_50_FPN_3x.yaml | standard_nms.yaml | faster_rcnn_R_50_FPN_3x.pth |
| FasterRCNN NLL | faster_rcnn_R_50_FPN_3x_reg_covar_nll.yaml | standard_nms.yaml | faster_rcnn_R_50_FPN_3x_reg_covar_nll.pth |
| FasterRCNN DMM | faster_rcnn_R_50_FPN_3x_reg_var_dmm.yaml | standard_nms.yaml | faster_rcnn_R_50_FPN_3x_reg_var_dmm.pth |
| FasterRCNN ES | faster_rcnn_R_50_FPN_3x_reg_var_es.yaml | standard_nms.yaml | faster_rcnn_R_50_FPN_3x_reg_var_es.pth |
| --- | --- | --- | --- |
| Deterministic DETR | detr_R_50.yaml | standard_nms.yaml | detr_R_50.pth |
| DETR NLL | detr_R_50_reg_var_nll.yaml | standard_nms.yaml | detr_R_50_reg_var_nll.pth |
| DETR DMM | detr_R_50_reg_var_dmm.yaml | standard_nms.yaml | detr_R_50_reg_var_dmm.pth |
| DETR ES | detr_R_50_reg_var_es.yaml | standard_nms.yaml | detr_R_50_reg_var_es.pth |

Experiments in the paper were performed on 5 models trained and evaluated using random seeds [0, 1000, 2000, 3000, 4000]. The variance in performance between different seeds was found to be negligible, and the results of the top-performing seed were reported.
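
The ensemble configurations in the next section expect five such models trained with different random seeds. A minimal sketch of that training loop is shown below; the GPU count, the architecture_name directory, and the choice of the NLL RetinaNet config are placeholders:

# Hypothetical sketch: train five ensemble members, one per random seed
for SEED in 0 1000 2000 3000 4000; do
    python src/train_net.py \
        --num-gpus 2 \
        --dataset-dir COCO_DATASET_ROOT \
        --config-file COCO-Detection/architecture_name/retinanet_R_50_FPN_3x_reg_var_nll.yaml \
        --random-seed $SEED \
        --resume
done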

Additional Configurations

The repo supports many more variants, including dropout and ensemble methods for estimating epistemic uncertainty. We provide a list of config combinations that generate these additional architectures:

| Method Name | Config File | Inference Config File |
| --- | --- | --- |
| RetinaNet Classification Loss Attenuation | retinanet_R_50_FPN_3x_cls_la.yaml | standard_nms.yaml |
| RetinaNet Dropout Post-NMS Uncertainty Computation | retinanet_R_50_FPN_3x_dropout.yaml | mc_dropout_ensembles_post_nms_mixture_of_gaussians.yaml |
| RetinaNet Dropout Pre-NMS Uncertainty Computation | retinanet_R_50_FPN_3x_dropout.yaml | mc_dropout_ensembles_pre_nms.yaml |
| RetinaNet BayesOD with NLL loss | retinanet_R_50_FPN_3x_reg_var_nll.yaml | bayes_od.yaml |
| RetinaNet BayesOD with ES loss | retinanet_R_50_FPN_3x_reg_var_es.yaml | bayes_od.yaml |
| RetinaNet BayesOD with ES loss and Dropout | retinanet_R_50_FPN_3x_reg_var_es_dropout.yaml | bayes_od_mc_dropout.yaml |
| RetinaNet Ensembles Post-NMS Uncertainty Estimation with NLL loss | retinanet_R_50_FPN_3x_reg_var_nll.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |
| RetinaNet Ensembles Pre-NMS Uncertainty Estimation with NLL loss | retinanet_R_50_FPN_3x_reg_var_nll.yaml (Need to train 5 Models with different random seeds) | ensembles_pre_nms.yaml |
| RetinaNet Ensembles Post-NMS Uncertainty Estimation with ES loss | retinanet_R_50_FPN_3x_reg_var_es.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |
| RetinaNet Ensembles Pre-NMS Uncertainty Estimation with ES loss | retinanet_R_50_FPN_3x_reg_var_es.yaml (Need to train 5 Models with different random seeds) | ensembles_pre_nms.yaml |
| --- | --- | --- |
| FasterRCNN Classification Loss Attenuation | faster_rcnn_R_50_FPN_3x_cls_la.yaml | standard_nms.yaml |
| FasterRCNN Dropout Post-NMS Uncertainty Computation | faster_rcnn_R_50_FPN_3x_dropout.yaml | mc_dropout_ensembles_post_nms_mixture_of_gaussians.yaml |
| FasterRCNN Dropout Pre-NMS Uncertainty Computation | faster_rcnn_R_50_FPN_3x_dropout.yaml | mc_dropout_ensembles_pre_nms.yaml |
| FasterRCNN BayesOD with NLL loss | faster_rcnn_R_50_FPN_3x_reg_var_nll.yaml | bayes_od.yaml |
| FasterRCNN BayesOD with ES loss | faster_rcnn_R_50_FPN_3x_reg_var_es.yaml | bayes_od.yaml |
| FasterRCNN BayesOD with ES loss and Dropout | retinanet_R_50_FPN_3x_reg_var_es_dropout.yaml | bayes_od_mc_dropout.yaml |
| FasterRCNN Ensembles Post-NMS Uncertainty Estimation with NLL loss | faster_rcnn_R_50_FPN_3x_reg_var_nll.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |
| FasterRCNN Ensembles Pre-NMS Uncertainty Estimation with NLL loss | faster_rcnn_R_50_FPN_3x_reg_var_nll.yaml (Need to train 5 Models with different random seeds) | ensembles_pre_nms.yaml |
| FasterRCNN Ensembles Post-NMS Uncertainty Estimation with ES loss | faster_rcnn_R_50_FPN_3x_reg_var_es.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |
| FasterRCNN Ensembles Pre-NMS Uncertainty Estimation with ES loss | faster_rcnn_R_50_FPN_3x_reg_var_es.yaml (Need to train 5 Models with different random seeds) | ensembles_pre_nms.yaml |
| --- | --- | --- |
| DETR Classification Loss Attenuation | detr_R_50_cls_la.yaml | standard_nms.yaml |
| DETR Dropout | detr_R_50.yaml (dropout is included in the original implementation of DETR) | mc_dropout_ensembles_post_nms_mixture_of_gaussians.yaml |
| DETR Ensembles with NLL loss | detr_R_50_reg_var_nll.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |
| DETR Ensembles with ES loss | detr_R_50_reg_var_es.yaml (Need to train 5 Models with different random seeds) | ensembles_post_nms_mixture_of_gaussians.yaml |

DETR has no NMS post-processing, and as such does not support the BayesOD NMS replacement. The repo also supports many additional lower-performing configurations. I will continue developing it and add additional configurations when time allows.
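
As an example of selecting one of these variants at inference time, running RetinaNet BayesOD with the ES loss amounts to pairing its training config with the bayes_od.yaml inference config when calling apply_net.py; the paths, seed, and corruption level below are placeholders:

# Hypothetical example: evaluate RetinaNet BayesOD (ES loss) on COCO
python src/apply_net.py \
    --dataset-dir COCO_DATASET_ROOT \
    --test-dataset coco_2017_custom_val \
    --config-file /path/to/retinanet_R_50_FPN_3x_reg_var_es.yaml \
    --inference-config /path/to/bayes_od.yaml \
    --random-seed 0 \
    --image-corruption-level 0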

Citation

If you use this code, please cite our paper:

@inproceedings{
harakeh2021estimating,
title={Estimating and Evaluating Regression Predictive Uncertainty in Deep Object Detectors},
author={Ali Harakeh and Steven L. Waslander},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=YLewtnvKgR7}
}

License

This code is released under the Apache 2.0 License.
