This repository contains the code and results for the paper *Efficient Object Detection in Autonomous Driving using Spiking Neural Networks: Performance, Energy Consumption Analysis, and Insights into Open-set Object Discovery*. The paper is currently under review for ITSC 2023; a preprint is available on arXiv (https://arxiv.org/abs/2312.07466).
Requirements used:
- norse 0.0.7
- torch 1.12.1 + cu113
- torchaudio 0.12.1
- torchvision 0.13.1
- pycocotools
- PyYAML
- Common scientific Python stack (numpy, pandas, scipy, scikit-learn)
- Tensorboard
It is recommended to use a virtualenv; a conda environment also works.
- Install the desired GPU version of PyTorch (https://pytorch.org/get-started/previous-versions), checking compatibility with Norse (https://github.com/norse/norse)
- Install Norse. The recommended way is to clone the repo and then run `python setup.py install`. In case of problems with Norse, refer to Norse's installation troubleshooting.
- Install the other dependencies.
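Putting the steps above together, a typical environment setup might look like this (the `cu113` wheel index matches CUDA 11.3; adjust it to your GPU and to Norse's compatibility notes):

```shell
# create and activate a virtualenv (a conda env works the same way)
python -m venv snn-env
source snn-env/bin/activate

# PyTorch 1.12.1 built for CUDA 11.3
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 \
    --extra-index-url https://download.pytorch.org/whl/cu113

# Norse 0.0.7 from source
git clone https://github.com/norse/norse.git
cd norse
python setup.py install
cd ..

# remaining dependencies
pip install pycocotools PyYAML numpy pandas scipy scikit-learn tensorboard
```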
To download the dataset, visit the official website, register, and download the `leftImg8bit_trainvaltest.zip` file, which can be found in the downloads section. Extract its contents into the `data/cityscapes` folder.
The annotations are automatically downloaded from this repository to `data/annotations`.
Download the images from the official website by clicking the "100k Images" button.
The labels are downloaded automatically from the cloud, as they have been manually converted to COCO format.
Download the dataset from the official website (Dataset Name: IDD Detection) and extract the contents of the downloaded file to `data/idd/`.
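With all three datasets in place, the `data/` tree should contain the folders mentioned above. A small sanity check (the folder list reflects only the paths stated in this README; the inner structure of each dataset folder follows the official archives):

```python
from pathlib import Path

# top-level folders under data/ as described in this README
EXPECTED_DIRS = [
    "cityscapes",   # contents of leftImg8bit_trainvaltest.zip
    "annotations",  # COCO-format annotations (downloaded automatically)
    "idd",          # contents of the IDD Detection archive
]

def missing_dataset_dirs(data_root):
    """Return the expected dataset folders absent under data_root."""
    root = Path(data_root)
    return [d for d in EXPECTED_DIRS if not (root / d).is_dir()]
```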
First, download all models from the following link and place them in the desired folder. We recommend `outputs/<dataset name>/`.
Run `standard_metrics.sh`
Run `noise_metrics.sh`
Precision and recall metrics:
python test_and_energy_eff.py -d cityscapes -b 2 --rpn-snn --detector-snn --load-model outputs/cityscapes/model_Cityscapes_SNN_Trpn8_Tdet12.pth --test-only -o metrics
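The precision and recall computed here rely on IoU matching between detections and ground truth. As a reminder, the overlap criterion is (an illustrative helper, not code from this repo):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```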
Efficiency w.r.t. the NoSNN models. For this computation, the following modifications to the source code are needed. All changes are marked in the source code with the flag `### EXTRACT SPIKE RATES ###`, which can be followed by the word "activate" or "deactivate":
- If the flag says "activate", uncomment that block for the computations.
- If the flag says "deactivate", comment out that block for the computations.
- Change the forward function of `RPNHeadSNN` (in `rpn.py`) and `RoIHeadsSNN` (in `faster_rcnn.py`) to the one indicated by the flag (comment out the normal forward function).
- Edit both `RegionProposalNetwork` (in `rpn.py`) and `RoIHeadsSNN` (in `roi_heads.py`) to return the spike rates immediately after computing them, skipping the unnecessary transformation code. Again, look for the flag. In `rpn.py`, two blocks of the `RegionProposalNetwork` object have to be activated and one deactivated; in `roi_heads.py`, only one block has to be activated.
- In `GeneralizedRCNN`, simply uncomment the part indicated by the flag.
python test_and_energy_eff.py -d cityscapes -b 2 --rpn-snn --detector-snn --load-model outputs/cityscapes/model_Cityscapes_SNN_Trpn8_Tdet12.pth --test-only -o efficiency
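The spike rates obtained this way feed the energy comparison against the NoSNN baseline: an SNN synapse costs one accumulate (AC) per spike, while the ANN counterpart pays one multiply-accumulate (MAC) per connection. A minimal sketch of that model (the 45 nm per-operation constants and function names are illustrative assumptions, not this repo's code):

```python
# energy per operation in 45 nm CMOS (commonly cited figures, in joules)
E_MAC = 4.6e-12  # multiply-accumulate (ANN)
E_AC = 0.9e-12   # accumulate only (SNN)

def ann_energy(macs):
    """Energy of an ANN layer performing `macs` multiply-accumulates."""
    return macs * E_MAC

def snn_energy(macs, spike_rate, timesteps):
    """Energy of the SNN counterpart: only spiking synapses accumulate,
    repeated over the simulation timesteps."""
    return macs * spike_rate * timesteps * E_AC

def energy_ratio(macs, spike_rate, timesteps):
    """SNN/ANN energy ratio; below 1 means the SNN is cheaper."""
    return snn_energy(macs, spike_rate, timesteps) / ann_energy(macs)
```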
First it is necessary to extract the proposals and detections from the images. The number of images retrieved can be modified with the `-n-img` argument:
Cityscapes
python train.py -d cityscapes --batch-size 4 --rpn-snn --detector-snn --load-model outputs/cityscapes/model_Cityscapes_SNN_5cls.pth --only-known-cls -ext-prop-det test -n-img 500
BDD
python train.py -d bdd --batch-size 4 --rpn-snn --detector-snn --load-model outputs/bdd/model_BDD_SNN_5cls.pth --only-known-cls -ext-prop-det test -n-img 2000
To plot the images:
Cityscapes
python new_object_discovery.py -d cityscapes -f outputs/cityscapes/test_results_per_img_cityscapes.pt --score-thr 2 --nms-thr 0.25 -max 6 --save-images 50 --only-known-cls --iou-thr 0 --plot-all-in-one-img
BDD
python new_object_discovery.py -d bdd -f outputs/bdd/test_results_per_img_bdd.pt --score-thr 2 --nms-thr 0.25 -max 6 --save-images 50 --only-known-cls --iou-thr 0 --plot-all-in-one-img
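The `--score-thr`, `--nms-thr` and `-max` options above correspond to the usual filter-then-suppress post-processing. A minimal sketch of greedy NMS with a detection cap (illustrative, not the repo's implementation):

```python
def nms(boxes, scores, iou_thr=0.25, max_det=6):
    """Greedy non-maximum suppression over (x1, y1, x2, y2) boxes.

    Returns the indices of kept boxes, highest score first."""
    def iou(a, b):
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        union = ((a[2] - a[0]) * (a[3] - a[1])
                 + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union > 0 else 0.0

    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # keep a box only if it does not overlap a kept box too much
        if all(iou(boxes[i], boxes[j]) <= iou_thr for j in keep):
            keep.append(i)
        if len(keep) == max_det:
            break
    return keep
```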
Following the training protocol described in the paper, we first train the RPN, then the detector, and finally fine-tune everything together with the FPN. Example with Cityscapes:
Training RPN:
python train.py -d cityscapes -b 2 --epochs 25 --lr-decay-rate 0.5 --lr 0.0005 --lr-decay-milestones 10 15 20 --rpn-snn --detector-snn --freeze-fpn --freeze-detector
Training Detector:
python train.py -d cityscapes -b 2 --epochs 25 --lr-decay-rate 0.5 --lr 0.0005 --lr-decay-milestones 10 15 20 --rpn-snn --detector-snn --freeze-fpn --freeze-rpn --load-model outputs/cityscapes/path-to-the-model
Finetuning FPN:
python train.py -d cityscapes -b 2 --epochs 15 --lr-decay-rate 0.5 --lr 0.00005 --lr-decay-milestones 5 10 --rpn-snn --detector-snn --load-model outputs/cityscapes/path-to-the-model
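The `--lr`, `--lr-decay-rate` and `--lr-decay-milestones` flags describe a multi-step schedule: the learning rate is multiplied by the decay rate at each milestone epoch. A pure-Python sketch of that rule (equivalent in spirit to PyTorch's `MultiStepLR`):

```python
def lr_at_epoch(epoch, base_lr, decay_rate, milestones):
    """Learning rate after applying `decay_rate` at every passed milestone."""
    passed = sum(1 for m in milestones if epoch >= m)
    return base_lr * decay_rate ** passed

# RPN schedule from the example above: lr 5e-4 halved at epochs 10, 15, 20
```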
To train on only a subset of classes, add `--only-known-cls`; the classes marked as known in the config file of the dataset will be used.
Obtain the metrics of the trained model:
python train.py -d cityscapes -b 2 --rpn-snn --detector-snn --load-model outputs/cityscapes/path-to-the-model --test-only
To add noise during testing:
python train.py -d cityscapes -b 2 --rpn-snn --detector-snn --load-model outputs/cityscapes/path-to-the-model --test-only --add-noise
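Conceptually, `--add-noise` perturbs the test images with additive noise clamped back to the valid pixel range. A sketch with Gaussian noise over normalized pixel values (the noise type and sigma here are assumptions, not necessarily what the repo uses):

```python
import random

def add_gaussian_noise(pixels, sigma=0.1, seed=None):
    """Add zero-mean Gaussian noise to pixels in [0, 1], clamping the result."""
    rng = random.Random(seed)
    return [min(1.0, max(0.0, p + rng.gauss(0.0, sigma))) for p in pixels]
```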
To test on only a subset of classes, add `--only-known-cls`; the classes marked as known in the config file of the dataset will be used.