Changhong Fu*, Liangliang Yao, Haobo Zuo, Guangze Zheng, Jia Pan
- * Corresponding author.
- SAM-DA has been accepted by IEEE ICARM 2024.
- The paper “SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation” was awarded the Toshio Fukuda Best Paper Award in Mechatronics at ICARM 2024!
This code has been tested on Ubuntu 18.04, Python 3.8.3, PyTorch 1.13.1, and CUDA 11.6. Please install the related libraries before running this code:
- Install Segment Anything:

```bash
bash install.sh
```

- Install SAM-DA-Track:

```bash
pip install -r requirements.txt
```
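Before going further, it may be worth confirming the environment matches the tested setup above. A quick check, assuming PyTorch is already installed:

```python
# check_env.py -- confirm the tested Python/PyTorch/CUDA setup is in place.
import sys
import torch

print("Python :", sys.version.split()[0])     # tested with 3.8.3
print("PyTorch:", torch.__version__)          # tested with 1.13.1
print("CUDA   :", torch.version.cuda)         # tested with 11.6
print("GPU ok :", torch.cuda.is_available())  # testing and training need a GPU
```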
- Download a model checkpoint below and put it in `./tracker/BAN/snapshot` (a loading sanity check is sketched after these steps).

| Training data | Model | Source 1 | Source 2 | Source 3 |
| --- | --- | --- | --- | --- |
| SAM-NAT-B (base, default) | sam-da-track-b | Baidu | Google | Hugging Face |
| SAM-NAT-S (small) | sam-da-track-s | Baidu | Google | Hugging Face |
| SAM-NAT-T (tiny) | sam-da-track-t | Baidu | Google | Hugging Face |
| SAM-NAT-N (nano) | sam-da-track-n | Baidu | Google | Hugging Face |
- Download the NUT-L dataset and put it in `./tracker/BAN/test_dataset`.
- Test and evaluate on NUT-L with default settings:

```bash
cd tracker/BAN
python tools/test.py
python tools/eval.py
```
- (optional) Test with other checkpoints (e.g., `sam-da-track-s`):

```bash
cd tracker/BAN
python tools/test.py --snapshot sam-da-track-s
python tools/eval.py
```
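To sanity-check a downloaded snapshot before tracking, the following minimal sketch can help. The file name `sam-da-track-b.pth` and the checkpoint's internal layout are assumptions, so adjust them to the actual release:

```python
# snapshot_check.py -- verify a downloaded checkpoint loads on this machine.
import torch

# File name is an assumption; point this at the snapshot you downloaded.
ckpt = torch.load("tracker/BAN/snapshot/sam-da-track-b.pth",
                  map_location="cpu")  # CPU load works without a GPU
print(type(ckpt))                      # typically a dict
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:5])       # peek at the first few entries
```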
- SAM-powered target domain training sample swelling on NAT2021-train:
  - Download the original nighttime dataset NAT2021-train and put it in `./tracker/BAN/train_dataset/sam_nat`.
  - Run the SAM-powered target domain training sample swelling (a rough sketch of the idea follows this list):

    ```bash
    bash swell.sh
    ```

    ⚠️ Warning: a large amount of disk space is necessary for saving the swelled data. Training jsons are here: Baidu.
- Prepare the daytime datasets [VID] and [GOT-10K].
- Train `sam-da-track-b` (default) or other models:

```bash
cd tracker/BAN
python tools/train.py --model sam-da-track-b
```
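For intuition, the swelling step enlarges the target-domain training set by letting SAM discover objects in raw nighttime frames. Below is a minimal sketch built on the public Segment Anything API, not the project's actual `swell.sh` pipeline; the SAM variant, checkpoint path, and file names are assumptions:

```python
# swell_sketch.py -- illustrate SAM-powered training sample swelling:
# one raw nighttime frame is segmented into many objects, and each
# object box becomes an extra target-domain training sample.
import cv2
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

# SAM variant and weights are assumptions; use the SAM installed by install.sh.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
mask_generator = SamAutomaticMaskGenerator(sam)

frame = cv2.imread("night_frame.jpg")            # one NAT2021-train image
frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # SAM expects RGB arrays

masks = mask_generator.generate(frame)           # one dict per found object
# Every record carries a "bbox" in XYWH order; each box can act as a
# pseudo-annotated target for target-domain training.
samples = [m["bbox"] for m in masks]
print(f"1 raw frame swelled into {len(samples)} training samples")
```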
SAM-DA aims at fewer-better training: stronger nighttime tracking learned from fewer raw images, for quick deployment of nighttime tracking methods on UAVs.
- SAM-DA enriches the training samples and attributes (e.g., ambient intensity) of the target domain.
- SAM-DA achieves better performance from fewer raw images with quicker training.

| Method | Training data | Images | Proportion | Training time* | AUC (NUT-L) |
| --- | --- | --- | --- | --- | --- |
| Baseline | NAT2021-train | 276k | 100% | 12 h | 0.377 |
| SAM-DA | SAM-NAT-N | 28k | 10% | 2.4 h | 0.411 |
| SAM-DA | SAM-NAT-T | 92k | 33% | 4 h | 0.414 |
| SAM-DA | SAM-NAT-S | 138k | 50% | 6 h | 0.419 |
| SAM-DA | SAM-NAT-B | 276k | 100% | 12 h | 0.430 |

\* Training duration on a single A100 GPU. For more details, please refer to the paper.
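To make the fewer-better tradeoff concrete, here is a tiny script that recomputes the headline ratios straight from the table above (the numbers are the table's; the script is only illustration):

```python
# efficiency.py -- relative gains of SAM-DA (SAM-NAT-N) over the baseline,
# taken directly from the results table above.
baseline = {"images": 276_000, "hours": 12.0, "auc": 0.377}
sam_nat_n = {"images": 28_000, "hours": 2.4, "auc": 0.411}

print(f"images used      : {sam_nat_n['images'] / baseline['images']:.0%}")  # ~10%
print(f"training speedup : {baseline['hours'] / sam_nat_n['hours']:.1f}x")   # 5.0x
print(f"AUC change       : {sam_nat_n['auc'] - baseline['auc']:+.3f}")       # +0.034
```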
The model is licensed under the Apache License 2.0.
Please consider citing the related paper(s) in your publications if they help your research.
@Inproceedings{Yao2023SAMDA,
title={{SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation}},
author={Fu, Changhong and Yao, Liangliang and Zuo, Haobo and Zheng, Guangze and Pan, Jia},
booktitle={Proceedings of the IEEE International Conference on Advanced Robotics and Mechatronics (ICARM)},
year={2024},
pages={1-8}
}
@article{kirillov2023segment,
title={{Segment Anything}},
author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C and Lo, Wan-Yen and others},
journal={arXiv preprint arXiv:2304.02643},
year={2023}
pages={1-30}
}
@Inproceedings{Ye2022CVPR,
title={{Unsupervised Domain Adaptation for Nighttime Aerial Tracking}},
author={Ye, Junjie and Fu, Changhong and Zheng, Guangze and Paudel, Danda Pani and Chen, Guang},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2022},
pages={1-10}
}
We sincerely thank the contributions of the following repos: SAM, SiamBAN, and UDAT.
If you have any questions, please contact Liangliang Yao at [email protected] or Changhong Fu at [email protected].