Authors: Qiang Zhai, Xin Li, Fan Yang, Chenglizhao Chen, Hong Cheng, Deng-Ping Fan.
-
Configuring your environment (Prerequisites): The training and testing experiments are conducted using PyTorch.
Note that MGLNet has only been tested on Ubuntu with the environment below. It may work on other operating systems as well, but we do not guarantee that it will.
- Creating a virtual environment in terminal:
conda env create -f env.yaml
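Once the environment is created and activated (conda activate followed by the environment name defined in env.yaml), an optional sanity check that PyTorch imports and sees the GPU can save debugging time later. This is only a sketch, not part of the official setup:
# Optional sanity check: verify PyTorch and CUDA inside the new environment.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))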
- Downloading Testing Sets:
- Download the NEW testing dataset (COD10K-test + CAMO-test + CHAMELEON), which can be found at this Google Drive link or Baidu Pan link with the fetch code: z83z.
-
Testing Configuration:
- After you download all the trained models (Google Drive link or Baidu Pan link with the fetch code: ry8h), move them into './model_file/', and download the testing data.
- Assign your customized paths in 'config/cod_mgl50.yaml', such as 'data_root' and 'test_list'.
- Ensure consistency between 'stage' and 'model_path': set 'stage: 1' and 'model_path: pre-trained/mgl_s.pth' to evaluate the S-MGL model, or 'stage: 2' and 'model_path: pre-trained/mgl_r.pth' to evaluate the R-MGL model (see the sketch after this list).
- Run 'test.py' to generate the final prediction maps; the predicted camouflaged object regions and camouflaged object edges are saved into 'exp/result' by default.
- You can also download the results from the Google Drive link or Baidu Pan link with the fetch code: b1gr.
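A minimal pre-flight sketch for the testing configuration, assuming 'config/cod_mgl50.yaml' is a flat YAML file readable with PyYAML. The key names come from the options listed above; adjust them if your config nests them differently. This check is not part of the official pipeline:
import os
import yaml  # PyYAML

with open('config/cod_mgl50.yaml') as f:
    cfg = yaml.safe_load(f)

# Keys follow the option names mentioned in this README:
# 'stage', 'model_path', 'data_root', 'test_list'.
expected = {1: 'mgl_s.pth', 2: 'mgl_r.pth'}
stage = cfg.get('stage')
model_path = cfg.get('model_path', '') or ''

assert stage in expected, "stage should be 1 (S-MGL) or 2 (R-MGL)"
assert expected[stage] in model_path, (
    "stage %d expects the checkpoint %s, got %s" % (stage, expected[stage], model_path))

for key in ('data_root', 'test_list'):
    path = cfg.get(key)
    print("%s: %s -> %s" % (key, path, "OK" if path and os.path.exists(path) else "MISSING"))

print("Config looks consistent; now run test.py")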
-
Other Dataset:
- For the NC4K dataset: you can find the results at the Google Drive link or Baidu Pan link with the fetch code: 8ntb.
-
Evaluating your trained model:
- One-key evaluation is written in MATLAB code (revised from link); please follow the instructions in main.m and just run it to generate the evaluation results in './EvaluationTool/EvaluationResults/Result-CamObjDet/'.
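If you only want a rough sanity check before running the full MATLAB toolbox, a mean absolute error (MAE) over the saved prediction maps can be computed in Python. This is just an illustrative sketch (the ground-truth folder and file matching are assumptions) and is not a substitute for the official evaluation code:
import os
import numpy as np
from PIL import Image

pred_dir = 'exp/result'          # where test.py saves predictions (per this README)
gt_dir = 'path/to/ground_truth'  # hypothetical: point this at your GT masks

maes = []
for name in sorted(os.listdir(pred_dir)):
    gt_path = os.path.join(gt_dir, name)
    if not os.path.isfile(gt_path):
        continue
    pred_img = Image.open(os.path.join(pred_dir, name)).convert('L')
    gt_img = Image.open(gt_path).convert('L').resize(pred_img.size)
    pred = np.asarray(pred_img, dtype=np.float32) / 255.0
    gt = np.asarray(gt_img, dtype=np.float32) / 255.0
    maes.append(np.abs(pred - gt).mean())

print("MAE over %d images: %.4f" % (len(maes), float(np.mean(maes))))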
-
Training Configuration:
- After you download the initial model (Google Drive link or Baidu Pan link), move it to './pre_trained/'.
- Put 'train_test_file/train.lst' at the path referenced in cod_mgl50.yaml (a path check is sketched after this list).
- Run train.py
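An optional pre-training check, again only a sketch: it assumes the same flat YAML layout as above and that the training list path lives under a key such as 'train_list' (an assumed name; 'data_root' is the key mentioned earlier in this README):
import os
import yaml  # PyYAML

with open('config/cod_mgl50.yaml') as f:
    cfg = yaml.safe_load(f)

# 'train_list' is an assumed key name for the path that should point at
# train_test_file/train.lst; rename it to match your config if needed.
for key in ('data_root', 'train_list'):
    path = cfg.get(key)
    print("%s: %s -> %s" % (key, path, "OK" if path and os.path.exists(path) else "MISSING"))

print("initial model dir './pre_trained/' exists:", os.path.isdir('./pre_trained'))
# If everything is in place, start training with: python train.py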
-
If you think this work is helpful, please cite
@inproceedings{zhai2021Mutual,
title={Mutual Graph Learning for Camouflaged Object Detection},
author={Zhai, Qiang and Li, Xin and Yang, Fan and Chen, Chenglizhao and Cheng, Hong and Fan, Deng-Ping},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={},
year={2021}
}