ALOFT: A Lightweight MLP-like Architecture with Dynamic Low-frequency Transform for Domain Generalization
- Python == 3.7.3
- torch >= 1.8
- CUDA == 10.1
- torchvision == 0.4.2
- timm == 0.4.12
Please download the PACS dataset from here. Make sure you use the official train/val/test split from the PACS paper. Taking `/data/DataSets/` as the save directory for example:

```
images -> /data/DataSets/PACS/kfold/art_painting/dog/pic_001.jpg, ...
splits -> /data/DataSets/PACS/pacs_label/art_painting_crossval_kfold.txt, ...
```
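As a quick sanity check before training, the expected layout can be verified with a short script. This is a minimal sketch assuming the example paths above; `pacs_paths` is a hypothetical helper, and `data_root` is whatever directory you chose:

```python
import os

def pacs_paths(data_root):
    """Build the expected PACS image and split-file paths under data_root."""
    image = os.path.join(data_root, "PACS", "kfold",
                         "art_painting", "dog", "pic_001.jpg")
    split = os.path.join(data_root, "PACS", "pacs_label",
                         "art_painting_crossval_kfold.txt")
    return image, split

image, split = pacs_paths("/data/DataSets/")
for path in (image, split):
    # Warn early if the dataset was extracted to the wrong place.
    if not os.path.exists(path):
        print(f"missing: {path}")
```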
Then set `"data_root"` to `"/data/DataSets/"` and `"data"` to `"PACS"` in `main_gfnet.py`.
You can also set `"data_root"` and `"data"` directly in `ALOFT-S.sh` / `ALOFT-E.sh` for training the model.
To evaluate the performance of the models, you can download the models trained on PACS as below:

| Methods | Acc (%) | Models |
| --- | --- | --- |
| Strong Baseline | 87.76 | download |
| ALOFT-S | 90.88 | download |
| ALOFT-E | 91.58 | download |
Please set `--eval 1` and set `--resume` to the saved path of the downloaded model, e.g., `/trained/model/path/photo/checkpoint.pth`. Then you can simply run:

```
python main_gfnet.py --target $domain --data 'PACS' --device $device --eval 1 --resume '/trained/model/path/photo/checkpoint.pth'
```
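To evaluate all four PACS target domains in turn, the command above can be composed in a small loop. This is a sketch under assumptions: the domain names are the standard PACS ones, the per-domain checkpoint paths are placeholders, and the commands are only printed here, not executed:

```python
import shlex

# The four target domains in PACS leave-one-domain-out evaluation.
DOMAINS = ["art_painting", "cartoon", "photo", "sketch"]

def eval_command(domain, checkpoint, device=0):
    """Compose the evaluation command for one target domain."""
    return (
        f"python main_gfnet.py --target {shlex.quote(domain)} "
        f"--data 'PACS' --device {device} --eval 1 "
        f"--resume {shlex.quote(checkpoint)}"
    )

for d in DOMAINS:
    # Assumes one checkpoint per held-out domain, as in the example path above.
    print(eval_command(d, f"/trained/model/path/{d}/checkpoint.pth"))
```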
First, download the GFNet-H-Ti model pretrained on ImageNet from here and save it to `/pretrained_model`. To run ALOFT-E, you can run the following command. Note that the `--data_root` argument needs to be changed according to your folder:

```
bash ALOFT-E.sh
```
You can also train the ALOFT-S model by running:

```
bash ALOFT-S.sh
```
Part of our code is borrowed from the following repositories.
- GFNet: "Global Filter Networks for Image Classification", NeurIPS 2021
- DSU: "Uncertainty modeling for out-of-distribution generalization", ICLR 2022
We thank the authors for releasing their code. Please also consider citing their works.