This repository is the official implementation of “Denoising Masked Autoencoders Help Robust Classification”, built on top of the official PyTorch implementation of MAE.
```bibtex
@inproceedings{wu2023dmae,
  title     = {Denoising Masked Autoencoders Help Robust Classification},
  author    = {Wu, QuanLin and Ye, Hang and Gu, Yuntian and Zhang, Huishuai and Wang, Liwei and He, Di},
  booktitle = {The Eleventh International Conference on Learning Representations},
  year      = {2023}
}
```
The pre-training instructions are in PRETRAIN.md.
The following table provides the pre-trained checkpoints used in the paper:
| Model | Size | Epochs | Link |
|---|---|---|---|
| DMAE-Base | 427 MB | 1100 | download |
| DMAE-Large | 1.23 GB | 1600 | download |
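Since DMAE follows the MAE codebase, the checkpoints are presumably stored in the MAE convention: a `torch.save`-d dictionary whose `"model"` key holds the state dict, with decoder weights that are discarded before fine-tuning. A minimal sketch of inspecting such a file (the key names, tensor shapes, and filename below are illustrative stand-ins, not taken from the released checkpoints):

```python
import os
import tempfile

import torch

# Build a stand-in checkpoint in the MAE-style layout (an assumption about
# the real files; substitute the path of a downloaded checkpoint instead).
dummy = {"model": {
    "patch_embed.proj.weight": torch.zeros(768, 3, 16, 16),  # hypothetical encoder tensor
    "decoder_embed.weight": torch.zeros(512, 1024),          # hypothetical decoder tensor
}}
path = os.path.join(tempfile.mkdtemp(), "dmae_base.pth")
torch.save(dummy, path)

# Load on CPU, unwrap the "model" key if present, and drop decoder weights,
# since only the encoder is reused for classification fine-tuning.
ckpt = torch.load(path, map_location="cpu")
state_dict = ckpt.get("model", ckpt)
encoder_state = {k: v for k, v in state_dict.items() if not k.startswith("decoder")}
print(sorted(encoder_state))  # → ['patch_embed.proj.weight']
```

The filtered `encoder_state` would then be passed to `load_state_dict(..., strict=False)` on the fine-tuning model, as in the MAE reference code.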
The fine-tuning and evaluation instructions are in FINETUNE.md.
This project is under the CC-BY-NC 4.0 license. See LICENSE for details.