
Continuous Knowledge-Preserving Decomposition for Few-Shot Continual Learning

This is the official repository for "Continuous Knowledge-Preserving Decomposition for Few-Shot Continual Learning."

Continuous Knowledge-Preserving Decomposition for Few-Shot Continual Learning PDF
Xiaojie Li^1,2, Yibo Yang^3, Jianlong Wu^1, David A. Clifton^4, Yue Yu^2, Bernard Ghanem^3, Min Zhang^1
^1Harbin Institute of Technology (Shenzhen), ^2Peng Cheng Laboratory, ^3King Abdullah University of Science and Technology (KAUST), ^4University of Oxford

CKPD-FSCIL Framework

📒 Updates

  • 16 Jan 2025 Released the code.
  • 9 Jan 2025 Released the paper.

🔨 Installation

  1. Create Conda environment:

    conda create --name ckpdfscil python=3.10 -y
    conda activate ckpdfscil
  2. Install dependencies:

    pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
    pip install -U openmim
    mim install mmcv-full==1.7.0 mmengine==0.10.4
    pip install opencv-python matplotlib einops timm==0.6.12 scikit-learn transformers==4.44.2
    pip install git+https://github.com/openai/CLIP.git
    git clone https://github.com/state-spaces/mamba.git && cd mamba && git checkout v1.2.0.post1 && pip install .
  3. Clone the repository:

    git clone https://github.com/xiaojieli0903/CKPD-FSCIL.git
    cd CKPD-FSCIL && mkdir ./data
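
After installation, a quick sanity check can confirm that the key packages are importable (a minimal sketch; the package list mirrors the install commands above, and `mamba_ssm` is assumed to be the import name of the Mamba package):

```python
import importlib.util

def check_packages(names):
    """Map each package name to whether it is importable in this environment."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

if __name__ == "__main__":
    # Names taken from the install steps; `mamba_ssm` is an assumed import name.
    status = check_packages(["torch", "torchvision", "mmcv", "timm", "clip", "mamba_ssm"])
    for name, ok in status.items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```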

➡️ Data Preparation

  1. Download the datasets from the NC-FSCIL link.

  2. Organize the datasets:

    ./data/
    ├── cifar/
    ├── CUB_200_2011/
    └── miniimagenet/

➡️ Pretrained Models Preparation

Use tools/convert_pretrained_model.py to convert models. Supported types:

  • CLIP: Converts OpenAI CLIP models.
  • TIMM: Converts TIMM models.

Commands

  • CLIP Model:

    python tools/convert_pretrained_model.py ViT-B/32 ./pretrained_models/clip-vit-base-p32_openai.pth --model-type clip
  • TIMM Model:

    python tools/convert_pretrained_model.py vit_base_patch16_224 ./pretrained_models/vit_base_patch16_224.pth --model-type timm
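
To spot-check a converted checkpoint, you can group its parameter keys by prefix (a sketch; it assumes the converted `.pth` file holds a plain state dict, which may not match the exact output format of `tools/convert_pretrained_model.py`):

```python
from collections import Counter

def summarize_keys(state_dict, depth=1):
    """Count checkpoint keys grouped by their leading dotted prefix."""
    return dict(Counter(".".join(k.split(".")[:depth]) for k in state_dict))

# Usage (assumes torch is installed and the file is a plain state dict;
# adjust if the conversion script wraps it, e.g. under a 'state_dict' key):
# import torch
# sd = torch.load("./pretrained_models/clip-vit-base-p32_openai.pth", map_location="cpu")
# print(summarize_keys(sd))
```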

🚀 Training

Execute the provided scripts to start training:

MiniImageNet

sh train_miniimagenet.sh

CUB

sh train_cub.sh

✏️ Citation

If you find our work useful in your research, please consider citing:

@article{li2025continuous,
  title={Continuous Knowledge-Preserving Decomposition for Few-Shot Continual Learning},
  author={Li, Xiaojie and Yang, Yibo and Wu, Jianlong and Clifton, David A and Yu, Yue and Ghanem, Bernard and Zhang, Min},
  journal={arXiv preprint arXiv:2501.05017},
  year={2025}
}
