Multi-digit MNIST for Few-shot Learning

Cite this repository

@misc{multidigitmnist,
  author = {Sun, Shao-Hua},
  title = {Multi-digit MNIST for Few-shot Learning},
  year = {2019},
  journal = {GitHub repository},
  url = {https://github.com/shaohua0116/MultiDigitMNIST},
}

Papers that use this dataset:

  • MetaSDF: Meta-learning Signed Distance Functions (NeurIPS 2020): Paper, Project page, Code
  • Regularizing Deep Multi-Task Networks using Orthogonal Gradients: Paper
  • GMAIR: Unsupervised Object Detection Based on Spatial Attention and Gaussian Mixture: Paper
  • Data-free meta learning via knowledge distillation from multiple teachers: Thesis

Description

The multi-digit MNIST generator creates datasets of handwritten digit images from MNIST for few-shot image classification and meta-learning. It simply samples images from the MNIST dataset and places the digits side by side to create images containing multiple digits. It also creates training/validation/testing splits (64/16/20 classes for DoubleMNIST and 640/160/200 for TripleMNIST).
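For a rough intuition, the composition step can be sketched in a few lines of Python. This is an illustrative sketch only, not the actual generator.py logic: loading MNIST through torchvision and the helper make_multi_digit are assumptions made for the example.

# Illustrative sketch of the multi-digit composition idea (not generator.py itself).
import numpy as np
from torchvision.datasets import MNIST  # assumption: torchvision is installed

mnist = MNIST(root="./mnist", train=True, download=True)
images = mnist.data.numpy()     # (60000, 28, 28) uint8
labels = mnist.targets.numpy()  # (60000,)

def make_multi_digit(digits, image_size=(64, 64)):
    # Compose one image whose class label is the digit string, e.g. (4, 8) -> "48".
    h, w = image_size
    assert w > 28 * len(digits), "width must be larger than num_digit * mnist_width"
    canvas = np.zeros((h, w), dtype=np.uint8)
    x = 0
    for d in digits:
        idx = np.random.choice(np.where(labels == d)[0])  # random MNIST example of digit d
        y0 = (h - 28) // 2
        canvas[y0:y0 + 28, x:x + 28] = images[idx]
        x += 28  # place the digits side by side
    return canvas, "".join(str(d) for d in digits)

img, cls = make_multi_digit((4, 8))  # one 64x64 sample of class "48"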

You can generate customized datasets by following the commands provided in Usage, changing the number of images per class, the image size, etc. You can also download pre-generated datasets from Datasets.

This repository also benchmarks MAML (Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks) on datasets created via the generation script in a variety of settings; see Benchmark below.

Some examples of images from the datasets are as follows.

  • Double MNIST Datasets (100 classes: 00 to 99): sample images of classes 10, 48, 59, 62, and 73
  • Triple MNIST Datasets (1000 classes: 000 to 999): sample images of classes 039, 146, 258, 512, and 874

Prerequisites

Usage

Generate a DoubleMNIST dataset with 1k images for each class

python generator.py --num_image_per_class 1000 --multimnist_path ./dataset/double_mnist --num_digit 2 --image_size 64 64

Generate a TripleMNIST dataset with 1k images for each class

python generator.py --num_image_per_class 1000 --multimnist_path ./dataset/triple_mnist --num_digit 3 --image_size 84 84

Arguments

  • --mnist_path: the path to the MNIST dataset (downloaded automatically if not found)
  • --multimnist_path: the path to the output multi-digit MNIST dataset
  • --num_digit: the number of digits in each image
  • --train_val_test_ratio: the ratio determining how many classes go to the train, val, and test splits
  • --image_size: the size of the output images; note that the width needs to be larger than num_digit * mnist_width (28 pixels per digit)
  • --num_image_per_class: the number of images generated for each class
  • --random_seed: the numpy random seed
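Once a dataset has been generated, few-shot episodes can be sampled from it. The following is a minimal sketch assuming the generator writes one subdirectory per split and one subdirectory per class (e.g. ./dataset/double_mnist/train/48/...); check the actual output layout, and note that sample_episode is a hypothetical helper, not part of this repository.

# Hedged sketch: sampling an N-way K-shot episode from a generated split.
import os
import random

def sample_episode(split_dir, n_way=5, k_shot=1, k_query=5):
    # Pick n_way classes, then split their images into support and query sets.
    classes = random.sample(os.listdir(split_dir), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        cls_dir = os.path.join(split_dir, cls)
        files = [os.path.join(cls_dir, f) for f in os.listdir(cls_dir)]
        picked = random.sample(files, k_shot + k_query)
        support += [(path, label) for path in picked[:k_shot]]
        query += [(path, label) for path in picked[k_shot:]]
    return support, query

support, query = sample_episode("./dataset/double_mnist/train", n_way=5, k_shot=1)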

Datasets

You can download the generated datasets:

  Dataset      Image size  Train/Val/Test classes  Images per class  File size  Link
  DoubleMNIST  (64, 64)    64 / 16 / 20            1000              69MB       Google Drive
  TripleMNIST  (84, 84)    640 / 160 / 200         1000              883MB      Google Drive

Benchmark

This repository benchmarks training MAML (Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks) using datasets created via this generation script in a variety of settings.

  Dataset/Setup  5-way 1-shot  5-way 5-shot  20-way 1-shot  20-way 5-shot
  Double MNIST   97.046%       in progress   85.461%        in progress
  Triple MNIST   98.813%       in progress   96.251%        in progress
  Omniglot       98.7%         99.9%         95.8%          98.9%

Hyperparameters

  • slow (outer-loop, meta) learning rate: 1e-3
  • fast (inner-loop, adaptation) learning rate: 0.4
  • number of gradient steps: 1
  • meta batch size: 12
  • number of conv layers: 4
  • iterations: 100k
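As a reference for how these hyperparameters fit together, here is a minimal second-order MAML sketch in PyTorch. It is an assumption-laden illustration, not the code used for the benchmark: the tiny linear model stands in for the 4-layer conv network, and the data tensors are random placeholders.

# Hedged MAML sketch using the hyperparameters above (illustration only).
import torch
import torch.nn.functional as F
from torch.func import functional_call  # assumption: PyTorch >= 2.0

model = torch.nn.Linear(64 * 64, 5)  # stand-in for the 4-conv-layer network
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # slow / outer-loop lr
inner_lr, inner_steps, meta_batch = 0.4, 1, 12            # fast lr, 1 step, batch 12

def inner_adapt(params, x_s, y_s):
    # One (or more) fast gradient steps on the task's support set.
    for _ in range(inner_steps):
        loss = F.cross_entropy(functional_call(model, params, (x_s,)), y_s)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}
    return params

for _ in range(100):  # the benchmark runs 100k meta-iterations
    meta_opt.zero_grad()
    for _ in range(meta_batch):  # one few-shot task per meta-batch element
        x_s, y_s = torch.randn(5, 64 * 64), torch.randint(0, 5, (5,))    # dummy support
        x_q, y_q = torch.randn(25, 64 * 64), torch.randint(0, 5, (25,))  # dummy query
        fast = inner_adapt(dict(model.named_parameters()), x_s, y_s)
        query_loss = F.cross_entropy(functional_call(model, fast, (x_q,)), y_q)
        (query_loss / meta_batch).backward()  # accumulate the outer-loop gradient
    meta_opt.step()  # slow update of the meta-parameters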

Training

*Training has not fully converged; updated results will be reported once the runs finish.

Author

Shao-Hua Sun
