# Open Neural Network eXchange (ONNX) Model Zoo


The ONNX Model Zoo is a collection of pre-trained, state-of-the-art deep learning models in the ONNX format. Accompanying each model are Jupyter notebooks for model training and for running inference with the trained model. The notebooks are written in Python and include links to the training dataset as well as references to the original paper that describes the model architecture. The notebooks can also be exported and run as Python (.py) files.

## What is ONNX?

The Open Neural Network eXchange (ONNX) is an open format to represent deep learning models. With ONNX, developers can move models between state-of-the-art tools and choose the combination that is best for them. ONNX is developed and supported by a community of partners.

## Models

### Image Classification

These models take images as input and classify the major objects in the image into a set of predefined classes.

| Model Class | Reference | Description |
|---|---|---|
| MobileNet | Sandler et al. | Efficient CNN model for mobile and embedded vision applications. Top-5 error from paper: ~10% |
| ResNet | He et al., He et al. | Very deep CNN model (up to 152 layers) that won the ImageNet Challenge in 2015. Top-5 error from paper: ~6% |
| SqueezeNet | Iandola et al. | A lightweight CNN providing AlexNet-level accuracy with 50x fewer parameters. Top-5 error from paper: ~20% |
| VGG | Simonyan et al. | Deep CNN model (up to 19 layers) that won the ImageNet Challenge in 2014. Top-5 error from paper: ~8% |
| Bvlc_AlexNet | Krizhevsky et al. | Deep CNN model for Image Classification |
| Bvlc_GoogleNet | Szegedy et al. | Deep CNN model for Image Classification |
| Bvlc_reference_CaffeNet | Krizhevsky et al. | Deep CNN model for Image Classification |
| Bvlc_reference_RCNN_ILSVRC13 | Girshick et al. | Deep CNN model for Image Classification |
| DenseNet121 | Huang et al. | Deep CNN model for Image Classification |
| Inception_v1 | Szegedy et al. | Deep CNN model for Image Classification |
| Inception_v2 | Szegedy et al. | Deep CNN model for Image Classification |
| ShuffleNet | Zhang et al. | Deep CNN model for Image Classification |
| ZFNet512 | Zeiler et al. | Deep CNN model for Image Classification |

### Semantic Segmentation

Semantic segmentation models partition an input image by labeling each pixel with one of a set of pre-defined categories.

| Model Class | Reference | Description |
|---|---|---|
| DUC | Wang et al. | Deep CNN based model with >80% mIOU (mean Intersection Over Union), trained on urban street images |
| FCN | Long et al. | contribute |

### Object Detection & Segmentation

These models detect the presence of multiple objects in an image and segment out areas of the image where the objects are detected.

| Model Class | Reference | Description |
|---|---|---|
| Tiny_YOLOv2 | Redmon et al. | Deep CNN model for Object Detection |
| SSD | Liu et al. | contribute |
| Faster-RCNN | Ren et al. | contribute |
| Mask-RCNN | He et al. | contribute |
| YOLO v2 | Redmon et al. | contribute |
| YOLO v3 | Redmon et al. | contribute |

### Face Detection and Recognition

These models detect and/or recognize human faces in images. Popular use cases include recognizing celebrity faces and detecting gender, age, and emotion.

| Model Class | Reference | Description |
|---|---|---|
| ArcFace | Deng et al. | ArcFace is a CNN based model for face recognition which learns discriminative features of faces and produces embeddings for input face images. |
| CNN Cascade | Li et al. | contribute |

### Emotion Recognition

| Model Class | Reference | Description |
|---|---|---|
| Emotion FerPlus | Barsoum et al. | Deep CNN model for emotion recognition |

### Gender Detection

| Model Class | Reference | Description |
|---|---|---|
| Age and Gender Classification using Convolutional Neural Networks | Levi et al. | contribute |

### Handwritten Digit Recognition

| Model Class | Reference | Description |
|---|---|---|
| MNIST - Handwritten Digit Recognition | Convolutional Neural Network with MNIST | Deep CNN model for handwritten digit identification |

### Super Resolution

| Model Class | Reference | Description |
|---|---|---|
| Image Super-Resolution Using Deep Convolutional Networks | Dong et al. | contribute |

### Style Transfer

| Model Class | Reference | Description |
|---|---|---|
| Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks | Zhu et al. | contribute |

### Machine Translation

| Model Class | Reference | Description |
|---|---|---|
| Neural Machine Translation by Jointly Learning to Align and Translate | Bahdanau et al. | contribute |
| Google's Neural Machine Translation System | Wu et al. | contribute |

### Speech Processing

| Model Class | Reference | Description |
|---|---|---|
| Speech Recognition with Deep Recurrent Neural Networks | Graves et al. | contribute |
| Deep Voice: Real-Time Neural Text-to-Speech | Arik et al. | contribute |

### Language Modelling

| Model Class | Reference | Description |
|---|---|---|
| Deep Neural Network Language Models | Arisoy et al. | contribute |

### Visual Question Answering & Dialog

| Model Class | Reference | Description |
|---|---|---|
| VQA: Visual Question Answering | Agrawal et al. | contribute |
| Yin and Yang: Balancing and Answering Binary Visual Questions | Zhang et al. | contribute |
| Making the V in VQA Matter | Goyal et al. | contribute |
| Visual Dialog | Das et al. | contribute |

### Other interesting models

| Model Class | Reference | Description |
|---|---|---|
| Text to Image | Generative Adversarial Text to Image Synthesis | contribute |
| Sound Generative models | WaveNet: A Generative Model for Raw Audio | contribute |
| Time Series Forecasting | Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks | contribute |
| Recommender systems | DropoutNet: Addressing Cold Start in Recommender Systems | contribute |
| Collaborative filtering | | contribute |
| Autoencoders | | contribute |

## Model Visualization

You can see visualizations of each model's network architecture by using Netron.
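If you have the Netron Python package installed (e.g. via pip), you can also serve the viewer locally for a downloaded model. A minimal sketch, assuming a `model.onnx` file already sits in the working directory:

```python
# Sketch: open a downloaded model in the Netron viewer.
# Assumes `pip install netron` and a model.onnx in the working directory.
import netron

netron.start('model.onnx')  # serves the viewer locally and opens a browser tab
```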

## Usage

Every ONNX backend should support running these models out of the box. After downloading and extracting the tarball for each model, you should find:

- A protobuf file `model.onnx`, which is the serialized ONNX model (see the loading sketch after this list).
- Test data.
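For example, here is a minimal sketch of unpacking a downloaded archive and sanity-checking the model with the `onnx` package; the archive and directory names are placeholders for whichever model you downloaded:

```python
# Sketch: extract a downloaded model archive and verify the serialized graph.
# 'some_model.tar.gz' and 'some_model/' are placeholders for the model you chose.
import tarfile

import onnx

with tarfile.open('some_model.tar.gz') as archive:
    archive.extractall()  # layout depends on the model; typically <model_dir>/model.onnx

model = onnx.load('some_model/model.onnx')   # parse the protobuf into a ModelProto
onnx.checker.check_model(model)              # raises if the serialized graph is malformed
print([graph_input.name for graph_input in model.graph.input])  # graph input names
```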

The test data are provided in two different formats:

- Serialized NumPy archives, which are files named like `test_data_*.npz`; each file contains one set of test inputs and outputs. They can be used like this:

```python
import numpy as np
import onnx
import onnx_backend as backend

# Load the model and sample inputs and outputs
model = onnx.load(model_pb_path)
sample = np.load(npz_path, encoding='bytes')
inputs = list(sample['inputs'])
outputs = list(sample['outputs'])

# Run the model with an onnx backend and verify the results
np.testing.assert_almost_equal(outputs, backend.run_model(model, inputs))
```

Note: please replace `onnx_backend` in your code with the framework of your choice that provides ONNX inference support, and likewise replace `backend.run_model` with that framework's model evaluation logic (a concrete sketch using onnxruntime follows the second example below).

- Serialized protobuf TensorProtos, which are stored in folders named like `test_data_set_*`. They can be used as follows:

```python
import glob
import os

import numpy as np
import onnx
import onnx_backend as backend
from onnx import numpy_helper

model = onnx.load('model.onnx')
test_data_dir = 'test_data_set_0'

# Load inputs
inputs = []
inputs_num = len(glob.glob(os.path.join(test_data_dir, 'input_*.pb')))
for i in range(inputs_num):
    input_file = os.path.join(test_data_dir, 'input_{}.pb'.format(i))
    tensor = onnx.TensorProto()
    with open(input_file, 'rb') as f:
        tensor.ParseFromString(f.read())
    inputs.append(numpy_helper.to_array(tensor))

# Load reference outputs
ref_outputs = []
ref_outputs_num = len(glob.glob(os.path.join(test_data_dir, 'output_*.pb')))
for i in range(ref_outputs_num):
    output_file = os.path.join(test_data_dir, 'output_{}.pb'.format(i))
    tensor = onnx.TensorProto()
    with open(output_file, 'rb') as f:
        tensor.ParseFromString(f.read())
    ref_outputs.append(numpy_helper.to_array(tensor))

# Run the model on the backend
outputs = list(backend.run_model(model, inputs))

# Compare the results with reference outputs
for ref_o, o in zip(ref_outputs, outputs):
    np.testing.assert_almost_equal(ref_o, o)
```
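As one concrete example of the `onnx_backend` placeholder above, onnxruntime provides ONNX inference support. The following is a minimal sketch (not part of the original instructions) that swaps it in for `backend.run_model`, reusing the `inputs` and `ref_outputs` lists built in the previous snippet; the loosened tolerance is a pragmatic choice, not a requirement:

```python
# Sketch: run the same test data with onnxruntime instead of a generic backend.
# Assumes `pip install onnxruntime` and the `inputs`/`ref_outputs` lists from above.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession('model.onnx', providers=['CPUExecutionProvider'])

# Feed each test input to the graph input at the matching position.
feed = {graph_input.name: array
        for graph_input, array in zip(session.get_inputs(), inputs)}
outputs = session.run(None, feed)  # None -> return every graph output

# Compare against the reference outputs, with a loosened tolerance.
for ref_o, o in zip(ref_outputs, outputs):
    np.testing.assert_almost_equal(ref_o, o, decimal=4)
```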

## Contributions

Do you want to contribute a model? To get started, pick any model presented above with the contribute link under the Description column. The links point to a page containing guidelines for making a contribution.

## License

MIT License