OpenMMLab website | OpenMMLab platform

📘Documentation | 🛠️Installation | 👀Model Zoo | 🤔Reporting Issues

English | 简体中文

Introduction

MMRazor is a model compression toolkit for model slimming and AutoML, which covers the following mainstream technologies:

  • Neural Architecture Search (NAS)
  • Pruning
  • Knowledge Distillation (KD)
  • Quantization (in the next release)

It is a part of the OpenMMLab project.

Major features:

  • Compatibility

    MMRazor can be easily applied to various projects in OpenMMLab, thanks to the consistent architecture design shared across OpenMMLab projects and the decoupling of slimming algorithms from vision tasks.

  • Flexibility

    Different algorithms, e.g., NAS, pruning and KD, can be combined in a plug-and-play manner to build a more powerful system.

  • Convenience

    With its modular design, developers can implement new model compression algorithms with only a few lines of code, or even by simply modifying config files (see the sketch after this list).
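
To make the config-driven workflow concrete, below is a minimal, hypothetical sketch of a distillation config in the OpenMMLab style. The field names and type strings (e.g. GeneralDistill, SingleTeacherDistiller) and the file paths are illustrative assumptions rather than the exact MMRazor schema; please consult the tutorials for the real config format.

# Hypothetical MMRazor-style knowledge-distillation config (illustrative only).
# OpenMMLab configs are plain Python files that define dictionaries; the
# framework builds the corresponding runtime objects from these specs.

_base_ = ['./resnet18_8xb32_in1k.py']  # assumed student/task base config

algorithm = dict(
    type='GeneralDistill',                        # assumed algorithm name
    architecture=dict(type='MMClsArchitecture'),  # assumed wrapper around the student model
    distiller=dict(
        type='SingleTeacherDistiller',            # assumed distiller name
        teacher=dict(cfg_path='resnet50_8xb32_in1k.py'),  # assumed teacher config path
        components=[
            dict(
                student_module='head.fc',         # which student/teacher outputs to align
                teacher_module='head.fc',
                losses=[dict(type='KLDivergence', tau=4, loss_weight=1.0)],
            ),
        ],
    ),
)

Changing the teacher, the matched modules, or the loss in such a setup requires only editing the config file; no new Python code is needed for common cases.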

For an overview of MMRazor's design and implementation, please refer to the tutorials.


What's new

MMRazor v0.3.1 was released on 5/4/2022.

Benchmark and model zoo

Results and models are available in the model zoo.

Supported algorithms:

  • Neural Architecture Search
  • Pruning
  • Knowledge Distillation

Installation

MMRazor depends on PyTorch and MMCV.

Please refer to get_started.md for more detailed instructions.

Getting Started

Please refer to train.md and test.md for the basic usage of MMRazor. Additional tutorials are available in the documentation.
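
As a quick illustration of the config-based workflow shared by OpenMMLab projects, a config can be loaded and overridden programmatically with MMCV before launching training. This is only a sketch: the config path and the overridden fields below are placeholders and may not exist in the MMRazor model zoo.

# Load a config with MMCV and tweak a few fields before training (sketch).
from mmcv import Config

cfg = Config.fromfile('configs/distill/example_distill_config.py')  # hypothetical path

# Overrides work the same way as in any other OpenMMLab project.
cfg.optimizer.lr = 0.01          # assumes the base config defines an optimizer
cfg.data.samples_per_gpu = 64    # assumes an MMCls-style data section

print(cfg.pretty_text)           # inspect the fully resolved config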

Contributing

We appreciate all contributions to improve MMRazor. Please refer to CONTRIBUTING.md for the contributing guidelines.

Acknowledgement

MMRazor is an open source project contributed to by researchers and engineers from various colleges and companies. We appreciate all the contributors who implement their methods or add new features, as well as the users who give valuable feedback. We hope that the toolbox and benchmark can serve the growing research community by providing a flexible toolkit to reimplement existing methods and develop new model compression methods.

Citation

If you find this project useful in your research, please consider citing:

@misc{2021mmrazor,
    title={OpenMMLab Model Compression Toolbox and Benchmark},
    author={MMRazor Contributors},
    howpublished = {\url{https://github.com/open-mmlab/mmrazor}},
    year={2021}
}

License

This project is released under the Apache 2.0 license.

Projects in OpenMMLab

  • MMCV: OpenMMLab foundational library for computer vision.
  • MIM: MIM installs OpenMMLab packages.
  • MMClassification: OpenMMLab image classification toolbox and benchmark.
  • MMDetection: OpenMMLab detection toolbox and benchmark.
  • MMDetection3D: OpenMMLab's next-generation platform for general 3D object detection.
  • MMRotate: OpenMMLab rotated object detection toolbox and benchmark.
  • MMSegmentation: OpenMMLab semantic segmentation toolbox and benchmark.
  • MMOCR: OpenMMLab text detection, recognition, and understanding toolbox.
  • MMPose: OpenMMLab pose estimation toolbox and benchmark.
  • MMHuman3D: OpenMMLab 3D human parametric model toolbox and benchmark.
  • MMSelfSup: OpenMMLab self-supervised learning toolbox and benchmark.
  • MMRazor: OpenMMLab model compression toolbox and benchmark.
  • MMFewShot: OpenMMLab fewshot learning toolbox and benchmark.
  • MMAction2: OpenMMLab's next-generation action understanding toolbox and benchmark.
  • MMTracking: OpenMMLab video perception toolbox and benchmark.
  • MMFlow: OpenMMLab optical flow toolbox and benchmark.
  • MMEditing: OpenMMLab image and video editing toolbox.
  • MMGeneration: OpenMMLab image and video generative models toolbox.
  • MMDeploy: OpenMMLab model deployment framework.
