CoreML Survey

Introduction

CoreML is Apple's framework that allows developers to integrate trained machine learning models into their applications. CoreML itself builds on top of low-level primitives such as Accelerate (which includes BNNS) and Metal Performance Shaders.

Architecture

CoreML Model

CoreML requires that machine learning models be supplied in a specific format called the ML Model format (models with a .mlmodel file extension). The specification is defined in protobuf and can be found in the file Model.proto. A CoreML model can be one of several types; for our purpose, the most important type is NeuralNetwork, whose protobuf message is defined in NeuralNetwork.proto.

Apple provides several popular, open source models that are already in the Core ML model format.
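
Because a .mlmodel file is just a serialized protobuf message, coremltools (described in the next section) can be used to load one of these models and check which model type it contains. The following is a minimal sketch, assuming a pre-converted model file is available locally (the file name is a placeholder):

    import coremltools

    # Load the protobuf spec of an existing .mlmodel file
    # (the file name is a placeholder for any pre-converted model)
    spec = coremltools.utils.load_spec('MobileNet.mlmodel')

    # Model.proto declares a "oneof Type"; neural network models report
    # 'neuralNetwork' or 'neuralNetworkClassifier' here
    print(spec.WhichOneof('Type'))

    # The declared inputs and outputs of the model
    print(spec.description)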

Model Conversion

If a machine learning model is created and trained using a third-party machine learning tool, then it needs to be converted to the Core ML model format. Apple provides a Python package called coremltools for creating, examining, and testing models in the .mlmodel format; a minimal conversion example is sketched after the list below. The deep learning frameworks currently supported by coremltools are:

  1. Caffe v1
  2. Keras 1.2.2+
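
As a concrete illustration, converting a trained Keras model with coremltools looks roughly like the sketch below. The file names, input name, and output name are placeholders:

    import coremltools

    # Convert a Keras model saved to disk (the .h5 file name is a placeholder)
    coreml_model = coremltools.converters.keras.convert(
        'my_keras_model.h5',
        input_names='image',
        output_names='probabilities')

    # Optional metadata, displayed in Xcode's view of the model
    coreml_model.author = 'Example Author'
    coreml_model.short_description = 'Keras model converted to the .mlmodel format'

    # Serialize in the .mlmodel format
    coreml_model.save('my_model.mlmodel')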

How can PaddlePaddle be used with Core ML

If we wish to integrate PaddlePaddle with Core ML, we will have to write a custom conversion tool that converts PaddlePaddle models to the CoreML .mlmodel format. There is no tutorial on how to write such custom conversion tools, but we can rely on the open source converters written for Caffe and Keras as references.
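
One plausible approach, modeled on how the existing converters work, is to read the weights out of a trained PaddlePaddle model and re-emit each layer through coremltools' NeuralNetworkBuilder, which writes the NeuralNetwork.proto messages directly. The sketch below only illustrates the idea; the tiny fully connected + softmax network, the random weights, and all names are hypothetical stand-ins for what a real converter would extract from PaddlePaddle:

    import numpy as np
    import coremltools
    from coremltools.models import datatypes
    from coremltools.models.neural_network import NeuralNetworkBuilder

    # Weights that a real converter would read from a trained PaddlePaddle
    # model; random values stand in for them here
    W = np.random.rand(10, 3)   # assumed layout: (output_channels, input_channels)
    b = np.random.rand(10)

    # Declare the model's inputs and outputs for the Model.proto description
    input_features = [('input', datatypes.Array(3))]
    output_features = [('probabilities', datatypes.Array(10))]

    builder = NeuralNetworkBuilder(input_features, output_features)

    # Re-create each PaddlePaddle layer as its NeuralNetwork.proto counterpart
    builder.add_inner_product(name='fc', W=W, b=b,
                              input_channels=3, output_channels=10,
                              has_bias=True,
                              input_name='input', output_name='fc_out')
    builder.add_softmax(name='softmax', input_name='fc_out',
                        output_name='probabilities')

    # Wrap the generated spec in an MLModel and save it as .mlmodel
    mlmodel = coremltools.models.MLModel(builder.spec)
    mlmodel.save('paddle_model.mlmodel')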

API Design

The CoreML API has the following main abstractions:

  1. Model: The model is exposed through the MLModel class. MLModel provides the following functionality:

    • Create Model

      init(contentsOf: URL)

      Creates a Core ML model, to be used only when not using the Xcode autogenerated interface.

    • Compile Model

      class func compileModel(at: URL)
    • Predicting output values

      func prediction(from: MLFeatureProvider)
      class MLPredictionOptions
    • Inspecting Model

      var modelDescription: MLModelDescription

      Metadata about the model, which is also displayed in the Xcode view of the model.

      class MLModelDescription

      Information about the model, primarily the expected input and output format, with additional optional metadata.

  2. Model Layers: An interface for the behavior of custom neural network layers is defined in the protocol MLCustomLayer. This protocol exposes the following methods:

    • Create Layer

      init(parameters: [String : Any])
    • Integrating a Layer

      func setWeightData([Data])
      func outputShapes(forInputShapes: [[NSNumber]])
    • Evaluating a Layer

      func evaluate(inputs: [MLMultiArray], outputs: [MLMultiArray])
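
The prediction call above also has a Python counterpart in coremltools, which is convenient for checking a converted model before wiring it into an app through the Swift API. A minimal sketch, reusing the hypothetical paddle_model.mlmodel from the conversion sketch above (note that coremltools prediction works only on macOS, where the Core ML framework is available):

    import numpy as np
    import coremltools

    # Load the converted model (file name from the earlier hypothetical sketch)
    mlmodel = coremltools.models.MLModel('paddle_model.mlmodel')

    # Inputs are passed as a dictionary keyed by the declared input names;
    # the result is a dictionary keyed by the declared output names
    result = mlmodel.predict({'input': np.array([1.0, 2.0, 3.0])})
    print(result['probabilities'])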