
Design draft: 2.x API


Synaptic 2.x API

status: draft

Discussion part

Expectations

Synaptic is used for, and aimed at, many purposes.

Most of them involve transferring experience: from theory, from other languages, or to other languages. JS is not a favorite language in data science, so we should look at the best players in other languages and adopt their practices.

We have a great source of useful statistics: https://github.com/showcases/machine-learning. There we can see that the top high-level NN solutions are:

  • Scikit (Python)
  • Caffe (C++)
  • Keras (Python)

And a lot of other C++, Python, and Java frameworks. Two that deserve mention (not in the top 5, but actually used a lot) are Lasagne and FANN.

This list also includes Brain.js, which is unmaintained, and Convnet.js by the well-known data scientist Karpathy - we can possibly expect some help from him during this work.

If we look at all of these solutions, we see the following common features:

  • multiple back-ends: CPU, GPU, sometimes distributed options, sometimes something else entirely
  • multiple layer types: Dense (the common one), Dropout, Convolutional, Recurrent, and Activation are the base ones
  • as declarative and as configurable as possible - a consequence of supporting multiple back-ends, where you cannot simply pass lambda functions around (see the sketch below)
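
To illustrate the declarative point (a sketch, not part of the proposed API): a layer described by plain data survives serialization and can be shipped to any back-end, while a layer defined by an arbitrary lambda cannot.

// Declarative: plain data, trivially serializable, portable across back-ends.
const declarative = { type: 'Dense', units: 10, activation: 'relu' };
JSON.stringify(declarative); // '{"type":"Dense","units":10,"activation":"relu"}'

// Imperative: an arbitrary lambda cannot be serialized or compiled for a
// GPU/worker back-end.
const imperative = { type: 'Custom', activation: x => Math.max(0, x) };
JSON.stringify(imperative); // '{"type":"Custom"}' - the function is dropped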

All of them aim to be:

  • as fast as possible
  • as memory-efficient as possible
  • as simple as possible

We should aim for the same goals. So, I suggest taking Keras as a reference (as it has the best documentation).

The next part is a design draft: it can change at any time and represents a fictional (for now) state of the design.

For now this is an optimistic design, intended to include as many options as possible. The 2.0 release is not expected to include all of them.

Requests

@jocooler requested that we keep a human-readable output (in some form) so networks can be transferred to other systems. JSON is perfect for this.
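
A rough sketch of what such a JSON export could look like (purely illustrative; the actual schema is not designed yet):

{
  "layers": [
    { "type": "Dense", "units": 10 },
    { "type": "Activation", "function": "softmax" }
  ],
  "weights": [[0.1, -0.3], [0.7, 0.2]]
}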

Formal part

Global design

For the end user, the following modules are provided:

// Base entities. Expected to be used as abstract classes for every other
// construction of these kinds.
import {
  Network,
  Layer,
  Trainer,
} from 'synaptic';

// Computation back-ends used to run the network (note: distinct from the
// training optimizers in 'synaptic/train/optimizers' below).
import {
  AsmJS,
  TensorFlow,
  WorkerAsmJS,
  WebCL,
} from 'synaptic/optimizers';

import {
  Dense,
  Activation,
  Dropout,

  Flatten,
  Reshape,
  Permute,


  Convolution1D,
  MaxPooling1D,
  AveragePooling1D,

  Convolution2D,
  MaxPooling2D,
  AveragePooling2D,

  GRU,
  LSTM
} from 'synaptic/layers';

import {
  Trainer, // same as in 'synaptic'
  objectives, // same as below, but grouped in an object
  optimizers, // same as below, but grouped in an object
} from 'synaptic/train';

import {
  mean_squared_error, 
  mean_absolute_error, 
  mean_absolute_percentage_error, 
  mean_squared_logarithmic_error, 
  squared_hinge,
  hinge, 
  binary_crossentropy, 
  categorical_crossentropy, 
  sparse_categorical_crossentropy, 
  kullback_leibler_divergence,

  iterations,
  time,

  any,
  every,
} from 'synaptic/train/objectives';

import {
  SGD, 
  RMSprop, 
  Adagrad, 
  Adadelta, 
  Adam, 
  Adamax, 
  Nadam,
} from 'synaptic/train/optimizers';

import {
  to_categorical,
} from 'synaptic/util';
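
Assuming to_categorical mirrors the Keras utility of the same name (an assumption - this draft does not specify it), it would one-hot encode integer class labels:

// Hypothetical behavior, modeled on Keras's to_categorical:
// each integer label becomes a one-hot row vector.
to_categorical([0, 2, 1], 3);
// => [[1, 0, 0],
//     [0, 0, 1],
//     [0, 1, 0]]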

Network usage

This is an example of MNIST training.

const train_network = new Network(
    new Convolution2D(32, 4, 4),
    new Dropout(.2),
    new Activation.Relu(),
    new Flatten(),
    new Dense(10),
    new Activation.Softmax(),
);

// important: this can be an async operation now!

await train_network.optimize(new WorkerAsmJS());

const trainer = new Trainer(train_network, {
    optimizer: RMSprop
});

await trainer.train(train_input, to_categorical(train_output), {
    // `any` and `every` are imported from objectives. Presumably, training
    // stops once any listed objective is met; `every` is met only when all
    // of its sub-objectives are met at once.
    objectives: objectives.any([
         objectives.categorical_crossentropy(.0005),
         objectives.every([
             objectives.iterations(5000),
             objectives.time(1000)
         ])
    ]),
    learning_rate: 1
});

assert.deepEqual(await train_network.activate(test_input[0]), test_output[0]);

const weights = await train_network.export_weights();
const optimizer = await train_network.export_optimizer();
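
// Since the export is meant to be human-readable JSON (see the request from
// @jocooler above), the result could presumably be persisted with standard
// Node APIs - a sketch, assuming export_weights() returns plain data:
// import { writeFileSync } from 'fs';
// writeFileSync('weights.json', JSON.stringify(weights));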

/* Let's imagine this is a separate application, already running */

const test_network = new Network(
    new Convolution2D(32, 4, 4),
    new Dropout(.2),
    new Activation.Relu(),
    new Flatten(),
    new Dense(10),
new Activation.Softmax(),
);

await test_network.import_weights(weights);

await test_network.import_optimizer(optimizer);

// activate_array is equivalent to something like
// test_input.map(network.activate) - just iteration over every entity
console.log(await test_network.activate_array(test_input));