In this repo you will find examples of neural networks implemented from scratch using my matrix library.
1. perceptron
The classic perceptron algorithm. The activation function is the Heaviside (binary) step function, and hinge loss is used as the loss function. The model is capable of binary classification; the example functions as an OR gate (see the sketch after the config below).
{
"layers": [2, 1],
"activation_function": ["heaviside"]
}
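Below is a minimal, self-contained sketch of the Heaviside activation and the perceptron learning rule trained as an OR gate. It does not use the repo's matrix library; names such as `predict` and the learning rate are illustrative.

```nim
# Illustrative perceptron sketch, independent of the repo's matrix library.
proc heaviside(x: float): float =
  if x >= 0.0: 1.0 else: 0.0

proc predict(w: seq[float], b: float, x: seq[float]): float =
  var s = b
  for i in 0 ..< x.len:
    s += w[i] * x[i]
  heaviside(s)

when isMainModule:
  let inputs  = @[@[0.0, 0.0], @[0.0, 1.0], @[1.0, 0.0], @[1.0, 1.0]]
  let targets = @[0.0, 1.0, 1.0, 1.0]            # OR gate truth table
  var w = @[0.0, 0.0]
  var b = 0.0
  let lr = 0.1
  for _ in 0 ..< 20:                             # a few epochs suffice here
    for i in 0 ..< inputs.len:
      let err = targets[i] - predict(w, b, inputs[i])
      for j in 0 ..< w.len:
        w[j] += lr * err * inputs[i][j]          # perceptron learning rule
      b += lr * err
  for x in inputs:
    echo x, " -> ", predict(w, b, x)
```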
2. neural
Two-layer neural network trained with gradient descent. Both layers use the sigmoid
activation function. The example functions as an XOR gate. Weights are initialized from a
uniform distribution U(-sqrt(6 / (in + out)), sqrt(6 / (in + out))) (Xavier initialization);
a sketch of this initialization follows the config below.
{
"layers": [2, 3, 1],
"activation_function": ["sigmoid", "sigmoid"]
}
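For reference, here is a small sketch of Xavier (Glorot) uniform initialization and the sigmoid activation. A plain seq-of-seq stands in for the repo's matrix type, and the proc names are illustrative.

```nim
import math, random

# Sample each weight from U(-limit, limit) with limit = sqrt(6 / (fanIn + fanOut)).
proc xavierInit(fanIn, fanOut: int): seq[seq[float]] =
  let limit = sqrt(6.0 / float(fanIn + fanOut))
  result = newSeq[seq[float]](fanOut)
  for r in 0 ..< fanOut:
    result[r] = newSeq[float](fanIn)
    for c in 0 ..< fanIn:
      result[r][c] = rand(2.0 * limit) - limit

proc sigmoid(x: float): float =
  1.0 / (1.0 + exp(-x))

when isMainModule:
  randomize()
  echo xavierInit(2, 3)    # hidden-layer weights for the [2, 3, 1] network
```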
3. momentums
Same as the previous example, but with the momentum method implemented. Momentum improves training speed and accuracy by helping the optimizer avoid getting stuck in local minima. The example functions as an XOR gate; see the update-rule sketch after the config below.
{
"layers": [2, 5, 1],
"activation_function": ["sigmoid", "sigmoid"]
}
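A sketch of the classic momentum update on a flat weight vector. The parameter names and the 0.9 momentum coefficient are illustrative defaults, not necessarily what this example uses.

```nim
# Velocity accumulates an exponentially decaying sum of past gradient steps,
# which smooths the trajectory and helps escape shallow local minima.
proc momentumStep(w, velocity: var seq[float], grad: seq[float],
                  lr = 0.1, beta = 0.9) =
  for i in 0 ..< w.len:
    velocity[i] = beta * velocity[i] - lr * grad[i]
    w[i] += velocity[i]

when isMainModule:
  var w = @[0.5, -0.3]
  var v = @[0.0, 0.0]
  momentumStep(w, v, @[0.2, -0.1])
  echo w
```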
4. exseimion
Handwritten digit classification is a multi-class classification problem: each image belongs to exactly one of the ten digit classes. The data set used is semeion.data; a sketch of the softmax output follows the config below.
{
"layers": [256, 51, 10],
"activation_function": ["sigmoid", "softmax"]
}
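Since the output layer applies softmax over the ten digit classes, here is a numerically stable sketch of it; a plain seq stands in for the repo's matrix type.

```nim
import math

proc softmax(logits: seq[float]): seq[float] =
  # Subtract the maximum logit before exponentiating for numerical stability.
  var m = logits[0]
  for v in logits:
    m = max(m, v)
  var total = 0.0
  result = newSeq[float](logits.len)
  for i in 0 ..< logits.len:
    result[i] = exp(logits[i] - m)
    total += result[i]
  for i in 0 ..< logits.len:
    result[i] /= total

when isMainModule:
  echo softmax(@[1.0, 2.0, 3.0])   # probabilities summing to 1
```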
5. minibatches
Same as the previous example, but the data is split into small batches (subsets). This improves memory efficiency and accuracy at the cost of some compute efficiency. Root mean square propagation (RMSProp) is used instead of plain SGD, and L2 regularization is applied to the model's weights; see the update-rule sketch after the config below.
{
"layers": [256, 51, 10],
"activation_function": ["leaky_relu", "softmax"]
}
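The following sketch shows one RMSProp step with L2 regularization on a flat weight vector; the hyperparameter values are common defaults, not necessarily the ones used in this example.

```nim
import math

proc rmspropStep(w, cache: var seq[float], grad: seq[float],
                 lr = 0.001, decay = 0.9, eps = 1e-8, l2 = 1e-4) =
  for i in 0 ..< w.len:
    let g = grad[i] + l2 * w[i]                        # L2 adds lambda * w to the gradient
    cache[i] = decay * cache[i] + (1.0 - decay) * g * g
    w[i] -= lr * g / (sqrt(cache[i]) + eps)            # scale by running RMS of gradients
```

Mini-batching then simply means computing `grad` from a small subset of the training data before each such step, rather than from the full data set.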
Bonus: Cross Validation
Same as the previous example, but with cross-validation. Implements accuracy, precision, recall, and F1-score metrics (sketched after the config below).
{
"layers": [256, 51, 10],
"activation_function": ["sigmoid", "softmax"]
}
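A sketch of the metrics computed from confusion counts. The parameter names (tp, fp, fn) are illustrative; for the ten-digit problem these would typically be computed per class and then averaged.

```nim
proc accuracy(correct, total: int): float =
  correct / total                       # `/` on ints yields a float in Nim

proc precision(tp, fp: int): float =
  if tp + fp == 0: 0.0 else: tp / (tp + fp)

proc recall(tp, fn: int): float =
  if tp + fn == 0: 0.0 else: tp / (tp + fn)

proc f1Score(p, r: float): float =
  if p + r == 0.0: 0.0 else: 2.0 * p * r / (p + r)

when isMainModule:
  let p = precision(42, 8)
  let r = recall(42, 5)
  echo accuracy(90, 100), " ", p, " ", r, " ", f1Score(p, r)
```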
DISCLAIMER: For learning purposes only. Nim has its own machine learning framework, Arraymancer, as well as Torch bindings.
This library is distributed under the MIT license.