- Cover the core with tests
- Move main.go from the root to another location
- New layers and learning optimization
  - Softmax layer (see the sketch after this list)
  - Maxout layer
  - Dropout layer
  - Optimization for learning
  - Bias
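For the softmax item, a minimal numerically stable sketch in Go; the slice-based signature is an assumption, not necessarily the library's API:

```go
import "math"

// Softmax maps a vector to a probability distribution:
// out[i] = exp(in[i]) / sum_j exp(in[j]). Subtracting the maximum
// beforehand keeps math.Exp from overflowing on large inputs.
func Softmax(in []float64) []float64 {
	maxVal := math.Inf(-1)
	for _, v := range in {
		if v > maxVal {
			maxVal = v
		}
	}
	out := make([]float64, len(in))
	sum := 0.0
	for i, v := range in {
		out[i] = math.Exp(v - maxVal)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}
```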
- Add new operations
  - Convolve2D
  - Flatten
  - Pool2D
  - ZeroPadding (see the sketch after this list)
  - Determinant
  - ContoursPadding
  - Reshape
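For the ZeroPadding operation, a sketch on a plain [][]float64 matrix; since the library leans on gonum for matrices, treat the signature as an illustration only:

```go
// ZeroPadding surrounds a matrix with pad rows and columns of zeros,
// e.g. so a subsequent convolution can keep the input's spatial size.
func ZeroPadding(in [][]float64, pad int) [][]float64 {
	rows, cols := len(in), len(in[0])
	out := make([][]float64, rows+2*pad)
	for i := range out {
		out[i] = make([]float64, cols+2*pad) // zero-valued by default
	}
	for i := 0; i < rows; i++ {
		copy(out[i+pad][pad:], in[i])
	}
	return out
}
```

A 3x3 input with pad = 1, for example, becomes a 5x5 matrix whose border is all zeros.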
- Refactor code and rethink the structure of the library
  - Move some legacy and experimental code to other branches. WIP
  - Error types and wrapping them. WIP
- Test cases for most of the math functions (see the test sketch after this list). WIP
  - ActivationTanh
  - ActivationTanhDerivative
  - ActivationSygmoid
  - ActivationSygmoidDerivative
  - Rot2D90
  - Rot2D180
  - Rot2D270
  - ActivationArcTan
  - ActivationArcTanDerivative
  - ActivationSoftPlus
  - ActivationSoftPlusDerivative
  - ActivationGaussian
  - ActivationGaussianDerivative
  - [x] Add (element-wise): not needed because of gonum usage
  - [x] Sub (element-wise): not needed because of gonum usage
  - [x] Transpose: not needed because of gonum usage
  - [x] Multiply: not needed because of gonum usage
  - [x] HadamardProduct: not needed because of gonum usage
  - [ ] MSE: not needed because of gonum usage
  - Convolve2D
  - Flatten
  - Pool2D
  - ZeroPadding
  - Im2Col
  - ContoursPadding
  - Reshape and its Unsafe version
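For the activation test cases above, a table-driven test in the standard Go style. The float64-to-float64 signature of ActivationSygmoid is an assumption about the library's API:

```go
import (
	"math"
	"testing"
)

func TestActivationSygmoid(t *testing.T) {
	cases := []struct {
		in, want float64
	}{
		{0.0, 0.5},          // the sigmoid crosses 0.5 at the origin
		{math.Inf(1), 1.0},  // saturates to 1 for large inputs
		{math.Inf(-1), 0.0}, // saturates to 0 for small inputs
	}
	for _, c := range cases {
		// ActivationSygmoid is assumed to compute 1 / (1 + exp(-x)).
		if got := ActivationSygmoid(c.in); math.Abs(got-c.want) > 1e-9 {
			t.Errorf("ActivationSygmoid(%v) = %v, want %v", c.in, got, c.want)
		}
	}
}
```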
- Test cases for layers and their methods
  - Convolutional. WIP
  - Fully connected. WIP
  - ReLU
  - Leaky ReLU
  - Pooling (see the sketch after this list)
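For the pooling layer, a max-pooling sketch over a plain [][]float64 matrix, assuming a square window, a stride equal to the window size, and dimensions that divide evenly:

```go
import "math"

// MaxPool2D keeps the maximum of every size x size window, shrinking
// each spatial dimension by that factor.
func MaxPool2D(in [][]float64, size int) [][]float64 {
	outH, outW := len(in)/size, len(in[0])/size
	out := make([][]float64, outH)
	for i := range out {
		out[i] = make([]float64, outW)
		for j := range out[i] {
			best := math.Inf(-1)
			for di := 0; di < size; di++ {
				for dj := 0; dj < size; dj++ {
					if v := in[i*size+di][j*size+dj]; v > best {
						best = v
					}
				}
			}
			out[i][j] = best
		}
	}
	return out
}
```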
- Gonum integration
- Use goroutines to speed up calculations (see the sketch after this list)
  - [x] ZeroPadding: slows down performance
  - [x] Pool2D: slows down performance
  - [x] Im2Col: slows down performance
  - [x] Flatten: slows down performance
  - [ ] ContoursPadding: slows down performance
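The "slows down performance" notes above are the usual outcome when the per-item work is too small to amortize goroutine scheduling overhead. A minimal sketch of the row-per-goroutine pattern, with hypothetical names:

```go
import "sync"

// applyRows applies f to every element, spawning one goroutine per
// row. For small rows the synchronization cost outweighs the
// parallel gain, matching the checklist observations above.
func applyRows(m [][]float64, f func(float64) float64) {
	var wg sync.WaitGroup
	for _, row := range m {
		wg.Add(1)
		go func(r []float64) {
			defer wg.Done()
			for i := range r {
				r[i] = f(r[i])
			}
		}(row)
	}
	wg.Wait()
}
```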
- Benchmarks. Do we really need them, since this is just a library for study purposes? WIP
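For what it is worth, Go benchmarks cost only a few lines each, so keeping one per hot function answers that question with numbers. A hypothetical example against the MaxPool2D sketch shown earlier (run with `go test -bench .`):

```go
import "testing"

func BenchmarkMaxPool2D(b *testing.B) {
	in := make([][]float64, 64)
	for i := range in {
		in[i] = make([]float64, 64)
	}
	b.ResetTimer() // keep setup out of the measured loop
	for i := 0; i < b.N; i++ {
		MaxPool2D(in, 2) // the sketch from the pooling item above
	}
}
```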
- Padding for convolutional layer
- Write theoretical documentation for most of the functions (covering every one would be even better)
- New structure for the examples folder (split it into different types of neural network tasks)
- Consider float32 support as an extension
- Improve the READMEs. WIP
- Graphviz pretty print. WIP
- Add CI on https://travis-ci.com
- Import/export activation functions and their derivatives to/from JSON files (see the sketch below).
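Function values cannot be marshaled to JSON directly, so one common approach, sketched here with hypothetical names, is to store an activation's name in the file and resolve it through a registry on import:

```go
import (
	"encoding/json"
	"fmt"
	"math"
)

// activations maps serializable names to function values.
var activations = map[string]func(float64) float64{
	"tanh":    math.Tanh,
	"sygmoid": func(x float64) float64 { return 1 / (1 + math.Exp(-x)) },
}

// LayerConfig is the JSON shape; only the name travels in the file.
type LayerConfig struct {
	Activation string `json:"activation"`
}

// loadActivation resolves the function named in a JSON document.
func loadActivation(data []byte) (func(float64) float64, error) {
	var cfg LayerConfig
	if err := json.Unmarshal(data, &cfg); err != nil {
		return nil, err
	}
	f, ok := activations[cfg.Activation]
	if !ok {
		return nil, fmt.Errorf("unknown activation %q", cfg.Activation)
	}
	return f, nil
}
```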
Updated at: 2020-10-11