A collection of code and notebooks implementing various types of neural networks in NumPy. As expected, the code is not optimized and cannot take advantage of GPU acceleration!
- Softmax Classification on MNIST (sketch below)
  - Code: 01 Softmax Regression.py
  - Notebook: 01 Softmax Regression.ipynb
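
Not the repository's code, but a minimal sketch of the core update a NumPy softmax classifier performs; the function names and array shapes are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_sgd_step(W, b, X, y, lr=0.1):
    """One gradient step of softmax regression with cross-entropy loss.
    X: (n, d) inputs, y: (n,) integer labels, W: (d, k), b: (k,)."""
    n = X.shape[0]
    probs = softmax(X @ W + b)        # (n, k) class probabilities
    probs[np.arange(n), y] -= 1.0     # dL/dlogits = probs - one_hot(y)
    grad_W = X.T @ probs / n
    grad_b = probs.sum(axis=0) / n
    return W - lr * grad_W, b - lr * grad_b
```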
- Stochastic, Mini-Batch, and Batch Gradient Descent, plus Dataloaders (sketch below)
  - Code: 02 SGD, BGD, mini-BGD.py
  - Notebook: 02 SGD, BGD, mini-BGD.ipynb
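
A rough sketch of what a NumPy dataloader covering these three gradient-descent flavors can look like; the `dataloader` name and signature are assumptions, not the repository's API.

```python
import numpy as np

def dataloader(X, y, batch_size, shuffle=True, seed=0):
    """Yield (X_batch, y_batch) pairs. batch_size=1 gives SGD,
    batch_size=len(X) gives full-batch GD, anything between is mini-batch GD."""
    idx = np.arange(len(X))
    if shuffle:
        np.random.default_rng(seed).shuffle(idx)
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]
```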
- Optimizers (Momentum, RMSProp, Adam) (sketch below)
  - Code:
  - Notebook: 03 Optimizers.ipynb
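
The three update rules, sketched in NumPy for reference; the state variables and hyperparameter defaults are illustrative and not necessarily those used in the notebook.

```python
import numpy as np

def momentum_update(w, grad, v, lr=0.01, beta=0.9):
    v = beta * v + grad
    return w - lr * v, v

def rmsprop_update(w, grad, s, lr=0.001, beta=0.9, eps=1e-8):
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam_update(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """t is the step counter, starting at 1."""
    m = b1 * m + (1 - b1) * grad           # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - b1 ** t)              # bias correction
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```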
- Regularization (L1, L2, Dropout, BatchNorm) (sketch below)
  - Code:
  - Notebook:
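
A small sketch of L1/L2 gradient terms and inverted dropout in NumPy (BatchNorm omitted for brevity); the names and defaults are assumptions.

```python
import numpy as np

def l2_penalty_grad(W, lam):
    # Adds lam * ||W||^2 / 2 to the loss, i.e. lam * W to the gradient.
    return lam * W

def l1_penalty_grad(W, lam):
    # Subgradient of lam * ||W||_1.
    return lam * np.sign(W)

def dropout_forward(a, p_drop=0.5, train=True, rng=None):
    """Inverted dropout: scale at train time so no change is needed at test time."""
    if not train or p_drop == 0.0:
        return a, None
    if rng is None:
        rng = np.random.default_rng()
    mask = (rng.random(a.shape) >= p_drop) / (1.0 - p_drop)
    return a * mask, mask  # reuse the mask in the backward pass
```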
- Convolutional Neural Networks (CNN) on MNIST (sketch below)
  - Code:
  - Notebook:
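
For orientation, a naive single-channel "valid" convolution in NumPy; real implementations batch over images and channels, so treat this only as an illustrative sketch.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive single-channel 'valid' convolution (really cross-correlation,
    as in most deep-learning code). x: (H, W) image, k: (kh, kw) kernel."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out
```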
- Recurrent Neural Networks (RNN) on MNIST (sketch below)
  - Code:
  - Notebook:
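
A minimal sketch of a vanilla RNN forward pass in NumPy; the shapes and parameter names are illustrative assumptions.

```python
import numpy as np

def rnn_forward(xs, h0, Wxh, Whh, bh):
    """Vanilla RNN: h_t = tanh(x_t @ Wxh + h_{t-1} @ Whh + bh).
    xs: (T, n, d) input sequence, h0: (n, hidden) initial state."""
    h, hs = h0, []
    for x_t in xs:
        h = np.tanh(x_t @ Wxh + h @ Whh + bh)
        hs.append(h)
    return np.stack(hs), h  # all hidden states and the final state
```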
- Generative Adversarial Networks (GANs) on MNIST (sketch below)
  - Code:
  - Notebook:
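
A sketch of the discriminator and (non-saturating) generator losses that GAN training alternates between, assuming the discriminator outputs probabilities; this is not the repository's code.

```python
import numpy as np

def bce(p, target, eps=1e-8):
    # Binary cross-entropy on discriminator probabilities p in (0, 1).
    return -np.mean(target * np.log(p + eps) + (1 - target) * np.log(1 - p + eps))

def gan_losses(d_real, d_fake):
    """d_real / d_fake: discriminator outputs on real and generated samples.
    D pushes d_real toward 1 and d_fake toward 0; G pushes d_fake toward 1."""
    d_loss = bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))
    g_loss = bce(d_fake, np.ones_like(d_fake))   # non-saturating generator loss
    return d_loss, g_loss
```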
- Sequential Layers (sketch below)
  - Code:
  - Notebook:
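
A toy sketch of what a `Sequential` container can look like in NumPy, forward pass only; the class names and the 784-128-10 sizes are illustrative assumptions.

```python
import numpy as np

class Linear:
    def __init__(self, d_in, d_out, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.01, (d_in, d_out))
        self.b = np.zeros(d_out)
    def forward(self, x):
        return x @ self.W + self.b

class ReLU:
    def forward(self, x):
        return np.maximum(0.0, x)

class Sequential:
    def __init__(self, *layers):
        self.layers = layers
    def forward(self, x):
        for layer in self.layers:   # chain each layer's forward pass
            x = layer.forward(x)
        return x

model = Sequential(Linear(784, 128), ReLU(), Linear(128, 10))
logits = model.forward(np.zeros((1, 784)))   # (1, 10) class scores
```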
- Autograd (sketch below)
  - Code:
  - Notebook:
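
A minimal reverse-mode autograd sketch on scalars (micrograd-style), supporting only `+` and `*`; it illustrates the idea, not the repository's implementation.

```python
class Value:
    """Scalar wrapper that records the graph and backpropagates gradients."""
    def __init__(self, data, parents=(), grad_fn=None):
        self.data, self.grad = data, 0.0
        self._parents, self._grad_fn = parents, grad_fn

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        out._grad_fn = lambda g: [(self, g), (other, g)]
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        out._grad_fn = lambda g: [(self, g * other.data), (other, g * self.data)]
        return out

    def backward(self):
        # Topological order, then accumulate gradients from the output back.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            if v._grad_fn:
                for parent, g in v._grad_fn(v.grad):
                    parent.grad += g

x, y = Value(3.0), Value(2.0)
z = x * y + x        # dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)   # 3.0 3.0
```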
- Linear and CNN Autoencoder on MNIST (sketch below)
  - Code:
  - Notebook:
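
The forward pass of a one-hidden-layer autoencoder with an MSE reconstruction loss, sketched in NumPy; the layer sizes and activations are assumptions.

```python
import numpy as np

def autoencoder_forward(x, W_enc, b_enc, W_dec, b_dec):
    """One-hidden-layer autoencoder: encode to a small code z, decode back.
    x: (n, 784) flattened MNIST images scaled to [0, 1]."""
    z = np.maximum(0.0, x @ W_enc + b_enc)                 # encoder (ReLU)
    x_hat = 1.0 / (1.0 + np.exp(-(z @ W_dec + b_dec)))     # decoder (sigmoid)
    loss = np.mean((x_hat - x) ** 2)                       # reconstruction MSE
    return x_hat, z, loss
```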
- Sentiment Classification and Word Embedding on IMDB Movie Review Dataset (sketch below)
  - Code:
  - Notebook:
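
A bare-bones sketch of embedding averaging followed by logistic regression for sentiment; the `E`, `w`, `b` parameter names and the mean-pooling choice are illustrative assumptions.

```python
import numpy as np

def sentiment_forward(token_ids, E, w, b):
    """Average the word embeddings of a review, then apply logistic regression.
    token_ids: 1-D array of word indices, E: (vocab, emb_dim) embedding matrix."""
    doc = E[token_ids].mean(axis=0)                   # (emb_dim,) document vector
    p_positive = 1.0 / (1.0 + np.exp(-(doc @ w + b)))
    return p_positive
```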
- Character-level RNN (sketch below)
  - Code:
  - Notebook:
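
One step of a character-level RNN sketched in NumPy: read a one-hot character, update the hidden state, and produce a softmax over the next character; the names and shapes are assumptions.

```python
import numpy as np

def char_rnn_step(x_onehot, h, Wxh, Whh, Why, bh, by):
    """x_onehot: (vocab,) one-hot input character, h: (hidden,) state."""
    h = np.tanh(x_onehot @ Wxh + h @ Whh + bh)
    logits = h @ Why + by
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                    # distribution over the next character
    return probs, h

# Sampling text then amounts to something like:
#   next_char = np.random.default_rng().choice(vocab_size, p=probs)
```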
- Principal Component Analysis (PCA) (sketch below)
  - Code:
  - Notebook: PCA.ipynb
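
A compact PCA-via-SVD sketch in NumPy for reference; the return values are illustrative and may differ from the notebook.

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the mean-centered data matrix X: (n_samples, n_features)."""
    X_centered = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]                    # principal directions
    projected = X_centered @ components.T             # low-dimensional scores
    explained_var = (S[:n_components] ** 2) / (len(X) - 1)
    return projected, components, explained_var
```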