All the code files related to the deep learning course from PadhAI
AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks
A module for making weight initialization easier in PyTorch.
Predict the burned area of forest fires with neural networks
A curated list of awesome deep learning techniques for deep neural network training, testing, optimization, regularization, etc.
Neural_Networks_From_Scratch
How weight initialization affects the forward and backward passes of a deep neural network (a sketch after this list illustrates the effect on the forward pass)
FloydHub port of the deeplearning.ai course assignments
Neural Networks: Zero to Hero. My completed work through the tutorial series by Andrej Karpathy.
Making a Deep Learning Framework with C++
RNN-LSTM: From Applications to Modeling Techniques and Beyond - Systematic Review
Playground for trials, attempts and small projects.
This code implements a neural network from scratch without using any libraries
Use MLflow and TensorFlow 2.0 (Keras) to record all the experiments on the Fashion MNIST dataset.
Neural Network
MachineLearningCurves is a collection of paper abstracts, insights, and research notes focusing on various topics in machine learning.
Deep Learning with TensorFlow, Keras, and PyTorch
Data-driven initialization for neural network models
Variance-normalising pre-training of neural networks.
Why don't we initialize the weights of a neural network to zero?
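The last question above comes up often. As a minimal sketch (my own illustration under assumed conditions, not code from any of the repositories listed here), the PyTorch snippet below shows the symmetry problem: with a constant initialization every hidden unit receives the same gradient row, and with an all-zero initialization the weight gradients vanish entirely, so neither setup lets the hidden units differentiate.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(32, 10)   # a batch of random inputs
t = torch.randn(32, 1)    # random regression targets

net = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
for p in net.parameters():
    # Same constant for every weight and bias. With 0.0 instead of 0.5,
    # both weight gradients are exactly zero and only the output bias can learn.
    nn.init.constant_(p, 0.5)

loss = nn.functional.mse_loss(net(x), t)
loss.backward()

g = net[0].weight.grad   # first-layer gradient, shape (16, 10)
# All 16 hidden units receive identical gradient rows, so they stay identical
# after every update: symmetry is never broken.
print(torch.allclose(g, g[0].expand_as(g)))   # True
```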
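For the entry above on how initialization affects the forward pass, here is a second small sketch (again my own illustration, assuming a plain 30-layer tanh MLP with standard-normal inputs rather than code from a listed repository): weights drawn with a tiny standard deviation make the activations shrink layer after layer, while Xavier/Glorot-style scaling keeps them at a usable spread.

```python
import torch

torch.manual_seed(0)
width, depth = 256, 30
x = torch.randn(1024, width)   # standard-normal inputs

def final_activation_std(weight_std):
    """Push x through `depth` random tanh layers and return the std of the output."""
    h = x
    for _ in range(depth):
        w = torch.randn(width, width) * weight_std
        h = torch.tanh(h @ w)
    return h.std().item()

# Too-small weights: the activation spread shrinks rapidly with depth.
print("std=0.01 :", final_activation_std(0.01))
# Xavier/Glorot scaling (std = sqrt(1/width) for equal fan-in and fan-out):
# the activations keep a usable spread through all layers.
print("Xavier   :", final_activation_std((1.0 / width) ** 0.5))
```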