Washington University in St. Louis
Instructor: Jeff Heaton
The content of this course changes as technology evolves; to keep up to date with changes, follow me on GitHub.
- Section 1. Spring 2023, Monday, 2:30 PM, Location: TBD
- Section 2. Spring 2023, Online
Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output. Deep learning allows a neural network to learn hierarchies of information in a way that resembles the function of the human brain. This course will introduce the student to classic neural network structures, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), Generative Adversarial Networks (GAN), and reinforcement learning. Application of these architectures to computer vision, time series, security, natural language processing (NLP), and data generation will be covered. High-Performance Computing (HPC) aspects will demonstrate how deep learning can be leveraged on both graphical processing units (GPUs) and grids. The focus is primarily on the application of deep learning to problems, with some introduction to the mathematical foundations. Students will use the Python programming language to implement deep learning using Google TensorFlow and Keras. It is not necessary to know Python prior to this course; however, familiarity with at least one programming language is assumed. This course will be delivered in a hybrid format that includes both classroom and online instruction.
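To give a sense of the style of code used throughout the course, here is a minimal illustrative sketch (not taken from the course materials) that builds and trains a small Keras network on synthetic tabular data; the layer sizes, hyperparameters, and data are arbitrary assumptions.

```python
import numpy as np
from tensorflow import keras

# Synthetic tabular data: 1,000 rows, 10 features, binary target (illustrative only)
rng = np.random.default_rng(42)
x = rng.normal(size=(1000, 10)).astype("float32")
y = (x.sum(axis=1) > 0).astype("float32")

# A classic fully connected (dense) network built with the Keras Sequential API
model = keras.Sequential([
    keras.layers.Dense(25, activation="relu", input_shape=(10,)),
    keras.layers.Dense(10, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Train briefly and report loss/accuracy on the training data
model.fit(x, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(x, y, verbose=0))
```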
The complete text for this course is here on GitHub. This same material is also available in book format. The course textbook is “Applications of Deep Neural Networks with Keras,” ISBN 9798416344269.
If you would like to cite the material from this course/book, please use the following BibTeX citation:
```bibtex
@misc{heaton2020applications,
    title={Applications of Deep Neural Networks},
    author={Jeff Heaton},
    year={2020},
    eprint={2009.05673},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
Upon completion of this course, you will be able to:
- Explain how neural networks (deep and otherwise) compare to other machine learning models.
- Determine when a deep neural network would be a good choice for a particular problem.
- Demonstrate your understanding of the material through a final project uploaded to GitHub.
This syllabus presents the expected class schedule, due dates, and reading assignments. Download the current syllabus.
Module | Content |
---|---|
Module 1<br>Meet on 01/23/2023 | Module 1: Python Preliminaries |
Module 2<br>Week of 01/30/2023 | Module 2: Python for Machine Learning |
Module 3<br>Week of 02/06/2023 | Module 3: TensorFlow and Keras for Neural Networks |
Module 4<br>Week of 02/13/2023 | Module 4: Training for Tabular Data |
Module 5<br>Meet on 02/20/2023 | Module 5: Regularization and Dropout |
Module 6<br>Week of 02/27/2023 | Module 6: CNN for Vision |
Module 7<br>Week of 03/06/2023 | Module 7: Generative Adversarial Networks (GANs) |
Module 8<br>Week of 03/20/2023 | Module 8: Kaggle |
Module 9<br>Meet on 03/27/2023 | Module 9: Transfer Learning |
Module 10<br>Week of 04/03/2023 | Module 10: Time Series in Keras |
Module 11<br>Week of 04/10/2023 | Module 11: Natural Language Processing |
Module 12<br>Week of 04/17/2023 | Module 12: Reinforcement Learning |
Module 13<br>Meet on 04/24/2023 | Module 13: Deployment and Monitoring |