
Lectures on Natural Language Processing

The purpose of this repository is to share the materials for a graduate-level course on NLP that I teach regularly. I will make an effort to refresh them occasionally as long as I teach the course.

Syllabus (2023 Spring Edition)

| # | Topic | Optional Reading |
|---|-------|------------------|
| 1 | General introduction | |
| 2 | Classification | Steepest descent |
| 3 | Linear classifiers | Linear separability (Jupyter Notebook) |
| 4 | Introduction to deep learning | Backpropagation, adaptive learning |
| 5 | Neural architectures for language processing | Transformers |
| 6 | Language models | |
| 7 | Conditional language models | |
| 8 | Pretrained language models | BERT finetuning (Jupyter Notebook) |
| 9 | Prompting large language models | |
| 10 | Retrieval-augmented models | Noise contrastive estimation, search indexes |
| 11 | Structured prediction: HMMs and PCFGs | |
| 12 | Structured prediction: CRFs | Variable elimination |
| 13 | Latent-variable generative models | Gaussian, VAEs, LVGMs |
| 14 | Diffusion models, coreference resolution, review | Diffusion models, vision architectures |

Acknowledgement

I've spent many hours designing and creating most of the materials, but no class stands on its own: I stole many pedagogical ideas from my colleagues. The general introduction was inspired by the NLP class taught by Danqi Chen and Karthik Narasimhan; Dan Edminston pointed me to the ambiguous headline "British Left Waffles on Falklands"; the wonderful cartoon-style panel illustrating a debate between Chomsky and Hinton was created by Nathan Srebro; the animation illustrating stochastic gradient descent was created by Greg Shakhnarovich.
