This repository provides a high-level introduction to different word embeddings in NLP.
- The `word_embs.ipynb` notebook contains the tutorial code with explanations.
Read this repository as a blog post: Semantic Representations
To run the code locally, follow these steps:

```shell
git clone https://github.com/panditu2015/Word-Embeddings.git
cd Word-Embeddings
pip install -r requirements.txt
jupyter notebook
```

- Open `word_embs.ipynb` in your browser.
To run on Google Colab, follow this link: https://colab.research.google.com/drive/1vEO_P564JAjTd-3El6C9lmZokZ0U9UnC
- From the menu on top, click Runtime
- Then click Change runtime type
- Under Hardware Accelerator, select GPU
Most of the code is written using the `pymagnitude` library in Python.

Useful resources:

- Stanford's CS276: Information Retrieval and Web Search
- Stanford's CS224n: Natural Language Processing with Deep Learning
  - Useful lectures: 1 (Jan 8), 2 (Jan 10), 13 (Feb 19)
- Stanford's CS224U: Natural Language Understanding
  - Useful lectures: 2 (Apr 8), 7 (May 11)
- A really nice blog on the intuition of Word Embeddings
- A great blog on Contextual Embeddings
- A blog on BERT
- Lecture on Distributed Representations from CS276
- A lecture note on Word Embeddings from CS224n
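The core idea the notebook explores, comparing words by the similarity of their vectors, can be sketched without any downloads. This is a minimal toy illustration using hand-picked 3-dimensional vectors and the standard library only; the actual notebook loads real pretrained embeddings via `pymagnitude`, and the words and vector values below are made up for demonstration.

```python
import math

# Hypothetical toy embeddings (real embeddings have hundreds of dimensions
# and are learned from large corpora, not hand-written like these).
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_royal:.3f}, king~apple: {sim_fruit:.3f}")
```

With real pretrained vectors the same comparison is what drives nearest-neighbour queries such as "most similar words to *king*".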