NLP is one of the most important areas of research in AI. A good resource is CS228, which covers statistical techniques and their application to NLP.
- Speech and Language Processing by D. Jurafsky and J. Martin.
- SyntaxNet: Neural Models of Syntax
- Facebook's DeepText, followed by the first and second studies.
- Long Short-Term Memory, S. Hochreiter and J. Schmidhuber, 1997
- Distributed representations of words and phrases and their compositionality, T. Mikolov et al., 2013 (a minimal skip-gram sketch follows this list)
- Efficient estimation of word representations in vector space, T. Mikolov et al., 2013
- GloVe: Global vectors for word representation, J. Pennington et al., 2014
- Sequence to sequence learning with neural networks, I. Sutskever et al., 2014
- Neural machine translation by jointly learning to align and translate, D. Bahdanau et al., 2014
- Learning phrase representations using RNN encoder-decoder for statistical machine translation, K. Cho et al., 2014
- Distributed representations of sentences and documents, Q. Le and T. Mikolov, 2014
- Effective approaches to attention-based neural machine translation, M. Luong et al., 2015
- Recursive deep models for semantic compositionality over a sentiment treebank, R. Socher et al., 2013
- Generating sequences with recurrent neural networks, A. Graves, 2013
- Exploring the limits of language modeling, R. Jozefowicz et al., 2016
- Convolutional neural networks for sentence classification, Y. Kim, 2014
- A convolutional neural network for modeling sentences, N. Kalchbrenner et al., 2014
- Conditional random fields as recurrent neural networks, S. Zheng et al., 2015
- Neural turing machines, A. Graves et al., 2014
- Memory networks, J. Weston et al., 2014
- Teaching machines to read and comprehend, K. Hermann et al., 2015
- Neural Architectures for Named Entity Recognition, G. Lample et al., 2016
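As a companion to the word-embedding entries above (Mikolov et al., 2013; Pennington et al., 2014), below is a minimal, illustrative sketch of skip-gram training with negative sampling in plain NumPy. The toy corpus, hyperparameters, and variable names are arbitrary choices made for this sketch, not taken from the papers; real implementations (word2vec, GloVe, gensim) differ substantially in scale and detail.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16                       # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # target-word embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

window, k, lr = 2, 3, 0.05                  # context window, negatives per pair, learning rate
for epoch in range(200):
    for pos, word in enumerate(corpus):
        t = word2id[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for c_pos in range(lo, hi):
            if c_pos == pos:
                continue
            c = word2id[corpus[c_pos]]
            # One observed (target, context) pair plus k random negatives.
            samples = [(c, 1.0)] + [(int(n), 0.0) for n in rng.integers(0, V, size=k)]
            for ctx, label in samples:
                score = sigmoid(W_in[t] @ W_out[ctx])
                grad = score - label        # gradient of the logistic loss w.r.t. the dot product
                g_in, g_out = grad * W_out[ctx], grad * W_in[t]
                W_in[t] -= lr * g_in
                W_out[ctx] -= lr * g_out

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words that share contexts should end up with more similar vectors.
print(cosine(W_in[word2id["quick"]], W_in[word2id["brown"]]))
```

Uniform negative sampling and a fixed learning rate are simplifications for readability; Mikolov et al. sample negatives from a smoothed unigram distribution and subsample frequent words.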