deep-nlp
[In-Progress] Mini implementations of deep learning algorithms for natural language processing in PyTorch
Deep Learning for Natural Language Processing in PyTorch
This tutorial covers the basic deep learning algorithms used in NLP, each on a minimal dataset (a sentence or two). The tutorials assume you know nothing about NLP or deep learning; you only need to know some Python.
Basics
You may have seen similar examples elsewhere. These are the machine learning basics you need in order to follow the rest of the material.
- Linear Regression
- Logistic Regression
- Neural Network
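As a taste of what the basics cover, here is a minimal sketch of linear regression in PyTorch. The toy data (learning y = 2x + 1) and hyperparameters are illustrative assumptions, not taken from the tutorial notebooks.

```python
import torch
import torch.nn as nn

# Hypothetical toy data: learn y = 2x + 1 from four points.
x = torch.tensor([[1.0], [2.0], [3.0], [4.0]])
y = 2 * x + 1

model = nn.Linear(1, 1)  # one weight, one bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

for _ in range(1000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # mean squared error on the toy data
    loss.backward()              # compute gradients
    optimizer.step()             # update weight and bias
```

After training, `model.weight` and `model.bias` should be close to 2 and 1.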
Word Vectors (aka word embeddings)
- Word2Vec
- GloVe
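The core idea behind Word2Vec can be sketched as a skip-gram model on a toy corpus: predict a word's neighbors from its embedding. The corpus, window size, and dimensions below are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Toy corpus; vocabulary and training pairs are made up for illustration.
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Skip-gram pairs (center word, context word) with window size 1.
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in (i - 1, i + 1) if 0 <= j < len(corpus)]

emb = nn.Embedding(len(vocab), 8)  # 8-dimensional word vectors
out = nn.Linear(8, len(vocab))     # score every word as a possible context
opt = torch.optim.Adam(list(emb.parameters()) + list(out.parameters()), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

centers = torch.tensor([c for c, _ in pairs])
contexts = torch.tensor([c for _, c in pairs])
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(out(emb(centers)), contexts)  # predict context from center
    loss.backward()
    opt.step()
```

The rows of `emb.weight` are the learned word vectors.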
Recurrent Neural Network
Let's implement recurrent neural networks and their variants from scratch.
- Recurrent Neural Network (RNN)
- Long Short Term Memory (LSTM)
- Gated Recurrent Unit (GRU)
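"From scratch" here means implementing the recurrence yourself rather than calling `nn.RNN`. A minimal sketch of one vanilla (Elman) RNN step, with illustrative sizes:

```python
import torch
import torch.nn as nn

class VanillaRNNCell(nn.Module):
    """One step of a plain RNN: h' = tanh(W_x x + W_h h + b)."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.W_x = nn.Linear(input_size, hidden_size)
        self.W_h = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, h):
        return torch.tanh(self.W_x(x) + self.W_h(h))

# Run the cell over a toy sequence of length 5 (batch of 1, 4 features).
cell = VanillaRNNCell(input_size=4, hidden_size=3)
h = torch.zeros(1, 3)
for x in torch.randn(5, 1, 4):
    h = cell(x, h)  # the hidden state carries information across steps
```

LSTM and GRU cells follow the same pattern but add gates that control what the hidden state keeps or forgets.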
Text Generation
You can generate sentences using an RNN.
- Sentence Generation with RNN
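The generation loop feeds each predicted character back in as the next input. A sketch with untrained weights (so the output is gibberish; the point is the feedback structure), using a made-up character vocabulary:

```python
import torch
import torch.nn as nn

# Hypothetical character vocabulary for illustration.
chars = list("helo ")
idx = {c: i for i, c in enumerate(chars)}

rnn = nn.RNN(input_size=len(chars), hidden_size=16, batch_first=True)
head = nn.Linear(16, len(chars))  # hidden state -> next-character logits

# Greedy sampling: seed with 'h', then repeatedly feed the prediction back in.
x = torch.zeros(1, 1, len(chars))
x[0, 0, idx["h"]] = 1.0
h = None
generated = "h"
for _ in range(10):
    out, h = rnn(x, h)
    next_i = head(out[:, -1]).argmax().item()  # pick the most likely char
    generated += chars[next_i]
    x = torch.zeros(1, 1, len(chars))
    x[0, 0, next_i] = 1.0
```

With a trained model the same loop produces text that resembles the training data.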
Text Classification
- Text classification with CNN
- Text classification with RNN
- Text classification with RNN + Attention
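The CNN variant treats a sentence as a sequence of embeddings and slides convolutional filters over it. A minimal sketch, with sizes chosen only for illustration:

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Tiny CNN text classifier: embed -> 1-D conv -> global max-pool -> linear.
    Vocabulary and layer sizes are illustrative assumptions."""
    def __init__(self, vocab_size=20, embed_dim=8, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16, num_classes)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        e = self.embed(tokens).transpose(1, 2)      # (batch, embed_dim, seq_len)
        features = torch.relu(self.conv(e)).max(dim=2).values  # max over time
        return self.fc(features)                    # (batch, num_classes)

logits = TextCNN()(torch.randint(0, 20, (4, 7)))  # 4 toy sentences of 7 tokens
```

The RNN variants replace the conv + pool with a recurrent encoder's final hidden state; attention weights the hidden states instead of max-pooling.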
Sequence-to-Sequence (Seq2Seq)
Think about how you respond to a text message. First you interpret what the sentence means (classification), then you generate your answer from start to finish. Since we already know how to build a text classifier and a text generator, why not use both at the same time? Merging a text classifier with a text generator gets you a mini translator!
- Encoder Decoder
- Seq2Seq with Attention Mechanism
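That classifier-plus-generator idea can be sketched as an encoder-decoder: the encoder summarizes the source sentence into a hidden state, which initializes the decoder that generates the target. All dimensions below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Minimal encoder-decoder: the encoder's final hidden state
    initializes the decoder. Sizes are illustrative, not from the repo."""
    def __init__(self, src_vocab=10, tgt_vocab=12, hidden=16):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        _, h = self.encoder(self.src_emb(src))           # summarize the source
        dec_out, _ = self.decoder(self.tgt_emb(tgt), h)  # condition on it
        return self.out(dec_out)                         # logits per target step

# Batch of 2: source length 5, target length 6.
logits = TinySeq2Seq()(torch.randint(0, 10, (2, 5)), torch.randint(0, 12, (2, 6)))
```

Attention improves on this by letting the decoder look back at every encoder state instead of a single summary vector.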
Question Answering
What if the given sentence is a question? You cannot answer a question based on the question alone; you need knowledge. We memorize things from experience, given to us by teachers, newspapers, and so on, and then answer questions based on what we have seen, read, or heard. So instead of spitting out a nonsensical sentence, let's make the model memorize some facts first.
- Memory Networks
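The core operation of a memory network is a "read": attend over stored memory vectors with a query and take their weighted sum. A sketch of one such hop, with random toy vectors standing in for encoded facts:

```python
import torch

# 5 stored "facts" and a query, all 8-dimensional (toy values for illustration).
memory = torch.randn(5, 8)
query = torch.randn(8)

weights = torch.softmax(memory @ query, dim=0)  # how relevant is each memory?
readout = weights @ memory                      # attention-weighted summary
```

A full memory network stacks several such hops and decodes the final readout into an answer.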