CS224n-2019-solutions
Complete solutions for Stanford CS224n, Winter 2019
CS224n-winter19
Solutions for CS224n, Winter 2019.
Feel free to discuss problems from the assignments by opening an issue.
Notes on the key points of the lectures are also included.
The solutions for the written part of each assignment are in Markdown, under Assignments/.
- Course page: https://web.stanford.edu/class/cs224n
- Video page: https://www.youtube.com/watch?v=8rXD5-xhemo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z
Update
2019/12/03
After CS224n I realized that I need more systematic training, so I started a new repo, learn_NLP_again (algorithms and solutions are available for chapter 1 so far). Here is its description:
Here is why I started this project: to learn NLP from scratch again. I chose Speech and Language Processing as my entry point, and I try to write solutions and implement some of the algorithms/models from the book. I hope I can stick to this project and update it frequently.
After one year of training in industry and in the lab, I have found many faults and bad habits in my past practice (by the way, there are too many commits in this repo). I will review the code in this repo and resolve the issues gradually (:smile:, hopefully).
Communication is welcome in the new repo!
w1
reading
- [x] note: Word Vectors I: Introduction, SVD and Word2Vec
- [x] Word2Vec Tutorial - The Skip-Gram Model
practice
- [x] coding: Assignment1
- [x] Gensim (see the sketch below)
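For the Gensim exercise, a minimal skip-gram sketch, assuming Gensim 4.x; the toy corpus and hyperparameters below are illustrative assumptions, not taken from the assignment:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative only).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram architecture (sg=0 would be CBOW).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["king"].shape)                  # (50,)
print(model.wv.most_similar("king", topn=3))   # nearest neighbors in the toy space
```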
w2
reading
- [x] note: Word Vectors II: GloVe, Evaluation and Training
- [x] gradient-notes
- [x] CS231n notes on backprop
- [x] review-differential-calculus
- [x] backprop_old
- [x] CS231n notes on network architectures
practice
- [x] coding: Assignment2
- [x] writing: Assignment2
w3
reading
- [x] note: Dependency Parsing
- [x] note: Language Models and Recurrent Neural Networks
- [x] a3
practice
- [x] coding: Assignment3
- [x] writing: Assignment3
w4
reading
- [x] note: Machine Translation, Sequence-to-sequence and Attention
- [x] a4
- [x] read: Attention and Augmented Recurrent Neural Networks
- [x] read: Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)
practice
- [x] coding: Assignment4
- [x] writing: Assignment4
key point for a4
How to understand pack_padded_sequence and pad_packed_sequence?
(Chinese edition)
(English edition)
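To complement the write-ups linked above, a minimal PyTorch sketch of the pack/unpack round trip; the batch, lengths, and layer sizes here are illustrative assumptions:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# A batch of 3 sequences, padded with 0 up to the max length 4.
# lengths must hold the true (unpadded) length of each sequence,
# sorted in decreasing order when enforce_sorted=True.
batch = torch.tensor([[1, 2, 3, 4],
                      [5, 6, 0, 0],
                      [7, 0, 0, 0]])
lengths = torch.tensor([4, 2, 1])

embed = torch.nn.Embedding(num_embeddings=10, embedding_dim=8, padding_idx=0)
lstm = torch.nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = embed(batch)  # (batch, seq_len, embed_dim)

# pack_padded_sequence turns the padded batch into a PackedSequence,
# so the LSTM skips the padding positions instead of running over them.
packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=True)
packed_out, (h_n, c_n) = lstm(packed)

# pad_packed_sequence inverts the packing, restoring a padded tensor plus lengths.
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)     # torch.Size([3, 4, 16])
print(out_lengths)   # tensor([4, 2, 1])
```

The point of packing is that the LSTM never runs over pad positions, so the final hidden state h_n reflects each sequence's true last token rather than trailing padding.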
w5
It has been a long time since the last update...
reading
- [x] note: Machine Translation, Sequence-to-sequence and Attention
- [x] a5
- [x] read: Attention and Augmented Recurrent Neural Networks
practice
- [x] coding: Assignment5
- [x] writing: Assignment5
Final project
reading
- [x] final-project-practical-tips
- [x] default-final-project-handout
- [x] project-proposal-instructions
- [x] Practical Methodology (Deep Learning book chapter)
- [x] Highway Networks
- [x] Bidirectional Attention Flow for Machine Comprehension
practice
- [x] annotate code
- [x] train baseline