
Code and written solutions for the assignments of the Stanford CS224N: Natural Language Processing with Deep Learning course, winter 2022/23

CS224N: Natural Language Processing with Deep Learning

Stanford / Winter 2023

This repository contains my solutions to the assignments of the Stanford CS224N: Natural Language Processing with Deep Learning course from winter 2022/23. There are many other great repositories for this course, but none that cover the latest assignments (winter 2022/23) and include both the written and practical parts in full (as of May 12, 2023). This repository is intended as a learning resource that provides answers when you are stuck. Please do yourself a favor and try the assignments on your own first. If you come across any errors or would like me to include a more detailed explanation, please let me know at [email protected].

Reading papers is an important part of this course and crucial for completing the assignments successfully. I therefore recommend having a look at How to Read a Paper.

From Assignment 2 onwards you will need to edit LaTeX files for your written solutions. I can recommend this wikibook as an up-to-date, comprehensive, and accessible reference. For Git integration, I would use VS Code with a LaTeX extension as the editor. For personal use, Overleaf might be easier and quicker.
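If you have not written LaTeX before, a solution file needs little more than a preamble with the math packages loaded. Here is a minimal, illustrative sketch (the actual assignment handouts ship their own templates and macros, so treat the section title and equation below as placeholders):

```latex
% Minimal standalone solution file (illustrative sketch only;
% use the template provided with each assignment handout).
\documentclass{article}
\usepackage{amsmath, amssymb}   % standard math environments and symbols

\begin{document}

\section*{Assignment 2, Written Part}

% Example of a displayed equation with inline symbols:
The gradient of the loss $J$ with respect to the center word
vector $v_c$ can be written as
\begin{equation}
  \frac{\partial J}{\partial v_c} = U(\hat{y} - y),
\end{equation}
where $U$ stacks the outside word vectors, $\hat{y}$ is the
predicted distribution, and $y$ is the one-hot true label.

\end{document}
```

Compiling this with `pdflatex` (or inside Overleaf) produces a one-page PDF; the same structure scales to the full write-ups.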

For now, all assignments are completed. I will continue watching the remaining lectures and come back in September this year to try completing one of the default projects.

My Schedule

Apr.18.2023

Apr.19.2023

Apr.20.2023

Apr.21.2023

Apr.22.2023

Apr.23.2023

Apr.24.2023

Apr.25.2023

Apr.26.2023

Apr.27.2023

Apr.28.2023 - May.5.2023

May.6.2023

May.7.2023

May.8.2023

May.9.2023

May.10.2023

May.11.2023

May.12.2023

May.13.2023

May.14.2023

  • watch Lecture 13 and Lecture 14
  • read:
    • Coreference Resolution chapter of Jurafsky and Martin
    • End-to-end Neural Coreference Resolution
    • Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

May.15.2023

  • watch Lecture 15 and Lecture 16
  • read:
    • ERNIE: Enhanced Language Representation with Informative Entities
    • Barack’s Wife Hillary: Using Knowledge Graphs for Fact-Aware Language Modeling
    • Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model
    • Language Models as Knowledge Bases?

May.16.2023

Future/TODO