Deep Learning in NLP (Winter 2020/2021)
General Information
Instructors: | Christian Wurm, Jakub Waszczuk |
Theoretical sessions: | Monday, 14:30 – 16:00, online |
Practical sessions: | Tuesday, 14:30 – 16:00, online |
Course web page: | https://user.phil.hhu.de/~waszczuk/teaching/hhu-dl-wi20/ (this web page, updated throughout the course) |
Office hours: | by appointment |
Languages: | German and English |
Course Description
The aim of this course is to develop an understanding of state-of-the-art neural network techniques and to apply them in practice, in particular to natural language processing problems.
Monday sessions will typically be dedicated to theory, Tuesday sessions to programming. During the practical sessions, we will use the PyTorch framework to implement our networks, along the lines of the sketch below.
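To give a taste of what the practical sessions involve, here is a minimal, self-contained sketch of a PyTorch module trained with one step of gradient descent. All names and sizes in it (TinyClassifier, vocab_size=100, and so on) are illustrative assumptions, not taken from the course materials.

```python
# A minimal sketch of the kind of PyTorch code written in the practical
# sessions: a small feed-forward module trained by gradient descent on
# made-up data. All names and sizes are illustrative.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """Embed token IDs, average the embeddings, and score two classes."""

    def __init__(self, vocab_size: int, emb_dim: int, num_classes: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.linear = nn.Linear(emb_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (sequence_length,) -> scores: (num_classes,)
        return self.linear(self.embed(token_ids).mean(dim=0))

model = TinyClassifier(vocab_size=100, emb_dim=10, num_classes=2)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# One gradient-descent step on a made-up "sentence" and label.
sentence = torch.tensor([5, 42, 17])
label = torch.tensor(1)
loss = loss_fn(model(sentence).unsqueeze(0), label.unsqueeze(0))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Composing `nn.Module` subclasses in this way is the pattern the practical sessions build on, from simple linear models up to BiLSTMs and CNNs (see the schedule below).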
Script
The theoretical content can be found in the script (caution, frequent updates!).
Requirements
- BN: Complete the theoretical and programming homework exercises. The exercises will be published on this web page as the course progresses.
- AP: Term paper based on a practical project: 4–5 pages for undergraduate students, 7–10 pages for master's students. See also the guidelines (last updated: March 03, 2021).
Schedule
Week | Theory (Monday) | Practice (Tuesday) |
Week 1 | Introduction and overview | Software installation |
Week 2 | Vectors and matrices | Tensors (homework and the corresponding code ⇒ solution) |
Week 3 | Linear regression | Data encoding and embedding (homework and the corresponding code ⇒ solution) |
Week 4 | Linear separability (homework ⇒ solution) | Building neural modules: composition and inheritance |
Week 5 | The simple neuron and deep architectures | Training by gradient descent (homework and the corresponding code ⇒ solution) |
Week 6 | TBA | Word embedding contextualization (optional homework and the corresponding code ⇒ solution) |
Week 7 | TBA | Contextualization continued (homework) |
Week 8 | Backpropagation | BiLSTM, convolution (homework and the corresponding code ⇒ solution) |
Week 9 | Backpropagation continued | Manually specifying backpropagation procedures (additional notes) |
Christmas break | | |
Week 10 | Optimization, word embeddings (backpropagation homework) | Character-level language modeling |
Week 11 | TBA | Multi-task learning (homework description, homework project notes, code) |
Week 12 | LSTM, BiLSTM and GRU | Pre-trained embeddings (fastText, BERT) |
Week 13 | CNN | Batching, GPU support (stream coming soon…) |
Week 14 | Project-related session (guidelines) | NMT project homework (homework description, code repository) |