Course Material
- Course Info: Screencast, Slides
- Basics: Screencast, Slides
- Linear Algebra I: Screencast, Slides, Homework 1
- Linear Algebra II: Screencast, Slides, Homework 2, Solutions to Homework 1 (as a notebook), Notebook on linear transformations, Linear Transformations and Video Games
- Building Distributional Models: Screencast, Slides, Homework 3, Solutions to Homework 2 (as a notebook)
- SVD: Screencast, Slides, Notebooks: Numpy, PCA, SVD and LSI
- Distributional Memory: Screencast, Slides, Homework 4
- Building a Distributional Model (Demo): Screencast, Notebook, RG-65 Dataset, GloVe Vectors
- Linear Regression: Screencast I (Theory), Screencast II (Practice), Slides, Notebook, Example dataset, Homework 5
- Multivariate Linear Regression/Logistic Regression: Screencast, Slides, Homework 6, Solution to ex. 1 of Homework 5
- Neural Nets: Screencast, Slides, Video on Neural Nets by 3Blue1Brown, Homework 7, Solutions to ex. 1 and 4 of Homework 6, Notebook on Logistic Regression, Screencast on Logistic Regression Notebook
- Word2vec: Screencast, Slides, Solution to ex. 1 of Homework 7, Homework 8
- Backpropagation: Screencast, Slides (6 – 43), Video on derivatives (Khan Academy), Video on chain rule (Khan Academy), Original Video on Backpropagation (Andrej Karpathy, 1:35 – 29:00 min), Homework 9, Solution to Homework 8, Solution to Homework 9
Errata
- On slides 39 and 40 of Linear Algebra II (lesson 4), it should be (1/m)·MᵀM, not just MᵀM
- On slides 16 and 17 of SVD (lesson 6), the calculation of MMᵀ and MᵀM is wrong. The results should be [[3, 2, 2], [2, 3, 1], [2, 1, 2]] and [[3, 2, 2, 1], [2, 2, 1, 1], [2, 1, 2, 0], [1, 1, 0, 1]], respectively.
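The corrected products in the second erratum can be verified quickly in NumPy. The matrix M below is a hypothetical 3×4 binary matrix chosen only to be consistent with the corrected results; the actual matrix is on the SVD slides. The last lines also illustrate the (1/m)·MᵀM normalization from the first erratum, assuming m denotes the number of rows of M.

```python
import numpy as np

# Hypothetical 3x4 binary matrix consistent with the corrected erratum
# results; the actual matrix M is on the SVD slides (lesson 6).
M = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 0]])

print(M @ M.T)   # 3x3 product M·Mᵀ: [[3, 2, 2], [2, 3, 1], [2, 1, 2]]
print(M.T @ M)   # 4x4 product Mᵀ·M: [[3, 2, 2, 1], [2, 2, 1, 1],
                 #                    [2, 1, 2, 0], [1, 1, 0, 1]]

# First erratum: normalize by 1/m (here m is taken as the number of rows).
m = M.shape[0]
print(M.T @ M / m)
```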
Communication
- Rocket.Chat-channel
- Webex meeting (every Wednesday, 13:30 – 14:30; the password is the same as for the download of the course materials)
Reading Suggestions
- Jurafsky, D., & Martin, J. H. (2017). Speech and language processing: An introduction to natural language processing, computational linguistics and speech recognition. Draft of 3rd edition. (Chapter on Vector Semantics) [PDF]
- Manning, C., Raghavan, P., & Schütze, H. (2010). Introduction to information retrieval. Natural Language Engineering, 16(1), 100-103. (Chapters 6, 14 and 18) [PDF]
- Sahlgren, M. (2006). The Word-Space Model: Using distributional analysis to represent syntagmatic and paradigmatic relations between words in high-dimensional vector spaces (Doctoral dissertation). [PDF]
- Turney, P. D., & Pantel, P. (2010). From frequency to meaning: Vector space models of semantics. Journal of artificial intelligence research, 37, 141-188. [PDF]