Vector Semantics – Winter 2020/21

Course Material

  1. Course Info: Screencast, Slides
  2. Basics: Screencast, Slides
  3. Linear Algebra I: Screencast, Slides, Homework 1
  4. Linear Algebra II: Screencast, Slides, Homework 2, Solutions to homework 1 (as a notebook), Notebook on linear transformations, Linear Transformations and Video Games
  5. Building Distributional Models: Screencast, Slides, Homework 3, Solutions to homework 2 (as a notebook)
  6. SVD: Screencast, Slides, Notebooks: Numpy, PCA, SVD and LSI
  7. Distributional Memory: Screencast, Slides, Homework 4
  8. Building a Distributional Model (Demo): Screencast, Notebook, RG-65 Dataset, Glove Vectors
  9. Linear Regression: Screencast I (Theory), Screencast II (Praxis), Slides, Notebook, Example dataset, Homework 5
  10. Multivariate Linear Regression/Logistic Regression: Screencast, Slides, Homework 6, Solution to ex. 1 of Homework 5
  11. Neural Nets: Screencast, Slides, Video on Neural Nets by 3Blue1Brown, Homework 7, Solution to ex. 1 and 4 of Homework 6, Notebook on Logistic Regression, Screencast on Logistic Regression Notebook
  12. Word2vec: Screencast, Slides, Solution to ex. 1 of Homework 7, Homework 8
  13. Backpropagation: Screencast, Slides (6 – 43), Video on derivatives (Khan Academy), Video on chain rule (Khan Academy), Original Video on Backpropagation (Andrej Karpathy, 1:35 – 29:00 min), Homework 9, Solution to Homework 8, Solution to Homework 9

Errata

  • On slides 39 and 40 of Linear Algebra II (lesson 4) it should be (1/m)·MᵀM and not just MᵀM.
  • On slides 16 and 17 of SVD (lesson 6) the calculation of MMᵀ and MᵀM is wrong. The results should be [[3, 2, 2], [2, 3, 1], [2, 1, 2]] and [[3, 2, 2, 1], [2, 2, 1, 1], [2, 1, 2, 0], [1, 1, 0, 1]] respectively.
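The corrected products above can be checked in a few lines of NumPy. The matrix M below is a hypothetical 3×4 binary matrix chosen to be consistent with the corrected results; the actual M on the slides may differ.

```python
import numpy as np

# Hypothetical 3x4 binary matrix, consistent with the corrected
# erratum results (the slides' actual M may differ).
M = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
])

MMT = M @ M.T   # 3x3: [[3, 2, 2], [2, 3, 1], [2, 1, 2]]
MTM = M.T @ M   # 4x4: [[3, 2, 2, 1], [2, 2, 1, 1], [2, 1, 2, 0], [1, 1, 0, 1]]

# First erratum: normalize by the number of rows m, i.e. (1/m)*M^T M.
m = M.shape[0]
normalized = (1 / m) * MTM

print(MMT)
print(MTM)
```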

Reading Suggestions

  • Jurafsky, Dan & James H. Martin. 2017. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics and Speech Recognition. Draft of 3rd edition. (Chapter on Vector Semantics) [PDF]
  • Manning, C., Raghavan, P., & Schütze, H. (2010). Introduction to Information Retrieval. Natural Language Engineering, 16(1), 100-103. (Chapters 6, 14, and 18) [PDF]
  • Sahlgren, M. (2006). The Word-Space Model: Using distributional analysis to represent syntagmatic and paradigmatic relations between words in high-dimensional vector spaces (Doctoral dissertation). [PDF]
  • Turney, P. D., & Pantel, P. (2010). From frequency to meaning: Vector space models of semantics. Journal of artificial intelligence research, 37, 141-188. [PDF]