Chapter 12.02: Cross-lingual Word Embeddings

Cross-lingual word embeddings place words from multiple languages in a shared vector space, allowing models to process and compare text across languages. In this chapter we describe the two main training strategies and look at example models for each. Additionally, we look at unsupervised learning of multilingual word embeddings.
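As a small illustration of one common strategy, the sketch below aligns two pre-trained monolingual embedding spaces with a linear mapping learned from a seed dictionary (the orthogonal Procrustes solution). It is a minimal example with toy data, not a specific model from the lecture; the array names and dimensions are placeholders.

```python
import numpy as np

def learn_orthogonal_mapping(X_src, Y_tgt):
    """Learn an orthogonal W that maps source vectors onto target vectors,
    solving min_W ||X W - Y||_F subject to W^T W = I (Procrustes)."""
    # Closed-form solution via SVD of the cross-covariance matrix X^T Y
    U, _, Vt = np.linalg.svd(X_src.T @ Y_tgt)
    return U @ Vt

# Toy seed dictionary: 5 translation pairs of 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))   # source-language vectors (e.g. German)
Y = rng.normal(size=(5, 4))   # target-language vectors (e.g. English)

W = learn_orthogonal_mapping(X, Y)
mapped = X @ W                # source words projected into the target space
```

After mapping, translations can be retrieved by nearest-neighbour search in the shared space; in practice the dictionary contains thousands of pairs and the embeddings are length-normalized first.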

Lecture Slides