Chapter 13.04: Kullback-Leibler Divergence

The Kullback-Leibler (KL) divergence is a fundamental quantity for measuring the difference between two probability distributions. We discuss several intuitions for the KL divergence and relate it to risk minimization and likelihood ratios.
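As a concrete illustration (not part of the lecture itself), the KL divergence between two discrete distributions P and Q is KL(P || Q) = sum_x p(x) log(p(x)/q(x)). A minimal NumPy sketch, with the convention that 0 * log 0 = 0:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P || Q) for discrete distributions given as probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where p > 0, using the convention 0 * log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # → 0.0: a distribution has zero divergence from itself
print(kl_divergence(p, q))  # positive, and differs from kl_divergence(q, p)
```

Note that KL is not symmetric: kl_divergence(p, q) and kl_divergence(q, p) generally differ, which is one reason it is called a divergence rather than a distance.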

Lecture video

Lecture slides