Chapter 13.07: Joint Entropy and Mutual Information I

Information theory also provides means of quantifying the relationship between two random variables, extending the concept of (linear) correlation to general statistical dependence. In this context we discuss joint entropy, conditional entropy, and mutual information.
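As a concrete illustration, the sketch below computes these three quantities for a small, hypothetical joint distribution of two binary variables. The distribution `p_xy`, the function name `entropy`, and the use of NumPy are assumptions for this example, not part of the lecture itself; the identities used (the chain rule H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(X) + H(Y) - H(X,Y)) are the standard ones.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector p."""
    p = p[p > 0]  # drop zero-probability outcomes: 0 * log2(0) is taken as 0
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y) of two binary variables;
# rows index x, columns index y. Chosen so X and Y are dependent.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

H_xy = entropy(p_xy.ravel())   # joint entropy H(X, Y)
H_x = entropy(p_x)             # marginal entropy H(X)
H_y = entropy(p_y)             # marginal entropy H(Y)
H_y_given_x = H_xy - H_x       # conditional entropy H(Y | X), via the chain rule
I_xy = H_x + H_y - H_xy        # mutual information I(X; Y)

print(f"H(X,Y) = {H_xy:.3f} bits")   # ~1.722 bits
print(f"H(Y|X) = {H_y_given_x:.3f} bits")  # ~0.722 bits
print(f"I(X;Y) = {I_xy:.3f} bits")   # ~0.278 bits
```

Note that I(X;Y) is positive here even though it is computed without any linearity assumption: mutual information registers any statistical dependence between X and Y, which is what distinguishes it from correlation.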

Lecture video

Lecture slides