Chapter 13.08: Joint Entropy and Mutual Information II

Information theory also provides tools for quantifying the relationship between two random variables that go beyond (linear) correlation. In this context, we discuss joint entropy, conditional entropy, and mutual information.
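
For two discrete random variables $X$ and $Y$ with joint distribution $p(x,y)$, the standard definitions are the joint entropy $H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)$, the conditional entropy $H(Y \mid X) = H(X,Y) - H(X)$ (chain rule), and the mutual information $I(X;Y) = H(X) + H(Y) - H(X,Y)$. A minimal sketch of these quantities in Python follows; the joint probability table `p_xy` is a made-up example, and any valid joint pmf could be substituted.

```python
import numpy as np

# Hypothetical joint pmf p(x, y) for X (rows) and Y (columns);
# the values are an illustrative example, not from the lecture.
p_xy = np.array([[0.25, 0.05],
                 [0.10, 0.60]])

# Marginal distributions of X and Y.
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

def entropy(p):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy.ravel())   # joint entropy H(X, Y)
H_x = entropy(p_x)             # marginal entropy H(X)
H_y = entropy(p_y)             # marginal entropy H(Y)
H_y_given_x = H_xy - H_x       # conditional entropy H(Y|X) via the chain rule
I_xy = H_x + H_y - H_xy        # mutual information I(X; Y)

print(f"H(X,Y)  = {H_xy:.3f} bits")
print(f"H(Y|X)  = {H_y_given_x:.3f} bits")
print(f"I(X;Y)  = {I_xy:.3f} bits")
```

Note that $I(X;Y) \ge 0$, with equality exactly when $X$ and $Y$ are independent; unlike correlation, it also captures nonlinear dependence.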

Lecture video

Lecture slides