We introduce entropy, a central concept in information theory that quantifies the expected information content of a discrete random variable.
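As a minimal sketch of this definition, the snippet below computes the Shannon entropy \(H(X) = -\sum_i p_i \log_2 p_i\) of a discrete distribution given as a probability vector; the function name `entropy` and the base-2 logarithm (entropy in bits) are illustrative choices, not notation from the text.

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution p, in bits.

    H(X) = -sum_i p_i * log2(p_i); terms with p_i == 0
    contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin is maximally uncertain among two outcomes: 1 bit.
print(entropy([0.5, 0.5]))   # → 1.0
# A deterministic outcome carries no information: 0 bits.
print(entropy([1.0]))        # → 0.0
```

A uniform distribution over \(n\) outcomes attains the maximum entropy \(\log_2 n\), e.g. `entropy([0.25] * 4)` gives 2 bits.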