


Ph " ' i Pn ' If :^ is a chance variable we will write H(x) for its entropy; 

 thus X is not an argument of a function but a label for a number, to differen- 

 tiate it from H(y) say, the entropy of the chance variable y. 



The entropy in the case of two possibilities with probabilities p and q = 1 - p, namely



H = -(p \log p + q \log q)



is plotted in Fig. 7 as a function of p.
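(An illustrative aside, not part of the original paper: a minimal Python sketch of this two-possibility entropy, taking logarithms to base 2 so that H is measured in bits. The function name binary_entropy is ours, and the endpoint convention 0 log 0 = 0 follows the usual limiting argument.)

```python
import math

def binary_entropy(p: float) -> float:
    """H = -(p log p + q log q) with q = 1 - p, in bits (base-2 logs).

    At the endpoints we use the convention 0 log 0 = 0, i.e. the limit
    p log p -> 0 as p -> 0, so H(0) = H(1) = 0.
    """
    if p <= 0.0 or p >= 1.0:
        return 0.0
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

# H is symmetric about p = 1/2, where it reaches its maximum of 1 bit,
# reproducing the shape of the curve in Fig. 7.
print(binary_entropy(0.5))   # 1.0
print(binary_entropy(0.11))  # ~0.4999, about half a bit
```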



The quantity H has a number of interesting properties which further substantiate it as a reasonable measure of choice or information.



Fig. 7 — Entropy in the case of two possibilities with probabilities p and (1 - p).



1. H = 0 if and only if all the p_i but one are zero, this one having the value unity. Thus only when we are certain of the outcome does H vanish. Otherwise H is positive.



2. For a given n, H is a maximum and equal to log n when all the p_i are equal (i.e., p_i = 1/n). This is also intuitively the most uncertain situation.



3. Suppose there are two events, x and y, in question with m possibilities 

for the first and n for the second. Let p(i, j) be the probability of the joint

 occurrence of i for the first and j for the second. The entropy of the joint 

 event is 



H(x, y) = -\sum_{i,j} p(i, j) \log p(i, j)
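(Again as an illustrative aside not in the paper: a short Python sketch checking properties 1-3 numerically. The helper entropy_bits and the small joint table p(i, j) are invented for the example; logarithms are taken to base 2.)

```python
import math

def entropy_bits(probs):
    """H = -sum p log2 p over the distribution, with 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Property 1: H = 0 exactly when one outcome is certain.
print(entropy_bits([1.0, 0.0, 0.0]))   # 0.0
# Property 2: H is maximal, log n, at the uniform distribution
# (n = 4 gives log2 4 = 2 bits).
print(entropy_bits([0.25] * 4))        # 2.0
# Property 3: the joint entropy H(x, y) treats the joint table
# p(i, j) as one distribution over the mn pairs (i, j).
p_joint = {(0, 0): 0.4, (0, 1): 0.1,
           (1, 0): 0.2, (1, 1): 0.3}
print(entropy_bits(p_joint.values()))  # ~1.846 bits
```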



