


1. If x is limited to a certain volume v in its space, then H(x) is a maximum
and equal to log v when p(x) is constant (1/v) in the volume.
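As a one-line check, substituting the constant density p(x) = 1/v confirms the value directly:

$$H(x) = -\int_v \frac{1}{v} \log \frac{1}{v}\, dx = \log v.$$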



2. With any two variables x, y we have 



$$H(x, y) \le H(x) + H(y)$$



with equality if (and only if) x and y are independent, i.e., p(x, y) = p(x)p(y)
(apart possibly from a set of points of probability zero).
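A minimal numerical sketch of the discrete analogue of this inequality, with an arbitrarily chosen random joint distribution (the grid size and names here are illustrative, not from the paper):

```python
import numpy as np

# Check H(x, y) <= H(x) + H(y) on a random discretized joint density.
# For densities binned with cell width h, the log h terms cancel from
# both sides of the inequality, so plain grid sums suffice.
rng = np.random.default_rng(0)
p_xy = rng.random((50, 50))
p_xy /= p_xy.sum()            # joint distribution p(x, y)
p_x = p_xy.sum(axis=1)        # marginal p(x)
p_y = p_xy.sum(axis=0)        # marginal p(y)

def H(p):
    p = p[p > 0]              # treat 0 log 0 as 0
    return -np.sum(p * np.log(p))

# H(x) + H(y) - H(x, y) is never negative.
assert H(p_xy.ravel()) <= H(p_x) + H(p_y) + 1e-12
```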



3. Consider a generalized averaging operation of the following type: 



$$p'(y) = \int a(x, y)\, p(x)\, dx$$

with

$$\int a(x, y)\, dx = \int a(x, y)\, dy = 1, \qquad a(x, y) \ge 0.$$



Then the entropy of the averaged distribution p'(y) is equal to or greater
than that of the original distribution p(x).
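A small discrete sketch of this smoothing property (my own illustration; the circulant kernel stands in for a(x, y), whose rows and columns must each sum to 1):

```python
import numpy as np

# Discrete analogue of property 3: pushing p through a doubly
# stochastic kernel (rows and columns each sum to 1) cannot lower
# the entropy.  The kernel here is a circulant 3-point smoother.
n = 100
p = np.zeros(n)
p[n // 3], p[2 * n // 3] = 0.7, 0.3       # a deliberately spiky p(x)

a = np.zeros((n, n))
for i in range(n):
    for j in (-1, 0, 1):
        a[i, (i + j) % n] = 1.0 / 3.0     # each row and column sums to 1

p_avg = a @ p                             # averaged distribution p'(y)

def H(q):
    q = q[q > 0]
    return -np.sum(q * np.log(q))

assert H(p_avg) >= H(p) - 1e-12
```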



4. We have 



$$H(x, y) = H(x) + H_x(y) = H(y) + H_y(x)$$

 and 



$$H_x(y) \le H(y).$$
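Here H_x(y) is the conditional entropy, defined earlier in the paper for the continuous case by

$$H_x(y) = -\iint p(x, y) \log \frac{p(x, y)}{p(x)}\, dx\, dy.$$

Writing log p(x, y) = log p(x) + log [p(x, y)/p(x)] inside the integral for H(x, y) gives the first equality at once, and combining it with property 2 gives H_x(y) ≤ H(y).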



5. Let p(x) be a one-dimensional distribution. The form of p(x) giving a
maximum entropy subject to the condition that the standard deviation
of x be fixed at σ is gaussian. To show this we must maximize



$$H(x) = -\int p(x) \log p(x)\, dx$$



with 



$$\sigma^2 = \int p(x)\, x^2\, dx \qquad \text{and} \qquad 1 = \int p(x)\, dx$$



as constraints. This requires, by the calculus of variations, maximizing 



$$\int \bigl[ -p(x) \log p(x) + \lambda p(x) x^2 + \mu p(x) \bigr]\, dx.$$



The condition for this is 



$$-1 - \log p(x) + \lambda x^2 + \mu = 0$$

 and consequently (adjusting the constants to satisfy the constraints) 



$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^2/2\sigma^2}.$$
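For completeness, the adjustment of the constants spelled out (a short supplementary computation): the stationarity condition gives p(x) = A e^{λx²} with A = e^{μ-1}, which is normalizable only when λ < 0, and the two constraints become

$$1 = A \int_{-\infty}^{\infty} e^{\lambda x^2}\, dx = A \sqrt{\frac{\pi}{-\lambda}}, \qquad \sigma^2 = A \int_{-\infty}^{\infty} x^2 e^{\lambda x^2}\, dx = \frac{1}{-2\lambda},$$

so that λ = -1/(2σ²) and A = 1/(√(2π) σ), which is the gaussian just stated.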



