


The relation of $H$ to volume can be stated as follows: Under the same assumptions consider the $n$ dimensional space corresponding to $p(x_1, \cdots, x_n)$. Let $V_n(q)$ be the smallest volume in this space which includes in its interior a total probability $q$. Then



$$\lim_{n \to \infty} \frac{\log V_n(q)}{n} = H'$$



provided $q$ does not equal 0 or 1.



These results show that for large $n$ there is a rather well-defined volume (at least in the logarithmic sense) of high probability, and that within this volume the probability density is relatively uniform (again in the logarithmic sense).
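This limit can be checked numerically. The sketch below is not from the paper: it anticipates the white noise case treated in the next paragraph, where spherical symmetry makes the smallest region of total probability $q$ a centered ball, and it verifies that $\frac{1}{n} \log V_n(q)$ approaches $\log \sqrt{2\pi e N}$, the entropy per degree of freedom of the noise. The power $N = 1$ and the probability $q = 0.5$ are assumed values; any $0 < q < 1$ gives the same limit.

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import chi2

N = 1.0  # noise power per coordinate (assumed)
q = 0.5  # enclosed probability (assumed; the limit is the same for any 0 < q < 1)

for n in (10, 100, 1000, 10000):
    # Radius r with P(sum x_i^2 <= r^2) = q; sum x_i^2 / N is chi-square with n d.f.
    r_sq = N * chi2.ppf(q, df=n)
    # Log-volume of the n-ball of radius r: (n/2) log(pi r^2) - log Gamma(n/2 + 1).
    log_vol = 0.5 * n * np.log(np.pi * r_sq) - gammaln(n / 2 + 1)
    print(n, log_vol / n)

# The ratios above approach the predicted limit H' = log sqrt(2 pi e N) ~ 1.4189 nats.
print("limit", 0.5 * np.log(2 * np.pi * np.e * N))
```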



In the white noise case the distribution function is given by

$$p(x_1, \cdots, x_n) = \frac{1}{(2\pi N)^{n/2}} \exp\left( -\frac{\sum x_i^2}{2N} \right).$$

Since this depends only on $\sum x_i^2$ the surfaces of equal probability density are spheres and the entire distribution has spherical symmetry. The region of high probability is a sphere of radius $\sqrt{nN}$. As $n \to \infty$ the probability of being outside a sphere of radius $\sqrt{n(N + \epsilon)}$ approaches zero and $\frac{1}{n}$ times the logarithm of the volume of the sphere approaches $\log \sqrt{2\pi e N}$.
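A brief Monte Carlo sketch (again my construction, not the paper's) makes the concentration claim concrete: sampling $n$ independent Gaussian coordinates of power $N$ and estimating the probability of falling outside the sphere of radius $\sqrt{n(N + \epsilon)}$ shows that probability shrinking toward zero as $n$ grows. The values of $N$, $\epsilon$, and the trial count are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, eps, trials = 1.0, 0.1, 2000  # assumed power, margin, and sample count

for n in (10, 100, 1000, 5000):
    # Each row is one sample of (x_1, ..., x_n) with E[x_i^2] = N.
    x = rng.normal(0.0, np.sqrt(N), size=(trials, n))
    outside = np.mean(np.sum(x**2, axis=1) > n * (N + eps))
    print(n, outside)  # fraction outside radius sqrt(n(N + eps)); tends to 0
```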



In the continuous case it is convenient to work not with the entropy $H$ of an ensemble but with a derived quantity which we will call the entropy power. This is defined as the power in a white noise limited to the same band as the original ensemble and having the same entropy. In other words if $H'$ is the entropy of an ensemble its entropy power is



$$N_1 = \frac{1}{2\pi e} \exp 2H'.$$



In the geometrical picture this amounts to measuring the high probability volume by the squared radius of a sphere having the same volume. Since white noise has the maximum entropy for a given power, the entropy power of any noise is less than or equal to its actual power.
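As a quick consistency check (a step not spelled out in the text), substituting the white-noise entropy found above, $H' = \log \sqrt{2\pi e N}$, into the definition recovers the actual power, as the definition requires:

$$N_1 = \frac{1}{2\pi e} \exp\left( 2 \log \sqrt{2\pi e N} \right) = \frac{1}{2\pi e} \cdot 2\pi e N = N.$$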



21. Entropy Loss in Linear Filters 



Theorem 14: If an ensemble having an entropy $H_1$ per degree of freedom in band $W$ is passed through a filter with characteristic $Y(f)$ the output ensemble has an entropy



$$H_2 = H_1 + \frac{1}{W} \int_W \log |Y(f)|^2 \, df.$$
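To make the theorem concrete, the sketch below evaluates the correction term numerically for one illustrative gain function of my choosing, $Y(f) = 1 - f$ on the band $0 \le f \le W$ with $W = 1$; neither the gain nor the band is taken from this passage. For this gain the integral has the closed form $\int_0^1 \log (1-f)^2 \, df = -2$, so the filter lowers the entropy by 2 nats.

```python
import numpy as np
from scipy.integrate import quad

W = 1.0  # assumed bandwidth

def log_gain_sq(f):
    # log |Y(f)|^2 for the illustrative gain Y(f) = 1 - f (natural log).
    return np.log((1.0 - f) ** 2)

# quad samples only interior points, so the integrable singularity at f = W is handled.
integral, _ = quad(log_gain_sq, 0.0, W)
delta_H = integral / W
print(delta_H)  # approximately -2.0 nats, so H2 = H1 - 2 for this filter
```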



