632 BELL SYSTEM TECHNICAL JOURNAL 







In this case the Jacobian is simply the determinant $|a_{ij}|^{-1}$ and
$$H(y) = H(x) + \log |a_{ij}|.$$
In the case of a rotation of coordinates (or any measure preserving transformation) $J = 1$ and $H(y) = H(x)$.
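As an illustrative check (not from the paper; it assumes NumPy and natural logarithms), the rule $H(y) = H(x) + \log|a_{ij}|$ can be verified numerically for a Gaussian vector, whose differential entropy has the closed form $\tfrac{1}{2}\log\!\big((2\pi e)^n \det \Sigma\big)$:

```python
# Sketch: entropy change of a Gaussian vector under a linear transform y = A x.
# Assumptions (not in the paper): NumPy, logs in nats, x standard normal.
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.normal(size=(n, n))    # linear transform (invertible with probability 1)
Sigma_x = np.eye(n)            # covariance of x

def gaussian_entropy(Sigma):
    """Differential entropy of N(0, Sigma) in nats."""
    k = Sigma.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(Sigma))

H_x = gaussian_entropy(Sigma_x)
Sigma_y = A @ Sigma_x @ A.T    # covariance of y = A x
H_y = gaussian_entropy(Sigma_y)

# The entropy increases by exactly the log of the Jacobian determinant.
assert np.isclose(H_y - H_x, np.log(abs(np.linalg.det(A))))
```

Since $\det \Sigma_y = (\det A)^2 \det \Sigma_x$, the difference $H(y) - H(x)$ reduces to $\log|\det A|$, in agreement with the general formula.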



21. Entropy of an Ensemble of Functions 



Consider an ergodic ensemble of functions limited to a certain band of 

 width W cycles per second. Let 



p(Xi '" Xn) 



be the density distribution function for amplitudes xi - • - XnSit n successive 

 sample points. We define the entropy of the ensemble per degree of free- 

 dom by 



$$H' = -\lim_{n \to \infty} \frac{1}{n} \int \cdots \int p(x_1, \ldots, x_n) \log p(x_1, \ldots, x_n) \, dx_1 \cdots dx_n.$$



We may also define an entropy $H$ per second by dividing, not by $n$, but by
the time $T$ in seconds for $n$ samples. Since $n = 2TW$, $H = 2WH'$.
With white thermal noise $p$ is Gaussian and we have



$$H' = \log \sqrt{2\pi e N},$$
$$H = W \log 2\pi e N.$$



For a given average power N, white noise has the maximum possible 

 entropy. This follows from the maximizing properties of the Gaussian 

 distribution noted above. 
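A minimal numerical sketch (assuming NumPy and natural logarithms, which are not specified in the paper) compares a Monte Carlo estimate of the entropy per degree of freedom of white Gaussian noise against the closed form $H' = \log\sqrt{2\pi e N}$:

```python
# Sketch: estimate H' = -E[log p(x)] per sample for Gaussian noise of power N
# and compare with log sqrt(2*pi*e*N). Assumptions: NumPy, logs in nats.
import numpy as np

rng = np.random.default_rng(1)
N = 2.0                                              # average noise power (variance)
samples = rng.normal(0.0, np.sqrt(N), size=1_000_000)

# log-density of each sample under the N(0, N) distribution
log_p = -0.5 * np.log(2 * np.pi * N) - samples**2 / (2 * N)
H_estimate = -log_p.mean()                           # Monte Carlo estimate of H'

H_exact = np.log(np.sqrt(2 * np.pi * np.e * N))
assert abs(H_estimate - H_exact) < 0.01
```

The agreement follows because $E[x^2] = N$, so $-E[\log p] = \tfrac{1}{2}\log(2\pi N) + \tfrac{1}{2} = \log\sqrt{2\pi e N}$.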



The entropy for a continuous stochastic process has many properties 

 analogous to that for discrete processes. In the discrete case the entropy 

 was related to the logarithm of the probability of long sequences, and to the 

 number of reasonably probable sequences of long length. In the continuous 

 case it is related in a similar fashion to the logarithm of the probability 

 density for a long series of samples, and the volume of reasonably high prob- 

 ability in the function space. 



More precisely, if we assume $p(x_1, \ldots, x_n)$ continuous in all the $x_i$ for all $n$,
then for sufficiently large $n$
$$\left| \frac{\log p^{-1}}{n} - H' \right| < \epsilon$$



for all choices of $(x_1, \ldots, x_n)$ apart from a set whose total probability is
less than $\delta$, with $\delta$ and $\epsilon$ arbitrarily small. This follows from the ergodic

 property if we divide the space into a large number of small cells. 
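This concentration can be illustrated numerically (a sketch, not from the paper; it assumes NumPy, natural logarithms, and uses i.i.d. Gaussian samples as the simplest ergodic ensemble): for large $n$, the quantity $\frac{1}{n}\log p^{-1}$ clusters tightly around $H'$ over almost all sample sequences.

```python
# Sketch: (1/n) log(1/p) concentrates around H' as n grows, for i.i.d.
# Gaussian samples of power N. Assumptions: NumPy, logs in nats.
import numpy as np

rng = np.random.default_rng(2)
N = 1.0
H_prime = np.log(np.sqrt(2 * np.pi * np.e * N))  # entropy per degree of freedom

def log_inv_p_per_n(x, N):
    """(1/n) log(1/p(x_1 ... x_n)) for n i.i.d. N(0, N) samples."""
    n = len(x)
    log_p = -0.5 * n * np.log(2 * np.pi * N) - np.sum(x**2) / (2 * N)
    return -log_p / n

# Largest deviation from H' over 200 sample sequences, for small and large n.
spreads = {}
for n in (10, 10_000):
    devs = [abs(log_inv_p_per_n(rng.normal(0, np.sqrt(N), n), N) - H_prime)
            for _ in range(200)]
    spreads[n] = max(devs)

# For large n almost every sequence has (1/n) log(1/p) within epsilon of H'.
assert spreads[10_000] < spreads[10]
```

The deviation shrinks like $1/\sqrt{n}$ here, since for this ensemble $\frac{1}{n}\log p^{-1}$ is an average of $n$ independent terms with mean $H'$.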



