642 BELL SYSTEM TECHNICAL JOURNAL 



If the noise is itself white, N = N₁ and the result reduces to the formula proved previously:



$$C = W \log \frac{P + N}{N}$$
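The white-noise capacity formula C = W log((P + N)/N) is easy to evaluate numerically. A minimal sketch in Python; the bandwidth and power figures below are illustrative, not taken from the paper:

```python
import math

def capacity(W, P, N):
    """Capacity C = W * log2((P + N) / N) of a channel of band W perturbed
    by white noise of power N, with average signal power P.  Using a
    base-2 logarithm gives the answer in bits per second."""
    return W * math.log2((P + N) / N)

# Illustrative figures: a 3000 Hz band with a signal-to-noise ratio P/N of 10.
print(capacity(3000.0, 10.0, 1.0))   # 3000 * log2(11), roughly 1.04e4 bits/s
```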



If the noise is Gaussian but with a spectrum which is not necessarily flat, N₁ is the geometric mean of the noise power over the various frequencies in the band W. Thus



$$N_1 = \exp \frac{1}{W} \int_W \log N(f)\, df$$



where N(f) is the noise power at frequency f.
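Since the exponent weights all frequencies in W equally, N₁ is the geometric mean of N(f) over the band, and by the AM–GM inequality it never exceeds the average noise power N. A small sketch with a made-up sample spectrum (the ripple shape is purely illustrative):

```python
import math

def noise_entropy_power(samples):
    """N1 = exp((1/W) * integral over W of log N(f) df), approximated from
    equally spaced samples of the spectrum: the geometric mean of N(f)."""
    return math.exp(sum(math.log(s) for s in samples) / len(samples))

# Hypothetical rippled spectrum oscillating about a mean power of 1.
spectrum = [1.0 + 0.5 * math.sin(2.0 * math.pi * k / 64.0) for k in range(64)]
N1 = noise_entropy_power(spectrum)
N = sum(spectrum) / len(spectrum)      # average noise power over the band
print(N1, N)   # the geometric mean N1 falls below the arithmetic mean N
```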



Theorem 19: If we set the capacity for a given transmitter power P 

 equal to 



$$C = W \log \frac{P + N - \eta}{N_1}$$



then η is monotonic decreasing as P increases and approaches 0 as a limit.

Suppose that for a given power P₁ the channel capacity is



$$W \log \frac{P_1 + N - \eta_1}{N_1}$$



This means that the best signal distribution, say p(x), when added to the noise distribution q(x), gives a received distribution r(y) whose entropy power is (P₁ + N − η₁). Let us increase the power to P₁ + ΔP by adding a white noise of power ΔP to the signal. The entropy of the received signal is now at least



$$H(y) = W \log 2\pi e (P_1 + N - \eta_1 + \Delta P)$$



by application of the theorem on the minimum entropy power of a sum. Hence, since we can attain the H indicated, the entropy of the maximizing distribution must be at least as great and η must be monotonic decreasing. To show that η → 0 as P → ∞, consider a signal which is a white noise with a large P. Whatever the perturbing noise, the received signal will be approximately a white noise, if P is sufficiently large, in the sense of having an entropy power approaching P + N.
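The closing claim, that a strong white-noise signal drives the received entropy power toward P + N, can be illustrated numerically. In the sketch below the perturbing noise is taken to be uniformly distributed (a deliberately non-Gaussian choice, assumed only for illustration); the density of the sum of a Gaussian and a uniform variable is a difference of normal CDFs, so its differential entropy, and hence its entropy power, can be computed by direct numerical integration:

```python
import math

def entropy_power_gauss_plus_uniform(P, N, steps=20000):
    """Entropy power of y = x + n, where x is Gaussian with power P (the
    'white noise' signal) and n is uniform with power N (a non-Gaussian
    perturbing noise).  The density of the sum is a difference of normal
    CDFs, integrated by the midpoint rule to get the entropy in nats."""
    sigma = math.sqrt(P)
    a = math.sqrt(3.0 * N)            # Uniform(-a, a) has variance a^2/3 = N
    Phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    L = 8.0 * sigma + a               # integration range covers the support
    dy = 2.0 * L / steps
    h = 0.0                           # differential entropy of y, in nats
    for i in range(steps):
        y = -L + (i + 0.5) * dy
        p = (Phi((y + a) / sigma) - Phi((y - a) / sigma)) / (2.0 * a)
        if p > 0.0:
            h -= p * math.log(p) * dy
    return math.exp(2.0 * h) / (2.0 * math.pi * math.e)

N = 1.0
for P in (1.0, 10.0, 100.0):
    ratio = entropy_power_gauss_plus_uniform(P, N) / (P + N)
    print(f"P/N = {P:6.0f}: entropy power / (P + N) = {ratio:.4f}")
```

As P grows the printed ratio climbs toward 1, in agreement with the argument: the Gaussian signal dominates the sum, so the received signal is approximately white with entropy power approaching P + N.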



25. The Channel Capacity with a Peak Power Limitation 



In some applications the transmitter is limited not by the average power 

 output but by the peak instantaneous power. The problem of calculating 

 the channel capacity is then that of maximizing (by variation of the ensemble 

 of transmitted symbols) 



$$H(y) - H(n)$$



