MATHEMATICAL THEORY OF COMMUNICATION 641 



mining the channel capacity C can be solved explicitly. However, upper
and lower bounds can be set for $C$ in terms of the average noise power $N$
and the noise entropy power $N_1$. These bounds are sufficiently close together
in most practical cases to furnish a satisfactory solution to the problem.



Theorem 18: The capacity of a channel of band $W$ perturbed by an arbitrary
noise is bounded by the inequalities



$$W \log \frac{P + N_1}{N_1} \le C \le W \log \frac{P + N}{N_1}$$



where 



$P$ = average transmitter power

$N$ = average noise power

$N_1$ = entropy power of the noise.
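The bounds of Theorem 18 are simple to evaluate numerically. The following sketch (the particular values of $W$, $P$, $N$, and $N_1$ are arbitrary illustrative assumptions, not from the text) computes both, using base-2 logarithms so the result is in bits per second:

```python
import math

def capacity_bounds(W, P, N, N1):
    """Bounds of Theorem 18 on the capacity C of a channel of band W.

    W  -- bandwidth
    P  -- average transmitter power
    N  -- average noise power
    N1 -- entropy power of the noise (N1 <= N)
    Returns (lower, upper): W*log2((P+N1)/N1) and W*log2((P+N)/N1).
    """
    lower = W * math.log2((P + N1) / N1)
    upper = W * math.log2((P + N) / N1)
    return lower, upper

# Illustrative values: W = 1, P = 10, N = 2, N1 = 1.5
lo, hi = capacity_bounds(1.0, 10.0, 2.0, 1.5)
```

Since $N_1 \le N$, the lower bound never exceeds the upper; for white noise $N_1 = N$ and the two coincide.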



Here again the average power of the perturbed signals will be $P + N$.
The maximum entropy for this power would occur if the received signal
were white noise and would be $W \log 2\pi e(P + N)$. It may not be possible
to achieve this; i.e., there may not be any ensemble of transmitted signals
which, added to the perturbing noise, produces a white thermal noise at the
receiver, but at least this sets an upper bound to $H(y)$. We have, therefore,



$$C = \max H(y) - H(n) \le W \log 2\pi e(P + N) - W \log 2\pi e N_1 .$$



This is the upper limit given in the theorem. The lower limit can be obtained
by considering the rate if we make the transmitted signal a white
noise of power $P$. In this case the entropy power of the received signal
must be at least as great as that of a white noise of power $P + N_1$, since we
have shown in a previous theorem that the entropy power of the sum of two
ensembles is greater than or equal to the sum of the individual entropy
powers. Hence



$$\max H(y) \ge W \log 2\pi e(P + N_1)$$



and 



$$C \ge W \log 2\pi e(P + N_1) - W \log 2\pi e N_1 = W \log \frac{P + N_1}{N_1} .$$



As $P$ increases, the upper and lower bounds approach each other, so we
have as an asymptotic rate



$$W \log \frac{P + N}{N_1} .$$
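The convergence of the two bounds can be checked numerically: their gap is $W \log \frac{P+N}{P+N_1}$, which vanishes as $P$ grows. A minimal sketch (again with arbitrary illustrative values for $W$, $N$, $N_1$):

```python
import math

# Gap between the bounds of Theorem 18: W*log2((P+N)/(P+N1)).
# As transmitter power P increases, the gap shrinks toward zero,
# so both bounds tend to the asymptotic rate W*log2((P+N)/N1).
W, N, N1 = 1.0, 2.0, 1.5  # illustrative values
gaps = []
for P in (1.0, 10.0, 100.0, 1000.0):
    lower = W * math.log2((P + N1) / N1)
    upper = W * math.log2((P + N) / N1)
    gaps.append(upper - lower)
# gaps decreases monotonically as P increases
```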



