MATHEMATICAL THEORY OF COMMUNICATION 630 



transmission is 



R = H(y) - H(n)



i.e., the entropy of the received signal less the entropy of the noise. The 

 channel capacity is 



C = Max_{P(x)} [H(y) - H(n)].



We have, since y = x + n:



H(x, y) = H(x, n).

Expanding the left side and using the fact that x and n are independent,



H(y) + H_y(x) = H(x) + H(n).

 Hence 



R = H(x) - H_y(x) = H(y) - H(n).
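For a discrete analogue of this derivation, the identity H(x, y) = H(x, n) and hence R = H(x) - H_y(x) = H(y) - H(n) can be checked numerically. A minimal sketch in Python, with illustrative distributions for x and n that are not from the paper:

```python
from math import log2

# Assumed example distributions (illustrative, not from the paper):
px = {0: 0.5, 1: 0.5}          # transmitted signal x
pn = {0: 0.8, 1: 0.2}          # independent additive noise n

def H(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Joint distribution of (x, y) with y = x + n.
pxy = {}
for x, pxv in px.items():
    for n, pnv in pn.items():
        pxy[(x, x + n)] = pxy.get((x, x + n), 0.0) + pxv * pnv

# Marginal distribution of the received signal y.
py = {}
for (x, y), p in pxy.items():
    py[y] = py.get(y, 0.0) + p

Hxy = H(pxy)            # joint entropy H(x, y)
Hy_x = Hxy - H(py)      # conditional entropy H_y(x) = H(x, y) - H(y)

# Since (x, n) -> (x, y) is one-to-one, H(x, y) = H(x, n) = H(x) + H(n).
assert abs(Hxy - (H(px) + H(pn))) < 1e-9
# Hence R = H(x) - H_y(x) = H(y) - H(n).
assert abs((H(px) - Hy_x) - (H(py) - H(pn))) < 1e-9
```

The first assertion uses the fact that the map (x, n) → (x, y) is invertible, which is the step behind H(x, y) = H(x, n) in the text.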



Since H(n) is independent of P(x), maximizing R requires maximizing

H(y), the entropy of the received signal. If there are certain constraints on

 the ensemble of transmitted signals, the entropy of the received signal must 

 be maximized subject to these constraints. 



24. Channel Capacity with an Average Power Limitation 



A simple application of Theorem 16 is the case where the noise is a white
thermal noise and the transmitted signals are limited to a certain average
power P. Then the received signals have an average power P + N where
N is the average noise power. The maximum entropy for the received
signals occurs when they also form a white noise ensemble, since this is the
greatest possible entropy for a power P + N and can be obtained by a
suitable choice of the ensemble of transmitted signals, namely if they form a
white noise ensemble of power P. The entropy (per second) of the
received ensemble is then



H(y) = W log 2πe(P + N),



and the noise entropy is 



H(n) = W log 2πeN.

 The channel capacity is 



C = H(y) - H(n) = W log ((P + N)/N).
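This capacity formula is easy to evaluate numerically. A minimal sketch in Python, using a base-2 logarithm so that C comes out in bits per second; the bandwidth and power values are illustrative assumptions, not from the paper:

```python
from math import log2

def capacity(W, P, N):
    """Capacity in bits per second of a channel of band W (Hz) with
    average signal power P and white thermal noise power N:
    C = W * log2((P + N) / N)."""
    return W * log2((P + N) / N)

# Hypothetical numbers: a 3000 Hz channel with signal-to-noise
# ratio P/N = 1000 (30 dB).
C = capacity(W=3000, P=1000.0, N=1.0)
```

Doubling the signal-to-noise ratio adds only W bits per second, reflecting the logarithmic dependence on P/N.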



Summarizing we have the following: 



Theorem 17: The capacity of a channel of band W perturbed by white 



