410 BELL SYSTEM TECHNICAL JOURNAL 



ures the amount received less the part of this which is due to noise. The 

 third is the sum of the two amounts less the joint entropy and therefore in a 

 sense is the number of bits per second common to the two. Thus all three 

 expressions have a certain intuitive significance. 



The capacity C of a noisy channel should be the maximum possible rate 

 of transmission, i.e., the rate when the source is properly matched to the 

 channel. We therefore define the channel capacity by 



C = \max \left( H(x) - H_y(x) \right)



where the maximum is with respect to all possible information sources used 

as input to the channel. If the channel is noiseless, H_y(x) = 0. The definition is then equivalent to that already given for a noiseless channel since the

 maximum entropy for the channel is its capacity. 
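This definition can be made concrete with a small numerical sketch (not part of the original paper; the function names and the flip probability are illustrative). The code below maximizes H(x) - H_y(x) over all input distributions for a binary symmetric channel, computing the rate via the equivalent form H(y) - H_x(y), which is easier to evaluate.

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(q, f):
    """Rate H(x) - H_y(x), computed via the equivalent form H(y) - H_x(y),
    for an input source with P(x = 1) = q over a binary symmetric channel
    that flips each symbol independently with probability f."""
    p_y1 = q * (1 - f) + (1 - q) * f   # P(y = 1) at the channel output
    return h2(p_y1) - h2(f)

def capacity_bsc(f, steps=1000):
    """C = max (H(x) - H_y(x)) over all input sources, here by grid search."""
    return max(mutual_information(k / steps, f) for k in range(steps + 1))

# The maximum is attained by the symmetric source q = 1/2, giving C = 1 - h2(f).
print(capacity_bsc(0.11))
```

For a noiseless channel (f = 0) the search returns 1 bit per symbol, the maximum entropy of the input, in agreement with the remark that the definition then reduces to the noiseless case.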



13. The Fundamental Theorem for a Discrete Channel with Noise



It may seem surprising that we should define a definite capacity C for 

 a noisy channel since we can never send certain information in such a case. 

 It is clear, however, that by sending the information in a redundant form the 

 probability of errors can be reduced. For example, by repeating the 

 message many times and by a statistical study of the different received 

 versions of the message the probability of errors could be made very small. 

 One would expect, however, that to make this probability of errors approach 

 zero, the redundancy of the encoding must increase indefinitely, and the rate 

 of transmission therefore approach zero. This is by no means true. If it 

 were, there would not be a very well defined capacity, but only a capacity 

 for a given frequency of errors, or a given equivocation; the capacity going 

 down as the error requirements are made more stringent. Actually the 

 capacity C defined above has a very definite significance. It is possible 

to send information at the rate C through the channel with as small a frequency of errors or equivocation as desired by proper encoding. This statement is not true for any rate greater than C. If an attempt is made to transmit at a higher rate than C, say C + R_1, then there will necessarily be an equivocation equal to or greater than the excess R_1. Nature takes

 payment by requiring just that much uncertainty, so that we are not 

 actually getting any more than C through correctly. 
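The repetition scheme mentioned above can be checked with a short calculation (again not from the paper; the function names and flip probability are illustrative). Repeating each bit n times over a binary symmetric channel and decoding by majority vote fails only when more than half the repetitions are flipped:

```python
from math import comb

def repetition_error(f, n):
    """Probability that a majority vote over n independent repetitions
    decodes the wrong bit (n odd): at least (n+1)/2 flips must occur."""
    return sum(comb(n, k) * f**k * (1 - f)**(n - k)
               for k in range((n + 1) // 2, n + 1))

f = 0.11  # illustrative flip probability
for n in (1, 3, 9, 27):
    print(n, 1 / n, repetition_error(f, n))  # repetitions, rate, error prob.
```

The error probability does fall toward zero, but the rate 1/n falls with it; this is exactly the naive trade-off that the theorem shows is unnecessary for rates below C.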



The situation is indicated in Fig. 9. The rate of information into the 

 channel is plotted horizontally and the equivocation vertically. Any point 

 above the heavy line in the shaded region can be attained and those below 

 cannot. The points on the line cannot in general be attained, but there will 

usually be two points on the line that can. 
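The heavy line bounding the attainable region can be written as a one-line function (a sketch, not from the paper): below C the equivocation can be driven to zero, and above C it is at least the excess rate.

```python
def min_equivocation(rate_in, capacity):
    """Boundary (heavy line) of the attainable region in Fig. 9: the smallest
    equivocation attainable for a given input rate, per the statement above."""
    return max(0.0, rate_in - capacity)

print(min_equivocation(0.5, 1.0))  # below capacity: 0.0
print(min_equivocation(1.5, 1.0))  # above capacity: the excess, 0.5
```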



