14 Information Storage and Neural Control 



send certain (i.e., probability equal one) information over such a channel. It is clear, however, that by sending the information in a redundant form, the probability of errors can be reduced. For example, by repeating the message many times and by a statistical study of the different versions of the message, the probability of errors can be made very small. One would expect, however, that to make this probability of errors approach zero, the redundancy of the encoding must increase indefinitely and the rate of transmission must therefore approach zero. This is by no means true. If it were, there would not be a well-defined capacity, but only a capacity for a given frequency of errors or a given equivocation, the capacity going down as the error requirements are made more stringent. Actually, the capacity C defined earlier has a very definite significance. It is possible to send information at the rate C through the channel, with as small a frequency of errors or equivocation as desired, by proper encoding. This statement is not true for any rate greater than C. If an attempt is made to transmit at a higher rate than C, then there will necessarily be an equivocation equal to or greater than the excess.
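The redundancy argument above can be illustrated by a small simulation: a repetition code sends each bit several times over a noisy channel, and the receiver decodes by majority vote. This is a minimal sketch, not drawn from the text; the 10 per cent flip probability, the trial count, and the function names are illustrative assumptions.

```python
import random

def transmit(bit, flip_prob, rng):
    """Send one bit through a binary symmetric channel:
    the bit is flipped with probability flip_prob."""
    return bit ^ (rng.random() < flip_prob)

def send_repeated(bit, n, flip_prob, rng):
    """Repeat the bit n times and decode by majority vote
    (the 'statistical study of the different versions')."""
    received = [transmit(bit, flip_prob, rng) for _ in range(n)]
    return 1 if sum(received) * 2 > n else 0

def error_rate(n, flip_prob=0.1, trials=20000, seed=42):
    """Estimate the post-decoding error probability for
    an n-fold repetition code by Monte Carlo simulation."""
    rng = random.Random(seed)
    errors = sum(send_repeated(0, n, flip_prob, rng) for _ in range(trials))
    return errors / trials

for n in (1, 3, 5, 9):
    print(n, error_rate(n))
```

Each increase in n drives the decoded error rate down, but at the cost of an n-fold drop in the transmission rate, which is exactly the trade-off the text says one would naively expect to be unavoidable; Shannon's theorem shows that cleverer codes than simple repetition escape it.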



To clarify the concept of equivocation, let us suppose there are two possible symbols, 0 and 1, and that we are transmitting at a rate of 1,000 symbols per second with probabilities p0 = p1 = 1/2. Thus, our source is producing information at the rate of 1,000 bits per second (Shannon refers to this as the entropy of the source). During transmission, noise introduces errors so that, on the average, one symbol in 100 is received incorrectly (a 0 as 1, or a 1 as 0). What is the rate of transmission of information? Certainly less than 1,000 bits per second since about one per cent of the received symbols are incorrect. Our first impulse might be to say the rate is 990 bits per second, merely subtracting the expected number of errors. This is not satisfactory since it fails to take into account the recipient's lack of knowledge of where the errors occur. We may carry this to an extreme case and suppose the noise so great that the received signals are entirely independent of the transmitted signals. The probability of receiving 1 is one-half whatever was transmitted, and the same is true for 0. Since about one-half of the received symbols are correct due to chance alone, we could give the system credit for transmitting


