MATHEMATICAL THEORY OF COMMUNICATION 407 



All of these entropies can be measured on a per-second or a per-symbol 

 basis. 



12. Equivocation and Channel Capacity 



If the channel is noisy it is not in general possible to reconstruct the orig- 

 inal message or the transmitted signal with certainty by any operation on the 

 received signal E. There are, however, ways of transmitting the information 

 which are optimal in combating noise. This is the problem which we now 

 consider. 



Suppose there are two possible symbols 0 and 1, and we are transmitting 
 at a rate of 1000 symbols per second with probabilities p0 = p1 = 1/2. Thus 
 our source is producing information at the rate of 1000 bits per second. Dur- 
 ing transmission the noise introduces errors so that, on the average, 1 in 100 
 is received incorrectly (a 0 as 1, or 1 as 0). What is the rate of transmission 

 of information? Certainly less than 1000 bits per second since about 1% 

 of the received symbols are incorrect. Our first impulse might be to say the 

 rate is 990 bits per second, merely subtracting the expected number of errors. 

 This is not satisfactory since it fails to take into account the recipient's 

 lack of knowledge of where the errors occur. We may carry it to an extreme 

 case and suppose the noise so great that the received symbols are entirely 

 independent of the transmitted symbols. The probability of receiving 1 is 

 1/2 whatever was transmitted and similarly for 0. Then about half of the 
 received symbols are correct due to chance alone, and we would be giving 

 the system credit for transmitting 500 bits per second while actually no 

 information is being transmitted at all. Equally "good" transmission 

 would be obtained by dispensing with the channel entirely and flipping a 

 coin at the receiving point. 
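The extreme case above can be checked numerically. The short Python sketch below (illustrative, not part of the original text) computes the corrected rate introduced in the next paragraph, H(x) - H_y(x), for a channel whose output is statistically independent of its input: although half the received symbols happen to agree with what was sent, the rate of transmission is zero.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Channel so noisy that the received symbol is independent of the one
# sent: p(x | y) = 1/2 for every received symbol y.
H_x = entropy([0.5, 0.5])                       # source entropy: 1 bit/symbol

# Conditional entropy H_y(x): average, over received symbols y
# (each occurring with probability 1/2), of the entropy of x given y.
H_y_x = sum(0.5 * entropy([0.5, 0.5]) for y in (0, 1))

R = 1000 * (H_x - H_y_x)   # bits per second, at 1000 symbols per second
print(R)                   # 0.0 -- no information is transmitted
```

The 50% of chance agreements contribute nothing: knowing the received symbol does not reduce the uncertainty about the transmitted one at all, so the equivocation equals the full source entropy.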



Evidently the proper correction to apply to the amount of information 

 transmitted is the amount of this information which is missing in the re- 

 ceived signal, or alternatively the uncertainty when we have received a 

 signal of what was actually sent. From our previous discussion of entropy 

 as a measure of uncertainty it seems reasonable to use the conditional 

 entropy of the message, knowing the received signal, as a measure of this 

 missing information. This is indeed the proper definition, as we shall see 

 later. Following this idea the rate of actual transmission, R, would be ob- 

 tained by subtracting from the rate of production (i.e., the entropy of the 

 source) the average rate of conditional entropy. 



R = H(x) - H_y(x) 



The conditional entropy H_y(x) will, for convenience, be called the equi- 
 vocation. It measures the average ambiguity of the received signal. 
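Applied to the earlier example, this definition gives a concrete rate. The sketch below (Python, for illustration only) uses the fact that with an equiprobable source and symmetric 1% errors, the distribution of the transmitted symbol given a received symbol is {0.99, 0.01}, so the equivocation per symbol is simply the binary entropy of the error probability.

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a two-valued distribution {p, 1 - p}."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

# 1000 equiprobable binary symbols per second, 1 in 100 received in error.
H_x   = binary_entropy(0.5)    # source entropy: 1 bit/symbol
H_y_x = binary_entropy(0.01)   # equivocation: about 0.081 bits/symbol

R = 1000 * (H_x - H_y_x)       # actual rate of transmission
print(round(R))                # about 919 bits/second, not 990
```

The proper correction is thus about 81 bits per second rather than the naive 10: the recipient's ignorance of where the errors lie costs far more than the errors themselves.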



