Information Storage and Neural Control



and, therefore, 



I = log (1/(1/2)) = log 2 = 1 bit.



If the message were a noisy one, then you might not be quite 

 certain that you received the signal for "boy" correctly. You may 

 nevertheless be willing to give four to one odds that it is a boy, 

 based on the noisy signal you received. In this case 



p_a = .8



and, thus 



I = log (.8/(1/2)) = log 1.6 = .68 bits,



which demonstrates the quantitative reduction in information due 

 to noise. 
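The two cases above can be checked with a short calculation. This is a minimal sketch, not part of the original text; the function name `information_bits` is mine, and base-2 logarithms are assumed since the chapter measures information in bits.

```python
from math import log2

def information_bits(p_after: float, p_before: float) -> float:
    """Information gained when a message raises the receiver's
    probability for an event from p_before to p_after."""
    return log2(p_after / p_before)

# Noiseless signal for "boy": certainty (p_a = 1) from a prior of 1/2.
print(information_bits(1.0, 0.5))   # 1.0 bit

# Noisy signal: four-to-one odds (p_a = .8) from the same prior of 1/2.
print(information_bits(0.8, 0.5))   # about .68 bits
```

The noisy message conveys less information than the noiseless one, exactly as the text describes.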



In the case of no noise it is clear that p_a is always unity and



I = -log p_b.



The important problems in the communication of information are,

 however, concerned with the effects of noise. The maximum 

 amount of information that can be sent through a communication 

 channel in the presence of noise is a topic of particular usefulness 

which we shall examine briefly. 

Getting back to the expression

I = log (p_a/p_b),

it is interesting to note the implications of this definition. If, for 

 example, a communication system is so noisy that the message 

 has not reduced the receiver's uncertainty as to the event (i.e., 

p_a = p_b), then I = log 1 = 0 and no information has been received. 

 Thus, it is seen that a communication does not necessarily convey 

 any information. The communication must reduce the recipient's 

uncertainty as to the events in question in order to convey information. The mathematical definition I = log p_a/p_b is, therefore, 

 consistent with intuitive requirements for a measure of information. 
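The zero-information case can be verified numerically. This short sketch is my own illustration, not the author's; the variable names are hypothetical, and base-2 logarithms are again assumed.

```python
from math import log2

# Assumed illustration: a channel so noisy that the receiver's
# probability for the event is unchanged by the message.
p_b = 0.5   # probability before the message is received
p_a = 0.5   # probability after the message (p_a = p_b)

# I = log 1 = 0: the message conveys no information.
I = log2(p_a / p_b)
print(I)   # 0.0
```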

 One of the important problems in communication theory has 

 to do with the maximum rate at which information can be sent 

 over a communication channel which is disturbed by random 

 noise. This problem has fundamental implications for information 



