What is Information Theory?



of a store with n states. In other words, if all the states are equally

 probable, then it is not possible to store the information any more 

 efficiently than one bit of information per message, on the average. 

 If there are some preferred states, i.e., if the pi are not equally 

 probable, then it can be shown that the average information per 

 message can range from zero to one bit. Zero information cor- 

 responds to the condition where a single state has unity probability 

 and all the other states have probability zero. As stated before, the 

 other extreme is attained when all the states are equally probable. 

 In other words, on the average, one must receive more information 

 to resolve fully the states of a completely random system (all 

 states equally probable) than to resolve the states of a less random 

 system (all states not equally probable). 
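The passage states the range of average information per message without giving the computation. A minimal sketch, assuming the standard base-2 (Shannon) entropy formula, which averages -log₂ pᵢ over the states (the function name `entropy_bits` is illustrative, not from the text):

```python
import math

def entropy_bits(probs):
    """Average information in bits per message for a set of state probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally probable states: one bit per message, the maximum.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A preferred state lowers the average information.
print(entropy_bits([0.9, 0.1]))   # about 0.469

# A single state with unity probability: zero information.
print(entropy_bits([1.0, 0.0]))   # 0.0
```

As the text says, any set of unequal probabilities falls strictly between the two extremes.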



COMMUNICATION OF INFORMATION 



Let us now consider information theory as it pertains to the 

 communication of information. For this purpose, we define infor- 

mation received as the difference between the state of knowledge

 of the recipient before and after the communication. In more 

 precise terms, information received is given by: 



I = log₂ (Pa/Pb)

where

I = information received

 Pa = probability of the event at the receiver after 



the message is received 

 Pb = probability of the event at the receiver before 

 the message is received. 



In receiving a message regarding the sex of a baby, for example, 

 this expression implies that if the receiver does not know the 

 baby's sex, then 



Pb = 1/2,



and if you (the receiver) receive a signal that "the baby is a boy," 

 then 



Pa = 1 (provided the message is not noisy)
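The baby-sex example can be worked through directly with the formula above; a short sketch, assuming base-2 logarithms so the answer comes out in bits (the function name `information_received` is illustrative):

```python
import math

def information_received(p_after, p_before):
    """I = log2(Pa / Pb): information received, in bits."""
    return math.log2(p_after / p_before)

# Before the message the receiver assigns Pb = 1/2 to "boy";
# after a noiseless message "the baby is a boy", Pa = 1.
print(information_received(1.0, 0.5))   # 1.0 bit
```

So a noiseless message resolving an even chance delivers exactly one bit, consistent with the earlier statement that one equally-probable binary state carries one bit per message.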



