


To recapitulate, information theory treats information as the removal of uncertainty. The theory measures information quantitatively in accord with Equation 1, the base 2 logarithm being used because all problems are reduced to equivalent yes-no answers. In many cases, the values of $p_o$ and $p_i$ are hard to know; accordingly, Equation 1 is difficult to apply. In other cases, $p_o$ and $p_i$ may be known quite simply.
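As a minimal numerical sketch (not from the original text; the function name information_bits and its interface are assumptions), Equation 1, $I = \log_2(p_o/p_i)$, can be evaluated as follows:

```python
import math

def information_bits(p_o: float, p_i: float) -> float:
    """Information gained, in bits, per Equation 1: I = log2(p_o / p_i).

    p_o -- probability assigned to the answer once the measurement is made
           (equal to one in a noiseless system)
    p_i -- a priori probability of that answer, read from the
           distribution table
    """
    return math.log2(p_o / p_i)
```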



For example, suppose that the number of impulses transmitted by a given nerve fiber is recorded, second by second, for 1,000 seconds. One can then draw up a distribution table and compute the probabilities $p_i$, as has been done in Table III. Suppose that a few minutes later one measures the



TABLE III

Transmitted Spikes

Impulses per second     Seconds observed     $p_i$
...                     ...                  ...
Total                   1000                 1.00



number of impulses for one second and finds four of them. Then one 

 may write 



$$p_o = 1.0 \qquad p_i = 0.25 \qquad I = \log_2 \frac{p_o}{p_i} = \log_2 4 = 2 \text{ bits}$$
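With the hypothetical sketch above, the same figure follows directly:

```python
print(information_bits(1.0, 0.25))  # log2(4) = 2.0 bits
```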

Because so many different possibilities exist, any definite number such as four conveys information. However, four is a relatively probable value, so the information gained is near the minimum. If the measurement is repeated a few minutes later and 10 impulses are counted, more information is obtained, because 10 is a priori less likely. In this case the appropriate values are



$$p_o = 1.0 \qquad p_i = 0.02 \qquad I = \log_2 \frac{p_o}{p_i} = \log_2 50 = 5.6 \text{ bits}$$
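Again using the sketch:

```python
print(information_bits(1.0, 0.02))  # log2(50) ≈ 5.64 bits, quoted as 5.6
```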



For the system as described, $p_o$ is always one. This is called a noiseless system. If, instead of counting impulses, one recorded a continuous current, the reading might have been 9.6 impulses rather than 10. The answer is still probably 10, but the output probability $p_o$ is no longer one. A reasonable choice might be



$$p_o = 0.6 \qquad p_i = 0.02 \qquad I = \log_2 \frac{p_o}{p_i} = \log_2 30 = 4.9 \text{ bits}$$
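The same sketch covers the noisy case, with $p_o < 1$ expressing the residual uncertainty in the reading:

```python
print(information_bits(0.6, 0.02))  # log2(30) ≈ 4.91 bits, quoted as 4.9
```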



