1 Information Storage and Neural Control 



messages have to be stored in a system of binary storage units. 

 (It will be assumed that we are interested in preserving the exact 

 order of the replies and not simply in counting the number of 

yeses and noes.) Proceeding in the most obvious manner and using

 one storage unit for each message, we should set down a sequence 

 such as this: 



YYYYYYYYNYYYYYYYYYYYYYYYNYYY . . . 



The question, "Are you a doctor?", expects the answer yes from 

 this medical group, and of the 128 messages there will be only 

 one or two no states. Therefore, it would be more economical to 

 store the positions of the noes in the sequence and convert the 

 numbers 9, 25, corresponding to the noes in the above sequence, 

 into the binary form as 0001001 and 0011001. Seven digits are 

allowed because there are 2⁷ = 128 messages altogether. Thus, the
sequence could be coded into the sequence



00010010011001 . . . 



This makes use of binary storage units just as in the original 

sequence, but a much smaller number of them. It is understood,
as part of the code, that decoding proceeds in blocks of seven;
this avoids having to mark off groups of digits with separator
symbols, which would violate the binary form. The preceding code,
which is only one of many that could

 be devised, shows that a set of two-state messages can sometimes 

 be stored in such a way that each message occupies, on the average, 

 less than one bit of storage capacity. 
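The positional coding described above can be sketched in Python. The 7-bit block width and the Y/N alphabet follow the text's example; the function names are illustrative, not from the source:

```python
def encode_positions(msgs, block=7):
    """Store only the 1-based positions of the rare 'N' replies,
    each written as a fixed-width binary block."""
    return "".join(format(i + 1, f"0{block}b")
                   for i, m in enumerate(msgs) if m == "N")

def decode_positions(bits, length, block=7):
    """Rebuild the full reply sequence by reading off blocks of
    `block` binary digits, each naming the position of an 'N'."""
    msgs = ["Y"] * length
    for j in range(0, len(bits), block):
        msgs[int(bits[j:j + block], 2) - 1] = "N"
    return "".join(msgs)

# The text's example: 128 replies with noes at positions 9 and 25.
seq = "Y" * 128
seq = seq[:8] + "N" + seq[9:24] + "N" + seq[25:]
coded = encode_positions(seq)
print(coded)   # 00010010011001 -- 14 storage units instead of 128
assert decode_positions(coded, 128) == seq
```

Because the blocks have a fixed width, the decoder needs no separator marks between the two position numbers.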



From such considerations, the following definition of information 

 content is suggested: 



I = -∑_{i=1}^{n} p_i log p_i

where p_i = probability of the ith state
      n = total number of states.

If all n states are equally probable, then it follows that p_i = 1/n
for all values of i. Thus, substituting p_i = 1/n into the expression
for I, we note that



I = log n.



From this it follows that if all n states are equally probable, the 

 information content is exactly equal to the information capacity 
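Both results can be checked numerically (a sketch; base-2 logarithms give I in bits, and the skewed distribution of 126 yeses and 2 noes out of 128 is taken from the text's doctor-question example):

```python
from math import log2

def information_content(probs):
    """I = -sum(p_i * log2(p_i)), in bits; states with p = 0
    contribute nothing to the sum."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Equally probable case: I reduces to log2(n) exactly.
n = 128
print(information_content([1 / n] * n))     # 7.0 bits

# Skewed case: 126 yeses and 2 noes among 128 replies.
p_yes, p_no = 126 / 128, 2 / 128
print(information_content([p_yes, p_no]))   # about 0.116 bit per message
```

The second figure, well under one bit per message, is consistent with the earlier observation that the positional code stores the 128 replies in only 14 binary storage units.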



