20 Information Storage and Neural Control 



providing one has previously had no information about the system?



Saltzberg: If you knew nothing about the probability states of the messages, then, of course, you would have very little engineering data upon which to base an optimum design for a receiving system. This question may pertain to the a priori probabilities which are useful in choosing an appropriate code. This is analogous to the Twenty Questions game mentioned previously. If the questioner has some a priori knowledge of the probabilities, he can ask questions in a specific order, depending on the probabilities, and, on the average, will ask fewer questions to get a correct answer than will someone who just asks questions at random.
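[Editor's note: a small sketch, not part of the original discussion, making the Twenty Questions point concrete. The message probabilities below are hypothetical, chosen only for illustration. A questioner who orders questions along a Huffman tree built from the a priori probabilities needs fewer yes/no questions on average than one who ignores them and must always ask the worst-case ceil(log2 N).]

```python
# Sketch: average questions needed with and without a priori probabilities.
import heapq
import math

def huffman_expected_questions(probs):
    """Expected number of yes/no questions when the questioner follows a
    Huffman tree built from the a priori probabilities."""
    heap = [(p, i) for i, p in enumerate(probs)]  # (probability, tie-breaker)
    heapq.heapify(heap)
    next_id = len(probs)
    expected = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        expected += p1 + p2  # each merge deepens its leaves by one question
        heapq.heappush(heap, (p1 + p2, next_id))
        next_id += 1
    return expected

# Hypothetical skewed message probabilities, assumed for illustration only.
probs = [0.5, 0.25, 0.125, 0.0625, 0.03125, 0.03125]
uninformed = math.ceil(math.log2(len(probs)))  # 3 questions, probabilities ignored
informed = huffman_expected_questions(probs)   # 1.9375 questions on average
print(uninformed, informed)
```

For these dyadic probabilities the informed questioner averages 1.9375 questions against the uninformed questioner's 3, which is the advantage Saltzberg describes.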



E. Roy John (Rochester, New York): It seems to me that there is a large class of messages in which the a priori probability cannot be evaluated by the receiver. One can think of messages in which the rate of convergence of the total information of the message is not linear for the components of the message and in which the rate of convergence would depend upon the sequence of the components. This might, as a matter of fact, be a characteristic difference between certain languages. In a situation where you do not have this advantage of being able to stipulate probabilities, one in which the probability of a given event is affected by the preceding sequence, it seems to me you must modify your treatment to provide an argument for the bit function, recognizing that the information content of a specific event depends on preceding events or context. Could you say something about how you treat this kind of situation, since it seems to be much nearer the situation in which we frequently find ourselves in the nervous system than does the starting point from which you began here?



Saltzberg: Your question is a good one. It refers to the effects of inter-symbol influence on information content. The fact that there are transitional probabilities which have to be taken into account in determining the information content in language, for example, is included in the mathematics of information theory. These transitional probabilities have the effect of making the information content of a sentence much less than that calculated by assuming that the sequences of letters and words are independent of their predecessors. I should comment at this point on



