44 Henry Quastler 



course, results so obtained have to be used with a certain amount of restraint. 



Example: Rate of Information Transmission in Conversation — The working 

 of the approximation methods can be shown by two examples. The first 

 example is that which we used to illustrate the need for approximation methods; 

 namely, that of estimating the amount of information in conversation. 



We consider first the information carried in words. To establish an upper 

 bound, we ask how much information must be transmitted so that the receiver 

 can recognize every single word spoken. 



This upper bound, in bits per second, is the product of the rate of words 

 per second times bits per word. A rate of 2.1 words per second is typical for 

 lively discussions. The number of bits per word in English context has been 

 estimated as 6.5 bits (±25 per cent). This yields 11 to 17 bits per second. 
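The arithmetic behind this upper bound can be checked with a short calculation; a minimal sketch, using the figures quoted in the text (2.1 words per second, 6.5 bits per word, with the ±25 per cent applied to the bits-per-word estimate):

```python
# Upper bound on verbal information rate: (words/second) x (bits/word).
words_per_second = 2.1   # typical rate in lively discussion
bits_per_word = 6.5      # estimated for English context
uncertainty = 0.25       # +/- 25 per cent on the bits-per-word figure

low = words_per_second * bits_per_word * (1 - uncertainty)   # ~10.2 bits/s
high = words_per_second * bits_per_word * (1 + uncertainty)  # ~17.1 bits/s
print(round(low, 1), round(high, 1))
```

The computed range of roughly 10 to 17 bits per second agrees (to rounding) with the 11 to 17 bits per second quoted in the text.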



Words are not the only method of communication between two persons 

 conversing face to face. It can be shown, however, that all other means of 

 communication add little to the total transmission rate. 



We will now try to establish a lower bound. Of course, no general lower 

 bound exists; it is easy to find examples where information is transmitted at 

 the rate of 1 millibit per second, or less. What we want is an 'upper lower 

 bound': a lower bound on the amount of information transmitted between 

 people who try to communicate at some speed, and under reasonably favorable 

 conditions. Such a bound is obtained by analysis of pragmatic communication. 

 We look at situations where the verbal messages elicit or control actions. 

 We make an informational analysis of the relations between actions and verbal 

 messages. This will yield an amount of information demonstrably transmitted, 

 and it certainly represents a lower bound to the amount of information com- 

 municated. 



At this time, we have a single case where pragmatic communication has 

 been evaluated accurately in informational terms. Felton, Fritz and Grier (20) 

 measured the amount of pragmatic communication between an airplane pilot 

 coming in for a landing and the control tower operator. They found an average 

 rate of 2 bits per second, computed in terms of actual effects of the messages. 

 Both pilot and control tower operator have every interest in communicating as 

 fast as they can. On the other hand, they do so in the presence of a very high 

 level of noise which reduces verbal communication to probably about one 

 third of its optimum rate. 



We conclude, thus, that information transmitted through verbal communi- 

 cation is certainly not less than 2 bits per second nor more than 17 bits per 

 second, and very likely within the range between 6 and 12 bits per second. 

 This estimate is rough but not at all unrealistic. 
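The quoted range can be assembled from the figures given above; a minimal sketch, assuming the text's factor-of-three noise penalty is taken at face value:

```python
# Assembling the estimate from the figures in the text.
measured_pragmatic_rate = 2.0  # bits/s, pilot-tower study, in heavy noise
noise_penalty = 3.0            # verbal rate reduced to ~1/3 of optimum
upper_bound = 17.0             # bits/s, from (words/s) x (bits/word)

# Correcting the measured rate for noise suggests the lower end of the
# 'likely' range under favorable conditions.
favorable_lower = measured_pragmatic_rate * noise_penalty  # ~6 bits/s
print(f"overall bounds: {measured_pragmatic_rate:.0f} to {upper_bound:.0f} bits/s; "
      f"likely lower end: {favorable_lower:.0f} bits/s")
```

This reproduces the 2-to-17 bits-per-second bounds and the figure of 6 bits per second at the bottom of the 'very likely' range.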



Example: Information Content per Printed Letter — A very elegant way 

 of computing an information measure under unfavorable conditions was 

 used by Shannon in his analysis of the 'entropy' of printed English (21). The 

 information content of a single letter is easily determined as a function of 

 relative letter frequencies. However, constraints between neighboring letters 

 lead to a reduction of information content, and in order to estimate this 

 reduction exactly one would have to investigate the probability distributions 

 for long sequences of letters. This is manifestly impossible. Shannon, therefore, 

 proceeded to estimate a related quantity; namely, the amount of information 



