Henry Quastler



These uncertainties differ in any number of respects from each other. They will be of interest in very different situations; the kind of information needed to produce certainty is not the same; neither is the usefulness of this information; and so on. However, there is something in common between all uncertainties which can be characterized by the probabilities:



Probability of 'A' ....... .60
Probability of 'non-A' ... 1 − .60



One aspect of this 'something-in-common' is that an arrangement of any 60 

 A's and 40 non-A's can be coded to represent any other 60 A's and 40 non-A's 

 —heads or tails, males or females, hits or misses, friends or foes. Once such 

 representation has been established, then the uncertainty concerning one 

 event will be abolished by information concerning the other. We have previously 

 equated the amount of information with the amount of uncertainty it removes. 

 Accordingly, it can be said that the amounts of uncertainty and information 

 must be equal in all situations characterized by a binary alternative with 

 probabilities .60 and .40. 
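In modern terms, the common amount attached to every .60/.40 alternative is Shannon's entropy of that binary distribution. As a sketch in Python (the function name is mine, not the text's), the uncertainty of such a situation works out to just under one binary digit:

```python
from math import log2

def binary_entropy(p):
    """Shannon entropy of a binary alternative with probabilities p and 1 - p,
    measured in binary digits (bits)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * log2(p) - (1 - p) * log2(1 - p)

# The .60 / .40 alternative discussed in the text:
h = binary_entropy(0.60)
print(round(h, 3))  # about 0.971 bits
```

Any situation with probabilities .60 and .40, whatever the events, yields the same number, which is why information about one such event can stand in for information about another.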



The foregoing consideration exposes the fundamental features of the measure of information:



(1) Information is a measurable abstract quantity; its value does not depend on what the information is about, just as length, or weight, or temperature have values which do not depend on the nature of the thing which is long, heavy, or hot;



(2) Information is related to the ensemble of possible outcomes of an event; its value depends on the probabilities associated with these outcomes, but not on their causes, and not on their consequences.



What remains is the development of a measure which complies with this concept of 'amount of information'; this is merely a technical problem. An obvious generalization states that whenever two events have the same number of possible outcomes, and identical sets of probabilities are associated with the two ensembles of possible outcomes, then these two events have identical information contents. However, we wish to be able to compare events with quite different probability sets; for instance, we wish to be able to say which uncertainty is greater: that associated with a situation with three equiprobable alternatives, or that where there are four possibilities with probabilities .8, .1, .05 and .05. To answer such questions, we have to derive a measure which is a single number, whatever the number of possible categories and their associated probabilities.
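The comparison just posed can be settled numerically with the entropy formula that this passage is preparing the ground for; a minimal sketch in Python (the general-distribution function is my own illustration):

```python
from math import log2

def entropy(probs):
    """Shannon entropy of a discrete probability distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Three equiprobable alternatives versus four unequal ones:
three_equiprobable = entropy([1/3, 1/3, 1/3])   # log2(3), about 1.585 bits
skewed_four = entropy([0.8, 0.1, 0.05, 0.05])   # about 1.022 bits
```

Although the second situation has more possible outcomes, its heavy concentration on one outcome makes it the less uncertain of the two.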



Such a measure is readily derived from the equivalence of uncertainty 

 with the information which removes it. We may represent the information 

 content of an uncertainty-removing piece of intelligence in any manner we 

 wish. We stipulate that this information should be represented in a standard 

 fashion, namely, by using a binary alphabet. In addition we stipulate that 

 the binary representation be coded in such a manner that the expected number 

 of symbols is minimized. We thus obtain a unique number; namely, the 

 minimum average number of binary symbols needed to abolish the uncertainty 

 associated with a given situation. This number will be called the amount of 

 uncertainty or information of this situation. 
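A code that minimizes the expected number of binary symbols per outcome is what Huffman's procedure constructs; it is the optimum when each outcome is coded separately, and coding long blocks of outcomes drives the average down toward the entropy. As an illustrative sketch (the helper below is my own, not from the text), the four-outcome ensemble mentioned earlier needs on average 1.3 binary symbols per outcome:

```python
import heapq

def huffman_lengths(probs):
    """Code-word lengths of an optimal (Huffman) binary prefix code."""
    # Each heap entry: (probability, unique tiebreaker, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # every merge deepens these symbols by one binary digit
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.8, 0.1, 0.05, 0.05]
lengths = huffman_lengths(probs)                # lengths 1, 2, 3, 3
avg = sum(p, * (l,))[0] if False else sum(p * l for p, l in zip(probs, lengths))
# Expected length: .8(1) + .1(2) + .05(3) + .05(3) = 1.3 binary symbols,
# an upper bound on the entropy of about 1.022 bits.
```

The number the text defines, the minimum average number of binary symbols, is thus bracketed from above by any concrete code and is attained in the limit of coding ever longer runs of outcomes.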



