QUANTIFICATION OF PERFORMANCE IN A LOGICAL TASK WITH UNCERTAINTY



A. Rapoport 



Mental Health Research Institute, University of Michigan 



Abstract — Tasks to which information theory has been applied characteristically do not involve 'reasoning', i.e. the drawing of inferences. The present paper explores the possibility of applying information theory to measuring performance of logical tasks. We note at once that any task in which a necessary conclusion must be reached from given information has, formally speaking, no information content. From the information-theoretical point of view, therefore, no information is gained in the process of solving a purely mathematical or logical problem, no matter how 'complex'.



There are problems, however, in which, in addition to the making of inferences, information must be obtained in the process of solution. Success of solution can be measured by the rate of obtaining such information and by the degree of completeness with which it is utilized. Assuming complete utilization at each step, the efficiency of solution depends on the efficiency with which information is obtained. A classical example is the coin-weighing problem, in which a deviant coin and the direction of its deviation must be determined in the fewest possible weighings. Information theory provides not only the minimum number of weighings for such a problem but also a method for constructing the best 'strategy'.
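The information-theoretical bound mentioned here can be sketched numerically. The following is an editorial illustration, assuming the common 12-coin version of the problem (the paper does not fix the number of coins): with one deviant coin that may be heavy or light there are 24 equally likely states, and each balance weighing has at most three outcomes.

```python
import math

# 12 coins, one deviant that is either heavier or lighter:
# 12 * 2 = 24 equally likely states of ignorance.
n_states = 12 * 2

# Initial uncertainty in bits: log2(24) ~ 4.585.
uncertainty = math.log2(n_states)

# A balance weighing has 3 outcomes (left heavy, right heavy, balance),
# so one weighing can yield at most log2(3) ~ 1.585 bits.
per_weighing = math.log2(3)

# Minimum number of weighings needed to remove all the uncertainty.
min_weighings = math.ceil(uncertainty / per_weighing)
print(min_weighings)  # 3
```

This reproduces the classical result that three weighings suffice for twelve coins; the bound is attained only by a strategy whose weighings keep the three outcomes as nearly equiprobable as possible.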



In the present paper a particular logical task with uncertainty is discussed from the information-theoretical point of view. It is shown that the construction of an information-getting strategy depends very strongly on the instructions given the subjects and on the inferences which the subjects make from the instructions. Thus the practical problem of quantifying the performance of a logical task carries within it certain ambiguities which must be resolved if information theory is to be of use in psychological tests based on such tasks.



Information theory is mainly concerned with a quantity called the amount of uncertainty associated with a situation in which choices or guesses are made. This uncertainty can be viewed as a measure of ignorance. For example, we are the more ignorant of the value about to be assumed by a random variable with a discrete domain, the more values it can assume and the more nearly equi-probable these values are.
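The dependence of uncertainty on the number of values and on their near-equiprobability is captured by the Shannon entropy. A minimal sketch (an editorial illustration, not part of the original text):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equiprobable values: 1 bit of uncertainty.
print(entropy([0.5, 0.5]))            # 1.0

# More values (still equiprobable) mean more uncertainty.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Skewing the same four values reduces the uncertainty below 2 bits.
print(entropy([0.7, 0.1, 0.1, 0.1]))
```

For a fixed number of values the entropy is greatest when the values are equiprobable, which is the sense in which the equiprobable case represents maximal ignorance.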



Defining every situation of ignorance is a set of postulates with a subjective flavor. Somebody is ignorant. At least this is the case in real situations involving subjects whose state of ignorance is to be inferred. It may be argued from certain philosophical points of view that this intrusion of subjective concepts is unsatisfactory, and that attempts should be made to circumvent them or to eradicate them altogether. I don't want to take sides on this question, but only to point to some of its manifestations by way of indicating its persistence. The question has been raised in connection with the foundations of probability theory. There the attempts to circumvent the subjective element have given rise to the so-called 'objectivist school', which sought to define probabilities of events 'objectively' in terms of the relative frequencies of the events. Opposition to






