30 Henry Quastler 



In normal code representation, i.e. reduced to efficient binary operations, 

 the information functions have the following meaning: 



H(x) . . . . number of operations which specify x

H_y(x) . . . . number of operations which specify x if y is given

T(x; y) . . . . number of operations which apply to the specification of both x and y

H(x, y) . . . . number of operations which specify the whole system.



Inspection of the graph shows that: 



H(x) ≥ H_y(x)

H(y) ≥ H_x(y),



that is, the conditional uncertainty cannot be greater than the unconditional 

 uncertainty.* 
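As a sketch of these definitions, the four information functions can be computed directly from a joint probability table. The distribution below is purely illustrative (any valid table would do), and the variable names are our own:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the definitions.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(probs):
    """Shannon uncertainty in bits of a probability distribution."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Marginal distributions of x and y.
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}

Hx   = H(px.values())     # H(x)
Hy   = H(py.values())     # H(y)
Hxy  = H(p.values())      # H(x, y)
Hy_x = Hxy - Hx           # H_x(y), conditional uncertainty of y given x
Hx_y = Hxy - Hy           # H_y(x)
T    = Hx + Hy - Hxy      # T(x; y), the shared (transmitted) information

# The averaged conditional uncertainty never exceeds the unconditional one.
assert Hx_y <= Hx + 1e-12
assert Hy_x <= Hy + 1e-12
```

Note that the inequality holds only for these averaged quantities, as the footnote's diagnostic example makes clear.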



Communication Systems 



When a system not only transmits information but exists primarily for that 

 purpose, then it is called a communication system. No class of two-part systems

 has received as much attention as that of the communication system. In a 

 simple communication system, the two parts are called the source and the

 destination of information. The distinction between source and destination must 

 be based on external grounds; the informational relations between the two are 

 perfectly symmetrical. The relevant states of the source are called the inputs, 

 or signals sent, and the relevant states of the destination are the outputs, or 

 signals received. A single state is called a symbol, and a higher unit composed 

 of several symbols, a message. The conditional probabilities for each pair of 

 signals sent and received form a matrix called the channel. Note that the word 

 'channel' is again used in a sense wider than customary. A 'channel' may but 

 does not have to be a means of physically conveying information. For instance, 

 if two variables x and y do not affect each other but are both affected by a third 

 variable z, then knowledge of the state of x is likely to reduce the uncertainty

 concerning the state of y, and vice versa; hence, information is transmitted 

 between the two variables, and they are connected by a 'channel' in the sense of 

 information theory — although they do not communicate with each other directly. 
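The common-cause situation just described can be made concrete in a short sketch. Here a hypothetical third variable z drives both x and y through noisy conditional probabilities; x and y never interact directly, yet the transmitted information T(x; y) between them is positive (all numbers are illustrative):

```python
import math

# A common cause z drives both x and y; neither affects the other directly.
pz = {0: 0.5, 1: 0.5}
# Conditional probabilities p(x|z); each variable copies z with 10% noise.
px_given_z = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}
py_given_z = px_given_z  # same noisy dependence for y, for simplicity

# Joint distribution p(x, y) = sum over z of p(z) p(x|z) p(y|z).
pxy = {(x, y): sum(pz[z] * px_given_z[z][x] * py_given_z[z][y] for z in pz)
       for x in (0, 1) for y in (0, 1)}

def H(probs):
    """Shannon uncertainty in bits."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

px = {x: pxy[(x, 0)] + pxy[(x, 1)] for x in (0, 1)}
py = {y: pxy[(0, y)] + pxy[(1, y)] for y in (0, 1)}

# Transmitted information between x and y.
T = H(px.values()) + H(py.values()) - H(pxy.values())

# T > 0: x and y are connected by a 'channel' in the information-theoretic
# sense even though there is no physical link between them.
assert T > 0
```

The matrix of conditional probabilities pxy, normalized by the marginals, is exactly the 'channel' in the wider sense used above.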



* However, this is true only for an average conditional uncertainty, and does not apply to 

 every particular condition. The following example will help to fix the ideas: Consider a 

 diagnostic test for a certain disease; suppose the nature of the test and the occurrence of the 

 disease are such that in 98 per cent of the patients the test is negative; that of the positive

 tests, 50 per cent are spurious; and that virtually every case of the disease will give a positive

 test. Then, if the test is not performed at all, the diagnostician's uncertainty as to the presence

 of the disease in any given patient is



−(0.99 log₂ 0.99 + 0.01 log₂ 0.01) = 0.081 bits/patient.



If the test was negative, then the uncertainty is zero. But, if the test is positive, the chances

 are equal that it is or is not spurious; hence, the uncertainty is 1.0 bit, and the diagnostician is

 more in doubt than he was before. However, the average uncertainty, conditional upon his 

 performing the test, is reduced to 



0.98 × 0 + 0.02 × 1.0 = 0.020 bits/patient.
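The footnote's arithmetic can be checked directly. A minimal sketch, using the same entropy function as above (variable names are our own):

```python
import math

def H(probs):
    """Shannon uncertainty in bits."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Prior: 98% of tests are negative and half of the 2% positives are
# spurious, so disease prevalence is 1%.
prior = H([0.99, 0.01])  # ≈ 0.081 bits/patient

# A negative test (98% of patients) leaves zero uncertainty; a positive
# test (2%) leaves even odds, i.e. H([0.5, 0.5]) = 1.0 bit.
posterior = 0.98 * 0.0 + 0.02 * H([0.5, 0.5])  # = 0.020 bits/patient
```

So although a positive result leaves the diagnostician more uncertain than before, the uncertainty averaged over all patients drops fourfold.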



