


Henry Quastler 



14. T(x, y; z) = mutual reduction of uncertainty between x and y on one hand,
    z on the other
        = H(x, y) + H(z) - H(x, y, z)
    T(x; y, z) = H(x) + H(y, z) - H(x, y, z)
    T(x; y; z) = total constraint in a tri-variate system
        = H(x) + H(y) + H(z) - H(x, y, z)
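These identities can be checked numerically. The sketch below uses a hypothetical XOR-style joint distribution, chosen only for illustration (it is not from the exercise), and evaluates each T from the entropy formulas above:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability table (dict key -> prob)."""
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def marginal(p, axes):
    """Marginalize a joint table whose keys are tuples of variable values."""
    out = {}
    for k, v in p.items():
        key = tuple(k[a] for a in axes)
        out[key] = out.get(key, 0.0) + v
    return out

# Hypothetical joint distribution p(x, y, z): x and y fair coins, z = x XOR y.
pxyz = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

Hx, Hy, Hz = (H(marginal(pxyz, (i,))) for i in range(3))
Hxy  = H(marginal(pxyz, (0, 1)))
Hyz  = H(marginal(pxyz, (1, 2)))
Hxyz = H(pxyz)

T_xy_z = Hxy + Hz - Hxyz           # T(x, y; z): here 2 + 1 - 2 = 1 bit
T_x_yz = Hx + Hyz - Hxyz           # T(x; y, z): here 1 + 2 - 2 = 1 bit
T_xyz  = Hx + Hy + Hz - Hxyz       # T(x; y; z), total constraint: 3 - 2 = 1 bit
```

For this distribution every pair of variables is independent, yet the total constraint is one bit, which is exactly the kind of tri-variate effect these measures are designed to expose.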



15.
    Test:   H(y) = .40
    Actual: H(x) = .47
    H(x, y) = .84
    T(x; y) = .03



The informational value of the test is .03 bits. 



Its maximum possible informational value equals the amount of uncertainty before the
 test, viz. .40 bits.
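The arithmetic can be rechecked directly from T(x; y) = H(x) + H(y) - H(x, y):

```python
H_y  = 0.40   # uncertainty of the test result
H_x  = 0.47   # uncertainty of the actual condition
H_xy = 0.84   # joint uncertainty

T = H_x + H_y - H_xy   # informational value of the test, .03 bits
```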



16. 



2.3 × 5 × 60 = 690 bits/minute
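As a check, assuming the factors are 2.3 bits per symbol, 5 symbols per second, and 60 seconds per minute (the exercise statement itself is not reproduced here):

```python
bits_per_symbol = 2.3
symbols_per_second = 5
seconds_per_minute = 60

bits_per_minute = bits_per_symbol * symbols_per_second * seconds_per_minute  # 690
```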



17. Begin by computing the output uncertainty. The probabilities of receiving each signal
 are obtained as the sum of receiving it correctly (0.2 for Nos. 1 and 4, .178 for 2 and 3, .198
 for 5) plus the addition due to errors (1/4 of the errors, for each erroneous transmission).
 This procedure yields H(out) = 2.32 bits. Next, compute the ambiguities. These are zero for
 symbols Nos. 1 and 4. For 2 and 3, the ambiguity can be computed as the sum of the information
 needed to ascertain that an error has occurred (−0.11 log₂ 0.11 − 0.89 log₂ 0.89) plus the
 information needed to find out which of the four possible and equiprobable errors has occurred,
 which is 0.11 × 2.0 bits/symbol. Symbol No. 5 is treated similarly. The average of the ambi-
 guities is 0.31 bits; hence T equals 2.32 − 0.31 or 2.01 bits, a loss of about one-sixth of the
 input information.
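The worked numbers can be reproduced with a short script. The equiprobable inputs and the per-symbol error rates (0, 0.11, 0.11, 0, 0.01) are inferred from the figures quoted above, since the exercise statement is not reproduced here:

```python
import math

def h(*ps):
    """Entropy in bits of the listed probabilities (zeros contribute nothing)."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

p_in = [0.2] * 5                      # five equiprobable input symbols (assumed)
err  = [0.0, 0.11, 0.11, 0.0, 0.01]  # per-symbol error rates inferred from the text

# Output probabilities: correct reception plus 1/4 of every other symbol's errors.
p_out = [p_in[j] * (1 - err[j])
         + sum(p_in[i] * err[i] for i in range(5) if i != j) / 4
         for j in range(5)]
H_out = h(*p_out)                     # about 2.32 bits

# Ambiguity per symbol: "error or not" entropy plus 2 bits to pick one of 4 wrong symbols.
ambiguities = [h(e, 1 - e) + e * 2.0 for e in err]
mean_amb = sum(p * a for p, a in zip(p_in, ambiguities))   # about 0.31 bits

T = H_out - mean_amb                  # about 2.01 bits
```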



18. One solution is the following: 



11000
 10101
 01110
 00011



A single error will result in the reception of a word which is not in the code book. If one 

 follows the rule of substituting that message in the code book which differs from the received 

 one by one digit only, then every error (provided there is only one!) will be corrected. 
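The claim can be verified mechanically. The sketch below checks that the four code words differ pairwise in at least three digits, so the substitution rule corrects every single error:

```python
from itertools import product

code = ["11000", "10101", "01110", "00011"]

def hamming(a, b):
    """Number of digit positions in which two words differ."""
    return sum(x != y for x, y in zip(a, b))

# Minimum distance between distinct code words; 3 suffices for single-error correction.
d_min = min(hamming(a, b) for a, b in product(code, code) if a != b)

def decode(received):
    """Substitute the code-book word that differs from the received word in fewest digits."""
    return min(code, key=lambda c: hamming(c, received))

# Flip each single digit of each code word and confirm decoding recovers the original.
corrected = all(
    decode(w[:i] + ("1" if w[i] == "0" else "0") + w[i + 1:]) == w
    for w in code for i in range(len(w))
)
```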



A five-digit binary message can carry five bits of information. If it is known that one error
 has occurred somewhere in a group of five symbols, then the information needed to locate
 the error is log₂ 5 = 2.32 bits. With maximum efficiency, one should use only 2.32/5 or 46.4
 per cent of redundant information (which could be achieved by coding large sequences of
 five-digit words!). In our case, the redundant information is 3/5 or 60 per cent, and we trans-
 mit with an efficiency of 40/53.6 = 75 per cent. (Observe that there is less uncertainty if it is
 known that there is one error in every five-symbol word than when it is only known that the
 error rate is 20 per cent!)
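The efficiency figures follow from the same quantities:

```python
import math

locate = math.log2(5)                # bits needed to locate one error among 5 positions
min_redundant = locate / 5           # theoretical minimum, about 46.4 per cent
actual_redundant = 3 / 5             # 3 check digits out of 5, i.e. 60 per cent
efficiency = (2 / 5) / (1 - min_redundant)   # about 75 per cent
```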



