


Thus the two measures are no more discrepant than are the two
ways of measuring "how far is point Q to the right of point P"
shown in Fig. 9/14/2.



Fig. 9/14/2 



Here P and Q can be thought of as corresponding to two degrees of 

 uncertainty, with more certainty to the right, and with a message 

 shifting the recipient from P to Q. 



The distance from P to Q can be measured in two ways, which are 

 clearly equivalent. Wiener's way is to lay the rule against P and Q 

(as W in the Fig.); then the distance that Q lies to the right of P is

 given by 



(Q's reading) minus (P's reading).



Shannon's way (S in the Fig.) is to lay the zero opposite Q, and then

 the distance that Q is to the right of P is given by 



minus (P's reading). 

Since S reads zero at Q, this is the same difference as before: there
is obviously no real discrepancy between the two methods.
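(The equivalence can be checked numerically. Below is a minimal
Python sketch; the scale readings are hypothetical, chosen only for
illustration.)

    # Two ways of measuring how far Q lies to the right of P.
    # The readings are hypothetical; only their differences matter.

    # Wiener's way (rule W): the zero is arbitrary; read both points.
    p_on_w = 3.0                     # P's reading on rule W
    q_on_w = 7.5                     # Q's reading on rule W
    wiener = q_on_w - p_on_w         # (Q's reading) minus (P's reading)

    # Shannon's way (rule S): lay the zero opposite Q; read only P.
    p_on_s = p_on_w - q_on_w         # P now reads -4.5; Q reads 0
    shannon = -p_on_s                # minus (P's reading)

    assert wiener == shannon == 4.5  # no discrepancy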



9/15. Channel capacity. It is necessary to distinguish two ways of 

 reckoning "entropy" in relation to a Markov chain, even after the 

 unit (logarithmic base) has been decided. The figure calculated in 

 S.9/12, from the transition probabilities, gives the entropy, or variety 

 to be expected, at the next, single, step of the chain. Thus if an 

unbiased coin has already given T T H H T H H H H, the
uncertainty of what will come next amounts to 1 bit. The symbol that

 next follows has also an uncertainty of 1 bit; and so on. So the 

 chain as a whole has an uncertainty, or entropy, of 1 bit per step. 
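(The per-step figure of S.9/12 can be computed directly. The
following Python sketch is illustrative; the two-state chain at the
end is hypothetical, not taken from the text.)

    import math

    def row_entropy(probs):
        # Entropy, in bits, of one row of transition probabilities.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # For the unbiased coin the next symbol is H or T with probability
    # 1/2 each, whatever has gone before: 1 bit per step.
    print(row_entropy([0.5, 0.5]))       # 1.0

    # More generally, the per-step entropy of a Markov chain at
    # equilibrium is the average of the row entropies, weighted by
    # the equilibrium probabilities of the states.
    transitions = [[0.9, 0.1],           # hypothetical two-state chain
                   [0.5, 0.5]]
    equilibrium = [5/6, 1/6]             # its stationary distribution
    h_step = sum(pi * row_entropy(row)
                 for pi, row in zip(equilibrium, transitions))
    print(round(h_step, 3))              # 0.557 bits per step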



Two steps should then have an uncertainty, or variety, of 2 bits, 

 and this is so; for the next two steps can be any one of HH, HT, 

TH or TT, with probabilities ¼, ¼, ¼ and ¼, which gives H = 2 bits.

 Briefly it can be said that the entropy of a length of Markov chain 

 is proportional to its length (provided always that it has settled down 

 to equilibrium). 
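(The proportionality can be verified for the coin with a short
Python sketch, counting all equally likely sequences of n steps.)

    import math
    from itertools import product

    def entropy(probs):
        # Entropy, in bits, of a probability distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # The next n steps of the unbiased coin form 2**n equally likely
    # sequences, so the entropy grows by exactly 1 bit per step.
    for n in (1, 2, 3):
        outcomes = list(product("HT", repeat=n))
        probs = [1 / len(outcomes)] * len(outcomes)
        print(n, entropy(probs))         # 1 1.0 / 2 2.0 / 3 3.0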



Quite another way of making the measurement on the chain is 






