


(2) If applied to an information source, with several sets of probabilities, the matrix of transition probabilities must be Markovian; that is to say, the probability of each transition must depend only on the state the system is at (the operand) and not on the states it was at earlier (S.9/7). If necessary, the states of the source should first be re-defined, as in S.9/8, so that it becomes Markovian.



(3) The several entropies of the several columns are averaged (S.9/12) using the proportions of the terminal equilibrium (S.9/6). It follows that the theorems assume that the system, however it was started, has been allowed to go on for a long time so that the states have reached their equilibrial densities. (A short computational sketch of this procedure follows.)
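In computational terms the recipe of (2) and (3) is short. The sketch below (Python, with an illustrative matrix whose entries are assumed, not taken from the text) finds the terminal equilibrium of a Markovian matrix of transition probabilities as the eigenvector belonging to the eigenvalue 1, and then averages the entropies of the several columns with those equilibrium proportions:

```python
import numpy as np

def column_entropy(col):
    """Entropy (in bits) of one column of transition probabilities."""
    p = col[col > 0]                  # the term 0 log 0 is taken as 0
    return -np.sum(p * np.log2(p))

def source_entropy(M):
    """Entropy of a Markovian source with transition matrix M.

    M[i, j] is the probability of passing to state i from state j,
    so every column sums to 1 (the columns are the operands).
    """
    # Terminal equilibrium: the distribution v with M v = v, i.e. the
    # eigenvector of eigenvalue 1, scaled so its components sum to 1.
    w, vecs = np.linalg.eig(M)
    v = np.real(vecs[:, np.argmin(np.abs(w - 1.0))])
    v = v / v.sum()
    # Average the column entropies with the equilibrium proportions.
    return sum(v[j] * column_entropy(M[:, j]) for j in range(M.shape[1]))

# Illustrative matrix over states A, B, C (entries assumed):
M = np.array([[0.2, 0.1, 0.3],
              [0.5, 0.6, 0.3],
              [0.3, 0.3, 0.4]])
print(source_entropy(M))              # entropy in bits per step
```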



Shannon's results must therefore be applied to biological material only after a detailed check on their applicability has been made.



A similar warning may be given before any attempt is made to play loosely, and on a merely verbal level, with the two entropies of Shannon and of statistical mechanics. Arguments in these subjects need great care, for a very slight change in the conditions or assumptions may make a statement change from rigorously true to ridiculously false. Moving in these regions is like moving in a jungle full of pitfalls. Those who know most about the subject are usually the most cautious in speaking about it.



Ex. 1: Work out mentally the entropy of the matrix with transition probabilities

(Hint: This is not a feat of calculation but of finding a peculiar simplicity. What does that 1 in the main diagonal mean (Ex. 9/5/1)? So what is the final equilibrium of the system? Do the entropies of columns A and C matter? And what is the entropy of B's column (Ex. 9/11/6)?)
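The hint can also be checked numerically with the sketch given after (3) above. The matrix below is merely of the kind the exercise describes (its entries, apart from the 1 in B's diagonal cell, are assumed for illustration): B is absorbing, so the terminal equilibrium is wholly at B; B's column is determinate and has entropy zero; and columns A and C enter the average with weight zero.

```python
# Continuing the sketch above: a matrix of the kind Ex. 1 describes.
# Columns A and C are assumed; the essential feature is B -> B = 1.
M1 = np.array([[0.50, 0.0, 0.25],    # to A
               [0.25, 1.0, 0.25],    # to B (the 1 in the main diagonal)
               [0.25, 0.0, 0.50]])   # to C
print(source_entropy(M1))            # 0.0: the equilibrium is all at B,
                                     # whose column is determinate
```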



Ex. 2: (Continued.) Explain the paradox: "When the system is at A there is variety or uncertainty in the next state, so the entropy cannot be zero."



9/14. A little confusion has sometimes arisen because Shannon's measure of "entropy", given over a set of probabilities $p_1, p_2, \ldots$, is the sum $\sum_i p_i \log p_i$ multiplied by $-1$, whereas the definition given by Wiener in his Cybernetics for "amount of information" is the same sum $\sum_i p_i \log p_i$ unchanged (i.e. multiplied by $+1$). (The reader should notice that $p \log p$ is necessarily negative, so the multiplier $-1$ makes it a positive number.)
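The two conventions can be exhibited side by side in a few lines (the probabilities here are assumed for illustration):

```python
import math

p = [0.5, 0.25, 0.25]                         # any set of probabilities
wiener = sum(pi * math.log2(pi) for pi in p)  # Wiener: the sum unchanged
shannon = -wiener                             # Shannon: the sum times -1
print(wiener, shannon)                        # -1.5 and 1.5
# Each term p log p is negative, so the multiplier -1
# makes Shannon's entropy a positive number.
```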



There need however be no confusion, for the basic ideas are identical. Both regard information as "that which removes






