9/13 AN INTRODUCTION TO CYBERNETICS



Then the average entropy (per step in the sequence) is 



0.449 × 0.811 + 0.429 × 0.811 + 0.122 × 1.061 = 0.842 bits.



A coin spun repeatedly produces a series with entropy, at each spin, 

 of 1 bit. So the series of locations taken by one of the insects as 

 time goes on is not quite so variable as the series produced by a spun 

 coin, for 0.842 is less than 1.00. In this way Shannon's measure 

 enables different degrees of variety to be compared. 



The reason for taking a weighted average is that we start by 

 finding three entropies: 0.811, 0.811, and 1.061; and from them we 

 want one. Were they all the same we would obviously just use that 

 value, but they are not. We can, however, argue thus: When the 

 system has reached equilibrium, 45% of the insects will be at state B, 

 43% at W, and 12% at P. This is equivalent, as the insects circulate 

 between all the states, to saying that each insect spends 45% of its 

 time at B, 43% at W, and 12% at P. In other words, 45% of its 

 transitions will be from B, 43% from W, and 12% from P. Thus 

 45% of its transitions will be with entropy, or variety, of 0.811, 
 43% also with 0.811, and 12% with 1.061. Thus, transitions with an 
 entropy of 0.811 will be frequent (and the value "0.811" should 
 count heavily) and those with an entropy of 1.061 will be rather rare 
 (and the value "1.061" should count little). So the average is 
 weighted: 88% in favour of 0.811 and 12% in favour of 1.061, i.e. 



 weighted average = (45 × 0.811 + 43 × 0.811 + 12 × 1.061) / (45 + 43 + 12) 



which is, effectively, what was used above. 
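The weighting argument can be checked numerically. The Python sketch below (not part of the original text) uses the equilibrium fractions 0.449, 0.429, and 0.122 and the three per-state entropies; the two-term and three-term distributions shown are merely illustrative rows that reproduce the stated entropies of 0.811 and 1.061 bits, not necessarily the book's transition matrix.

```python
from math import log2

def entropy(ps):
    """Shannon entropy in bits of a complete probability distribution."""
    return -sum(p * log2(p) for p in ps if p > 0)

# Illustrative rows reproducing the per-state entropies quoted in the text:
# a (1/4, 3/4) row gives 0.811 bits; a (3/4, 1/8, 1/8) row gives 1.061 bits.
print(round(entropy([0.25, 0.75]), 3))           # 0.811
print(round(entropy([0.75, 0.125, 0.125]), 3))   # 1.061

# Weight each state's entropy by the fraction of time an insect spends there.
weights = [0.449, 0.429, 0.122]    # equilibrium fractions at B, W, P
entropies = [0.811, 0.811, 1.061]  # entropy of the transitions from each state
average = sum(w * h for w, h in zip(weights, entropies))
print(round(average, 2))  # 0.84 bits per step, matching the figure above
```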



Ex. 1: Show that the series of H's and T's produced by a spun coin has an average 
 entropy of 1 bit per spin. (Hint: Construct the matrix of transition 
 probabilities.) 



Ex. 2: (Continued.) What happens to the entropy if the coin is biased? (Hint: 

 Try the effect of changing the probabilities.) 
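As the hint suggests, the effect of bias can be seen by changing the probabilities; a minimal Python check (mine, not the book's):

```python
from math import log2

def entropy(ps):
    """Shannon entropy in bits of a complete probability distribution."""
    return -sum(p * log2(p) for p in ps if p > 0)

print(entropy([0.5, 0.5]))            # 1.0 bit per spin: the fair coin
print(round(entropy([0.9, 0.1]), 3))  # 0.469 bits: a heavily biased coin
```

Any bias moves the distribution away from (1/2, 1/2) and so lowers the entropy below 1 bit.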



9/13. Before developing the subject further, it is as well to notice 

 that Shannon's measure, and the various important theorems that 

 use it, make certain assumptions. These are commonly fulfilled in 

 telephone engineering but are by no means so commonly fulfilled 

 in biological work, and in the topics discussed in this book. His 

 measure and theorems must therefore be applied cautiously. His 

 main assumptions are as follows. 



(1) If applied to a set of probabilities, the various fractions must 
 add up to 1; the entropy cannot be calculated over an incomplete 
 set of possibilities. 
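A calculation can enforce this assumption before computing anything; a sketch in Python (a hypothetical guard of mine, not from the text):

```python
from math import log2, isclose

def entropy(ps):
    """Entropy in bits; refuses an incomplete set of possibilities."""
    if not isclose(sum(ps), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -sum(p * log2(p) for p in ps if p > 0)

print(entropy([0.5, 0.25, 0.25]))  # fine: the fractions add up to 1
# entropy([0.5, 0.25])             # raises ValueError: the set is incomplete
```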






