ENTROPY 



9/11. We have seen throughout S.7/5 and Chapter 8 how information cannot be transmitted in larger quantity than the quantity of variety allows. We have seen how constraint can lessen some potential quantity of variety. And we have just seen, in the previous section, how a source of variety such as a Markov chain has zero constraint when all its transitions are equally probable. It follows that this condition (of zero constraint) is the one that enables the information source, if it behaves as a Markov chain, to transmit the maximal quantity of information (in given time).



Shannon has devised a measure for the quantity of variety shown by a Markov chain at each step — the entropy — that has proved of fundamental importance in many questions relating to incessant transmission. This measure is developed in the following way.



If a set has variety, and we take a sample of one item from the set, by some defined sampling process, then the various possible results of the drawing will be associated with various, corresponding probabilities. Thus if the traffic lights have variety four, showing the combinations



1 Red
2 Red and Yellow
3 Green
4 Yellow,



and if they are on for durations of 25, 5, 25 and 5 seconds respectively, then if a motorist turns up suddenly at irregular times he would find the lights in the various states with frequencies of about 42, 8, 42 and 8% respectively. As probabilities these become 0.42, 0.08, 0.42 and 0.08. Thus the state "Green" has (if this particular method of sampling be used) a probability of 0.42; and similarly for the others.
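The arithmetic behind these figures: the whole cycle lasts 25 + 5 + 25 + 5 = 60 seconds, so "Red" is showing for 25/60, about 0.42, of the time, and "Red and Yellow" for 5/60, about 0.08; the same two fractions recur for "Green" and "Yellow".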



Conversely, any set of probabilities — any set of positive fractions that adds up to 1 — can be regarded as corresponding to some set whose members show variety. Shannon's calculation proceeds from the probabilities by the calculation, if the probabilities are $p_1, p_2, \ldots, p_n$, of

$$ -p_1 \log p_1 \;-\; p_2 \log p_2 \;-\; \ldots \;-\; p_n \log p_n, $$
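In modern terms the calculation can be sketched as a short program; the function name, the choice of Python, and the default base are illustrative choices, not Shannon's:

    import math

    def entropy(probs, base=10):
        # Shannon's H: the negative sum of p log p taken over the set
        # of probabilities. A term with p = 0 is dropped, since p log p
        # tends to 0 as p tends to 0.
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return -sum(p * math.log(p, base) for p in probs if p > 0)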



a quantity which he calls the entropy of the set of probabilities and which he denotes by H. Thus if we take logs to the base 10, the entropy of the set associated with the traffic lights is

$$ -0.42 \log_{10} 0.42 \;-\; 0.08 \log_{10} 0.08 \;-\; 0.42 \log_{10} 0.42 \;-\; 0.08 \log_{10} 0.08 $$

which equals 0.492. (Notice that $\log_{10} 0.42 = \bar{1}.6232 = -1.0000 + 0.6232 = -0.3768$.)
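Run against the traffic-light probabilities, the sketch above reproduces the figure in the text:

    H = entropy([0.42, 0.08, 0.42, 0.08], base=10)
    print(round(H, 3))   # prints 0.492

Note that 0.492 is less than log₁₀ 4 (about 0.602), the entropy when all four states are equally probable, an instance of the opening remark that zero constraint permits the maximal quantity of variety.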






