INCESSANT TRANSMISSION 9/12 



+ 0.6232 = −0.3768; so the first term is (−0.42)(−0.3768), which
is + 0.158; and similarly for the other terms.) Had the logs been
taken to the base 2 (S.7/7) the result would have been 1.63 bits.
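The arithmetic above can be checked mechanically. A minimal sketch in Python (the probability 0.42 is the only member of the set quoted on this page, so only the first term is reproduced; the `entropy` function is a general-purpose helper, not taken from the text):

```python
import math

def entropy(ps, base=10):
    """Shannon entropy H = -sum(p * log p); the base fixes the unit."""
    return -sum(p * math.log(p, base) for p in ps if p > 0)

# The first term worked above: log10(0.42) = 0.6232 - 1 = -0.3768,
# so -p log p = -(0.42)(-0.3768) = +0.158.
first_term = -0.42 * math.log10(0.42)
print(round(first_term, 3))   # 0.158, as in the text

# A change of base merely rescales the result:
# H in bits = H in decimal units / log10(2).
```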



The word "entropy" will be used in this book solely as it is used 

 by Shannon, any broader concept being referred to as "variety" 

 or in some other way. 



Ex. 1: On 80 occasions when I arrived at a certain level-crossing it was closed on 14. What is the entropy of the set of probabilities?

Ex. 2: From a shuffled pack of cards one is drawn. Three events are distinguished:

E1: the drawing of the King of Clubs,
E2: the drawing of any Spade,
E3: the drawing of any other card.

What is the entropy of the variety of the distinguishable events?

Ex. 3: What is the entropy of the variety in one throw of an unbiased die?

Ex. 4: What is the entropy in the variety of the set of possibilities of the outcomes (with their order preserved) of two successive throws of an unbiased die?

Ex. 5: (Continued.) What is the entropy of n successive throws?

*Ex. 6: What is the limit of −p log p as p tends to zero?
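Several of these exercises can be checked numerically. A sketch for Ex. 1 and Ex. 3 (the probabilities are those stated in the exercises; the remaining exercises follow the same pattern):

```python
import math

def entropy_bits(ps):
    """Entropy in bits: -sum(p * log2 p), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Ex. 1: the crossing was closed on 14 of 80 occasions,
# so the probabilities are 14/80 and 66/80.
print(round(entropy_bits([14/80, 66/80]), 3))   # about 0.669 bits

# Ex. 3: an unbiased die has six equally probable outcomes,
# so the entropy is log2(6).
print(round(entropy_bits([1/6] * 6), 3))        # about 2.585 bits
```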



9/12. The entropy so calculated has several important properties. First, it is maximal, for a given number (n) of probabilities, when the probabilities are all equal. H is then equal to log n, precisely the measure of variety defined in S.7/7. (Equality of the probabilities, in each column, was noticed in S.9/10 to be necessary for the constraint to be minimal, i.e. for the variety to be maximal.) Secondly, different H's derived from different sets can, with suitable qualifications, be combined to yield an average entropy.
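The first property is easy to verify numerically. A sketch checking that equal probabilities give H = log n, and that randomly chosen unequal distributions never exceed it (n = 4 here is an arbitrary choice):

```python
import math
import random

def entropy_bits(ps):
    """Entropy in bits: -sum(p * log2 p), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

n = 4
h_equal = entropy_bits([1/n] * n)
# With all probabilities equal, H = log n exactly.
assert abs(h_equal - math.log2(n)) < 1e-12

# Random unequal distributions over n states never exceed log n.
random.seed(0)
for _ in range(1000):
    ps = [random.random() for _ in range(n)]
    total = sum(ps)
    ps = [p / total for p in ps]          # normalise to sum to 1
    assert entropy_bits(ps) <= h_equal + 1e-12
```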



Such a combination is used to find the entropy appropriate to a Markov chain. Each column (or row if written in the transposed form) has a set of probabilities that sum to 1. Each can therefore provide an entropy. Shannon defines the entropy (of one step of the chain) as the average of these entropies, each being weighted by the proportion in which that state, corresponding to the column, occurs when the sequence has settled to its equilibrium (S.9/6). Thus the transition probabilities of that section, with corresponding entropies and equilibrial proportions shown below, are



[Table: the transition probabilities of S.9/6, with the entropy and equilibrial proportion of each column.]



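Shannon's definition can be sketched in code. The matrix below is a hypothetical two-state example, not the transition table of S.9/6: columns are "from" states and each column sums to 1, following the text's convention. The equilibrial proportions are found by letting the chain settle, and the entropy of one step is the weighted average of the column entropies:

```python
import math

def entropy_bits(ps):
    """Entropy in bits: -sum(p * log2 p), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Hypothetical transition matrix (NOT the one of S.9/6).
# P[i][j] = probability of moving to state i, given the chain is in state j;
# each column sums to 1.
P = [[0.5, 0.2],
     [0.5, 0.8]]
n = 2

# Equilibrial proportions: iterate the chain until the distribution settles.
pi = [1/n] * n
for _ in range(1000):
    pi = [sum(P[i][j] * pi[j] for j in range(n)) for i in range(n)]

# Entropy of one step: each column's entropy, weighted by the equilibrial
# proportion of the state that column corresponds to.
col_entropies = [entropy_bits([P[i][j] for i in range(n)]) for j in range(n)]
H = sum(pi[j] * col_entropies[j] for j in range(n))
print(round(H, 4))   # about 0.8014 bits for this example matrix
```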



