INCESSANT TRANSMISSION 






Ex. 2: Printed English has an entropy of about 10 bits per word. We can read about 200 words per minute. Give a lower bound to the channel capacity of the optic nerve.
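The bound asked for is simple arithmetic: the information rate delivered through reading cannot exceed the capacity of the channel carrying it. A sketch of the computation, using the figures given in the exercise:

```python
# Lower bound on the channel capacity of the optic nerve (Ex. 2):
# reading delivers 10 bits/word at 200 words/minute, so the nerve
# must carry at least that information rate.
bits_per_word = 10
words_per_minute = 200

bits_per_minute = bits_per_word * words_per_minute   # 2000 bits/min
bits_per_second = bits_per_minute / 60               # ~33.3 bits/s

print(f"Lower bound: {bits_per_minute} bits/min = {bits_per_second:.1f} bits/s")
```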



Ex. 3: If a pianist can put each of ten fingers on any one of three notes, and can do this 300 times a minute, find a lower bound to the channel capacity of the nerves to the upper limbs.
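Here each "chord" selects one of 3^10 equally available configurations, so it carries 10 log2 3 bits; multiplying by the rate gives the bound. A sketch:

```python
import math

# Lower bound on the channel capacity of the nerves to the upper
# limbs (Ex. 3): each placement chooses among 3^10 configurations.
fingers = 10
notes_per_finger = 3
placements_per_minute = 300

bits_per_placement = fingers * math.log2(notes_per_finger)   # ~15.85 bits
bits_per_minute = bits_per_placement * placements_per_minute # ~4755 bits/min

print(f"{bits_per_placement:.2f} bits/placement, "
      f"{bits_per_minute:.0f} bits/min")
```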



Ex. 4: A bank's records, consisting of an endless sequence of apparently random digits, 0 to 9, are to be encoded into Braille for storage. If 10,000 digits are to be stored per hour, how fast must the Braille be printed if optimal coding is used? (Hint: There are 64 symbols in the Braille "alphabet".)



9/18. One more example will be given, to show the astonishing power that Shannon's method has of grasping the essentials in communication. Consider the system of states a, b, c, d, with transition probabilities



A typical sequence would be 



...bbbcabcabbcddacdabcacddddddabb... 



The equilibrial probabilities are 6/35, 9/35, 6/35, 14/35 respectively. The entropy is soon found to be 0.92 bits per letter. Now suppose that the distinction between a and d is lost, i.e. code by



a b c d
↓ ↓ ↓ ↓
X b c X
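The entropy figure of 0.92 bits per letter is the entropy rate of a Markov source: each state's transition entropy, weighted by the equilibrial probability of being in that state. The transition table itself did not survive in this copy, so the sketch below applies the calculation to a hypothetical two-state matrix of my own, purely to show the method:

```python
import math

def equilibrium(P, iters=1000):
    """Equilibrial (stationary) distribution of transition matrix P,
    where P[i][j] is the probability of moving from state i to state j,
    found by repeated multiplication from a uniform start."""
    n = len(P)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p

def entropy_rate(P):
    """Entropy in bits per letter: the transition entropy of each
    state, weighted by its equilibrial probability."""
    p = equilibrium(P)
    return sum(p[i] * sum(-q * math.log2(q) for q in row if q > 0)
               for i, row in enumerate(P))

# Hypothetical illustration (NOT the matrix from the text):
P = [[0.50, 0.50],
     [0.25, 0.75]]
print(f"{entropy_rate(P):.3f} bits per letter")
```

Given Ashby's four-state table, the same two functions would reproduce the equilibrial probabilities 6/35, 9/35, 6/35, 14/35 and the 0.92 bits per letter quoted above.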



Surely some information must be lost? Let us see. There are now only three states X, b, c, where X means "either a or d". Thus the previous message would now start ...bbbcXbcXbbcXXXcXXbcXcXXXXXXXbb... . The transition probabilities are found to be



(Thus c → X must be 1 because c always went to either a or d; the transitions from a and from d need weighting by the (equilibrial) probabilities of being at a or d.) The new states have equilibrial
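The weighting rule in the parenthesis can be stated generally: when a and d are merged into X, the probability of a transition from X to any state is the average of the transitions from a and from d, weighted by their equilibrial probabilities, while transitions *into* a and d simply add. A sketch of that rule, applied to a hypothetical matrix of my own (not the book's table), chosen so that c always goes to a or d, matching the remark above:

```python
def merge_states(P, p_eq, a, d):
    """Merge states a and d (row/column indices) of transition matrix
    P into one state X, which takes a's slot.  Outgoing transitions
    from X are the average of rows a and d, weighted by the
    equilibrial probabilities p_eq; incoming columns a and d add."""
    n = len(P)
    keep = [k for k in range(n) if k != d]
    w = p_eq[a] + p_eq[d]
    Q = []
    for i in keep:
        if i == a:  # X's outgoing row: equilibrium-weighted average
            row = [(p_eq[a] * P[a][k] + p_eq[d] * P[d][k]) / w
                   for k in range(n)]
        else:
            row = list(P[i])
        # fold column d into column a (now X), then drop column d
        Q.append([row[k] + (row[d] if k == a else 0.0) for k in keep])
    return Q

# Hypothetical 4-state matrix (states a, b, c, d), equilibrium uniform:
P = [[0.0, 0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5, 0.0],
     [0.5, 0.0, 0.0, 0.5],
     [0.5, 0.0, 0.0, 0.5]]
Q = merge_states(P, [0.25, 0.25, 0.25, 0.25], 0, 3)
print(Q[2])  # row for c in the merged chain: c -> X with probability 1
```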






