INCESSANT TRANSMISSION 9/17 



arbitrariness. (Compare S.6/14.) There is therefore every temptation to let one's grasp of the set under discussion be intuitive and vague rather than explicit and exact. The reader may often find that some intractable contradiction between two arguments will be resolved if a more accurate definition of the set under discussion is achieved; for often the contradiction is due to the fact that the two arguments are really referring to two distinct sets, both closely associated with the same object or organism.



Ex. 1: In a Table for the identification of bacteria by their power to ferment sugars, 62 species are noted as producing "acid", "acid and gas", or "nothing" from each of 14 sugars. Each species thus corresponds to a vector of 14 components, each of which can take one of three values. Is the set redundant? To how many components might the vector be reduced?
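The counting behind the exercise can be sketched in a few lines. This is one way to check the answer, not part of the original text: it compares the variety available in a 14-component, three-valued vector with the 62 species actually present, and finds the least number of such components that could still distinguish them.

```python
import math

# Each species is a vector of 14 components, each taking one of three
# values ("acid", "acid and gas", "nothing").
n_species = 62
n_components = 14
n_values = 3

# Variety available in the full vector:
full_variety = n_values ** n_components        # 3**14 = 4782969

# The set is redundant if the species use far fewer vectors than are possible:
redundant = n_species < full_variety

# Smallest number of three-valued components that could still distinguish
# 62 species: the least k with 3**k >= 62.
k = math.ceil(math.log(n_species, n_values))   # 3**3 = 27 < 62 <= 81 = 3**4

print(full_variety, redundant, k)
```

Since only 62 of the 4,782,969 possible vectors occur, the set is highly redundant, and four three-valued components would already suffice in principle.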



Ex. 2: If a Markov chain has no redundancy, how may its matrix be recognised at a glance?
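A small numerical check of the idea behind the exercise (a sketch, not the book's answer, and assuming each row of the matrix holds the transition probabilities from one state): a chain has no redundancy only when its entropy per step is maximal, which requires every transition to be equally probable, so every entry of the matrix is the same.

```python
import math

def entropy(ps):
    # Shannon entropy, in bits, of a probability distribution.
    return -sum(p * math.log2(p) for p in ps if p > 0)

n = 3
# A matrix in which every transition is equally probable: each row is
# uniform, so each row attains the maximal entropy log2(n) and the
# chain carries no redundancy.
uniform = [[1 / n] * n for _ in range(n)]
row_entropies = [entropy(row) for row in uniform]
print(row_entropies)  # each row's entropy equals log2(3), about 1.585 bits
```

Any departure from equality in the entries lowers the entropy below log2(n) and so introduces redundancy.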



9/17. It is now possible to state what is perhaps the most fundamental of the theorems introduced by Shannon. Let us suppose that we want to transmit a message with H bits per step, as we might want to report on the movements of a single insect in the pool. H is here 0.84 bits per step (S.9/12), or, as the telegraphist would say, per symbol, thinking of such a series as ... P W B W B B B W P P P W B W P W .... Suppose, for definiteness, that 20 seconds elapse between step and step. Since the time-rate of these events is now given, H can also be stated as 2.53 bits per minute. Shannon's theorem then says that any channel with this capacity can carry the report, and that it cannot be carried by any channel with less than this capacity. It also says that a coding always exists by which the channel can be so used.
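The conversion from bits per step to bits per minute is simple arithmetic, sketched below. The per-step figure of 0.84 bits is the rounded value from S.9/12; the quoted 2.53 bits per minute suggests the unrounded entropy is slightly higher, so the product here agrees only up to rounding.

```python
# Converting the per-step entropy to a per-minute rate.
H_per_step = 0.84                         # bits per step (rounded, from S.9/12)
seconds_per_step = 20
steps_per_minute = 60 / seconds_per_step  # 3 steps per minute

H_per_minute = H_per_step * steps_per_minute
print(round(H_per_minute, 2))  # 2.52, matching the quoted 2.53 up to rounding
```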



It was, perhaps, obvious enough that high-speed channels could report more than slow; what is important about this theorem is, first, its great generality (for it makes no reference to any specific machinery, and therefore applies to telegraphs, nerve-fibres, conversation, equally) and secondly its quantitative rigour. Thus, if the pond were far in the hills, the question might occur whether smoke signals could carry the report. Suppose a distinct puff could be either sent or not sent in each quarter-minute, but not faster. The entropy per symbol is here 1 bit, and the channel's capacity is therefore 4 bits per minute. Since 4 is greater than 2.53, the channel can do the reporting, and a code can be found, turning positions to puffs, that will carry the information.
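The feasibility check in the smoke-signal example can be written out directly: the channel's capacity is the number of binary symbols per minute, and Shannon's condition is simply that this capacity exceed the source's rate.

```python
# Checking the smoke-signal channel against Shannon's condition.
bits_per_symbol = 1.0      # each quarter-minute a puff is either sent or not
symbols_per_minute = 4     # one symbol per quarter-minute, no faster
capacity = bits_per_symbol * symbols_per_minute  # 4 bits per minute

required_rate = 2.53       # bits per minute needed to carry the report
feasible = capacity > required_rate
print(capacity, feasible)  # 4.0 True
```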



Shannon has himself constructed an example which shows exquisitely the exactness of this quantitative law. Suppose a source





