The Domain of Information Theory in Biology 189 



in the transmitter, in the medium and, finally, in the receiver. It can thus be stated as follows: a source cannot transmit more information than it has; a receiver cannot register more information than it can display. This sounds trivial, but the point is that information contents can be precisely estimated in ways which are not trivial. The representation theorem implies that it is possible to establish an upper bound on the flow of information simply by investigating the terminals. It is, thus, a one-sided conservation principle; being one-sided, it is not as strong as the two-sided conservation principles which are so commonly used in physics. It becomes stronger in situations where one may assume that the inequality approaches an equality.
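In later notation, this one-sided principle can be paraphrased as the inequality I(X;Y) ≤ min(H(X), H(Y)): the information shared between the terminals never exceeds the entropy of either terminal alone. A minimal numerical sketch, assuming a hypothetical noisy binary channel chosen purely for illustration:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y): a source emits 0 or 1 with
# equal probability, and each symbol is flipped in transit with
# probability 0.1 (an assumed channel, not from the text).
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

px = [sum(p for (x, _), p in joint.items() if x == s) for s in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == s) for s in (0, 1)]

h_x = entropy(px)                      # information the source has
h_y = entropy(py)                      # information the receiver can display
h_xy = entropy(list(joint.values()))   # joint entropy of both terminals
mutual = h_x + h_y - h_xy              # information actually shared

# The one-sided bound: transmitted information never exceeds what
# either terminal alone can account for.
assert mutual <= min(h_x, h_y)
print(f"H(X) = {h_x:.3f}, H(Y) = {h_y:.3f}, I(X;Y) = {mutual:.3f} bits")
```

Investigating only the terminals (the two marginal entropies) already bounds the flow; the joint behaviour of source and receiver is needed only to turn the inequality into an equality.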



There are two conditions which are conducive to the establishment of full conservation of information: one, that information is a valuable and critical commodity, and two, that noise can be minimized. The concept that information is the most precious commodity for living things has been formulated strikingly by Schroedinger in his assertion that 'living things feed on orderliness' — that they feed because they need fresh supplies of orderliness, not of energy or matter (8). The need for fresh supplies of orderliness presupposes that orderliness is somewhere lost, that is, that noise is present. This, however, does not mean that noise is present everywhere. Some processes may occur in 'clockwork fashion', without loss of information. That is the case which Schroedinger classifies as 'generation of order from order'. He suspects that each individual act of transmission of genetic information from parent to offspring occurs without serious loss of information. This idea agrees with the current (Watson-Crick) model of DNA duplication; it recurs in Gamow's and Ycas' models of information transmission from genetic to somatic material (9).
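In information-theoretic terms, 'generation of order from order' corresponds to a noiseless channel: when the rule taking input to output is deterministic and invertible, no information is lost. A small sketch, using Watson-Crick base pairing only as an illustrative deterministic rule:

```python
import math
from collections import Counter

def entropy_bits(seq):
    """Shannon entropy in bits of the symbol distribution of a sequence."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

# 'Clockwork' duplication: a deterministic, invertible pairing rule
# loses no information (the template sequence is hypothetical).
pairing = {"A": "T", "T": "A", "G": "C", "C": "G"}

template = "ATGCGGATTC"
copy = "".join(pairing[base] for base in template)

# The copy determines the template exactly, so its entropy equals
# that of the template: order begets order.
assert math.isclose(entropy_bits(copy), entropy_bits(template))
assert "".join(pairing[b] for b in copy) == template  # invertible
```

The essential point is not the biochemistry but the invertibility of the mapping: any loss of invertibility would make the rule noisy in the sense of the next section.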



3. The Noise-and-Redundancy Theorem



Information transfer from one body of information to another does not often proceed with clockwork regularity. As a rule, interferences occur which will more or less affect the process of information interaction. Interference can be of many kinds: the worst kind of interference is one whose results are not predictable in detail. In this case, some information will be irretrievably lost. However, in general some but not all order is lost. It is one of the most significant results of information theory to have shown that order and disorder can be measured by a common yardstick. Hence, it is possible to investigate the quantitative relations between total information, noise, and remaining orderliness. The second basic theorem of information theory states that the amount of information effectively transmitted is exactly the amount of information transmitted minus the amount of information lost because of noise. This implies that a source can transmit a certain amount of information reliably in the presence of noise provided it transmits more than the desired amount of information. This surplus must be distributed over the whole activity because it is never known which portions of the total activity will be interfered with by noise; necessarily, the surplus takes the form of redundant information. Thus, the second fundamental theorem states precisely the relation between the amount of information to be transmitted, the amount of information which will be lost through noise, and the amount of redundant information needed to make up the loss. Like the first fundamental theorem, it is a one-sided conservation principle; it limits


