190 Henry Quastler 



the amount of order which can prevail in an 'order-from-disorder' situation. 

 Again, the one-sided conservation principle will become more powerful if it can 

 be assumed to approximate a two-sided conservation. However, very stringent 

 conditions must be fulfilled if one expects to use the second theorem. There is 

 some reason to believe that these conditions are at least approximated in some 

 biological situations; this is stated in Dancoff's principle (10). 



Dancoff's principle deals with the economics of information. In 'noisy' 

 situations, information is lost and errors will occur unless they are checked 

 by redundant information. Now, errors may be costly, but so is redundant 

 information; accordingly, the optimum amount of redundant information 

 will be not that which makes all errors vanish, but that which minimizes the 

 sum of the cost of errors plus the cost of redundant information, plus the cost — 

 in information units — of error checking. Dancoff's principle asserts that any 

 organism or organization which has gone through competitive evolution has 

 approximated such an optimum; that is, it will commit as many errors as it 

 can get away with, and use the minimum of redundant information needed 

 to hold errors to this level. It follows from Dancoff's principle that the amount 

 of redundant information in a system is bound to be limited, even if it is a 

 system of enormous information content like a living thing. This is of great 

 interest particularly in radiobiology, because what radiation does very effectively 

 is to destroy information. 
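The trade-off Dancoff's principle describes can be made concrete with a small numerical sketch. Everything below is hypothetical and chosen only to make the cost structure visible: the error rate is assumed to fall off exponentially with the number of redundant check bits, while redundancy and checking are given linear costs.

```python
import math

def total_cost(r, cost_per_error=100.0, cost_per_check_bit=1.0,
               checking_overhead=0.5, base_error_rate=0.2):
    """Total cost as a function of redundancy r (all figures illustrative)."""
    errors = base_error_rate * math.exp(-r)  # errors suppressed by redundancy
    return (cost_per_error * errors          # cost of the errors that remain
            + cost_per_check_bit * r         # cost of carrying redundant bits
            + checking_overhead * r)         # cost, in information, of checking

# The optimum is not at r -> infinity (no errors at all) but at a finite r,
# where the marginal saving on errors balances the marginal cost of redundancy.
best_r = min(range(50), key=lambda r: total_cost(r))
```

With these particular constants the minimum falls at a small but nonzero `best_r`: the system "commits as many errors as it can get away with," exactly as the principle asserts.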



4. The Estimation of Information Measures and the Search for Invariants 



It may well turn out that the qualitative and semi-qualitative applications 

 of information concepts are going to be the most important contribution of 

 information theory to biology. But, even successful qualitative applications 

 have very little power in excluding the possibility that other sets of concepts 

 could have been used just as successfully; besides, all scientists like to take 

 measures. Thus, the problem arises of estimating information measures 

 associated with biological structures and functions. 



One fundamental difficulty appears immediately: information measures 

 are relative and not absolute; hence, any information measure associated with 

 a given set of biological objects will depend on the set itself and on the scientist 

 who does the estimating. To be sure, one can establish objective bounds. 

 Thus, if a certain genetic locus is known to be capable of having thirty-two 

 distinct allelic states, which are transmitted to the offspring with equal prob- 

 ability given the proper conditions, then the information stored in this locus 

 cannot be less than five bits. If it is also known that the region containing 

 the locus under consideration comprises no more than, say, 20,000 atoms, 

 then the total information stored cannot be more than about 60,000 bits (10). 

 These brackets are safe, but they are too wide to be of interest. They can be 

 very much reduced if one introduces specific assumptions. For instance, if 

 the locus is known to contain no more than, say, 2 X 50 nucleic acid residues, 

 and if one assumes that the genetic information is completely coded in the 

 sequence of the residues on one strand of a double helix, with the information 

 carried by each residue corresponding to unconstrained selection from four 

 possibilities, then the upper bound is reduced to 100 bits — but its validity is 

 less absolute. 
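The arithmetic behind these brackets can be spelled out directly. The figures are those quoted in the text; the capacity of about 3 bits per atom is an assumed value, used here only to match the quoted 60,000-bit upper bound.

```python
import math

# Lower bound: 32 equiprobable allelic states transmitted without loss.
lower_bound = math.log2(32)            # = 5 bits

# Loose upper bound: 20,000 atoms at an assumed ~3 bits per atom.
upper_bound_atoms = 20_000 * 3         # ~ 60,000 bits

# Tighter (but assumption-laden) upper bound: one strand of 50 residues,
# each an unconstrained choice among four nucleotides.
upper_bound_sequence = 50 * math.log2(4)  # = 100 bits
```

The contrast between the 60,000-bit and 100-bit bounds shows the point of the passage: the safe bracket is too wide to be interesting, and narrowing it requires assumptions whose validity is less absolute.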



