194 Henry Quastler 



approach this bound. One cannot transmit information through a channel 

 at a rate higher than the channel capacity, but it is very easy to transmit at a 

 lower rate. For instance, the human capacity for transmitting information can 

 be limited on the input side, on the output side, or centrally; if the limitation 

 is central, then it can be due (a) to the limited channel capacity, but also to 

 limitations of (b) the rate at which discrete acts of information-processing 

 can be performed, of (c) the amount of information per single act, of (d) the 

 number of information-carrying components considered in each act, of (e) 

 the maximum amount of information per component, or finally, (f) to inefficient 

 coding (14). Parallel situations are likely to exist in molecular biology. For 

 instance, Augenstine (18) discusses the fact that the amount of information 

 which can be coded into an amino acid sequence is considerably greater than 

 the amount of information needed to account for the functional specificity 

 of a protein. This could mean that the channel capacity is only fractionally 

 utilized, or that functional specificity is coded in an entirely different fashion. 
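The fractional utilization that Augenstine's observation suggests can be illustrated with a rough back-of-the-envelope calculation. This is a hedged sketch only: the 100-residue length and the 100-bit specificity figure are illustrative assumptions, not values from the text.

```python
import math

def sequence_capacity_bits(length, alphabet_size=20):
    """Maximum information (in bits) encodable in a sequence drawn
    from an alphabet of the given size: length * log2(alphabet_size).
    For amino acid sequences the alphabet has 20 symbols."""
    return length * math.log2(alphabet_size)

# A hypothetical protein of 100 residues over the 20 amino acids:
capacity = sequence_capacity_bits(100)  # about 432 bits

# Suppose (purely for illustration) that functional specificity
# requires only ~100 bits; the coding capacity is then used only
# fractionally, as in the situation Augenstine discusses.
needed = 100.0
utilization = needed / capacity  # well below 1
```

The point of the sketch is only that the first quantity can greatly exceed the second, leaving open whether the surplus is unused capacity or specificity coded in some other fashion.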



4. Information Measures and Other Aspects of Systems 



If the mechanism of a reaction is known, then the probabilities of all input- 

 output associations can be computed, and the information measures derived 

 from them. On the other hand, an information measure does not define a 

 single mechanism — however, it imposes a condition with which input-output 

 tables and, by implication, mechanisms have to comply. For instance, in 

 the problem of the DNA-protein code studied by Gamow and Ycas (9), 

 the informational analysis furnishes conditions which the code must fulfill 

 but does not yield the code itself. Accordingly, the informational analysis 

 has served, repeatedly, to reject a proposed mechanism. It can, of course, 

 never be used to prove a mechanism. 
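The relation between an input-output table and the information measure derived from it can be sketched as follows. This is a minimal illustration under assumed data: the two toy probability tables are inventions for the example, not figures from the text.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, computed from a joint
    probability table joint[x][y] over inputs x and outputs y."""
    px = [sum(row) for row in joint]            # marginal of inputs
    py = [sum(col) for col in zip(*joint)]      # marginal of outputs
    info = 0.0
    for x, row in enumerate(joint):
        for y, p in enumerate(row):
            if p > 0:
                info += p * math.log2(p / (px[x] * py[y]))
    return info

# A noiseless two-symbol association: each input fixes the output,
# so the measure equals the full 1 bit carried by the input.
noiseless = [[0.5, 0.0],
             [0.0, 0.5]]

# Output independent of input: the measure is 0 bits. Note that many
# different mechanisms would yield this same table -- the measure
# constrains mechanisms but does not single one out.
noisy = [[0.25, 0.25],
         [0.25, 0.25]]
```

The second table makes the text's point concrete: a given value of the measure is a condition that candidate input-output tables, and hence mechanisms, must satisfy, but it never identifies a unique mechanism.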



Amount of information is in general related to the utility of being informed 

 — but the relation is not necessarily one of simple proportionality; in fact, 

 the utility of information is not always a monotonically increasing function 

 of its amount. Similarly, the information content of a structure is in general 

 related to the difficulty of construction, but the relation is not one of simple 

 proportionality. 



The 'amount of information' in a statement is related to its capacity of 

 carrying semantic information, but this capacity is rarely fully utilized (23). 



III. CONCLUSION 



I have tried to outline some of the applications and possible applications, 

 and I hope to have shown that there is much promise in this field. I have tried 

 to outline some of the limitations of applying information theory, and I 

 hope to have shown that they are not serious, provided one is always aware 

 of them. To make more progress, we need much more mathematical work, 

 and we need very much more experimental work. In looking over the past of 

 information theory in biology, a very strong emphasis on theory, more or 

 less rigorous, is obvious; although more theory is needed, the most pressing 

 need is now for a large body of good specific experiments. Also, it should be 

 rewarding to examine closely other related possibilities in theoretical biology. 



