


The information has been reduced by the "noise" in the system. 
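How much the noise costs can be made concrete with a small sketch, not from the original text, that assumes the simplest model of a noisy line: a binary symmetric channel that flips each transmitted symbol with probability e.

```python
import math

def binary_entropy(e):
    """H2(e): entropy in bits of a binary choice with probability e."""
    if e in (0.0, 1.0):
        return 0.0
    return -e * math.log2(e) - (1.0 - e) * math.log2(1.0 - e)

# Information actually delivered per binary symbol when each symbol
# is flipped with probability e (binary symmetric channel).  A clean
# line (e = 0) delivers 1 bit per symbol; noise reduces this to
# 1 - H2(e) bits, and at e = 0.5 nothing gets through at all.
for e in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {e:4}: {1.0 - binary_entropy(e):.3f} bits/symbol")
```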



The foregoing examples are admittedly oversimplified and not very practical. Nonetheless, they illustrate the meaning of the word "information" as used in information theory. A somewhat less simplified picture of any system can be represented schematically as shown in Figure 1.

[Figure 1. Schematic diagram of an information system: Originator → Coder → Transmission line → Decoder → Receiver.]

For a telegraph, the meaning of the boxes is obvious. For the process of hearing, the originator might be a piano player. The coder would be the piano. The transmission line represents the air. The decoder is the ear of the listener, and the receiver is his central nervous system. Similar analogies can be made for the synthesis of proteins, the genetic processes, vision, and so on.



In the example illustrated by Table III, vastly different amounts of information were received from one second to the next. Information theory calls the impulses received in a given period of time the message. In a noiseless system, the information in a given message is



$$h_i = -\log_2 p_i \tag{2}$$

where $p_i$ is the probability of the $i$-th message. The average information per message $H$ is then

$$H = \sum_{i=1}^{M} p_i h_i \tag{3}$$

or, using Equation 2,

$$H = -\sum_{i=1}^{M} p_i \log_2 p_i \tag{4}$$
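Equations 2 through 4 can be checked numerically. The sketch below, added for illustration with made-up probabilities for a four-message ensemble, computes $h_i$ for each message and the average information per message $H$:

```python
import math

# Made-up probabilities p_i for an ensemble of four possible messages;
# they must sum to 1.
p = [0.5, 0.25, 0.125, 0.125]

# Equation 2: information carried by the i-th message, h_i = -log2(p_i).
h = [-math.log2(p_i) for p_i in p]

# Equations 3 and 4: average information per message, H = sum(p_i * h_i).
H = sum(p_i * h_i for p_i, h_i in zip(p, h))

for p_i, h_i in zip(p, h):
    print(f"p = {p_i:<6}  h = {h_i:.3f} bits")
print(f"H = {H:.3f} bits per message")   # 1.750 bits for these p_i
```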



The last equation is very similar to the statistical-mechanics definition of entropy, except for a minus sign. Accordingly, the average information in a message H is often called negative entropy.



There are a number of other terms used in the "jargon" of information theory, some of which are included here for completeness.



A. Stochastic Process 



This is a process which "generates" symbols, for example, words or amino acids, in a random fashion, but in which the frequency of occurrence (that is, probability) approaches a limiting value as the number of symbols is increased. For example, a stochastic process is tossing a
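The limiting-frequency property is easy to demonstrate with a short simulation, added here as an illustration; the fair coin is an assumed example:

```python
import random

random.seed(1)   # make the run repeatable

# A simple stochastic process: generate the symbols heads/tails at
# random and watch the frequency of heads approach its limiting
# value (0.5 for a fair coin) as the number of symbols grows.
heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(f"{n:>7} tosses: frequency of heads = {heads / n:.4f}")
```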



