


Information and Negentropy 



This brings us to the negentropy principle of information. What is information? We shall follow Brillouin's analysis. One considers a problem involving a certain number of possible answers when no information is available. When some information is gained, the number of possible answers is reduced, and "complete information" means that only one possible answer remains. Information is a function of the ratio of the number of possible answers before and after.



The initial situation is

I₀ = 0,

with P₀ possible outcomes, a priori equally probable.

The final situation is

I₁ ≠ 0,

with P₁ = 1, that is to say, a single outcome selected.

The information I₁ is

I₁ = K ln P₀,

where K is a constant that fixes the unit of measure.



This definition of information is based on scarcity. The lower the probability, the higher the scarcity and the higher the information.
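
To make the definition concrete, here is a minimal numerical sketch. The function name and the choice K = 1/ln 2, which yields the familiar "bit" as the unit, are illustrative assumptions, not part of the text:

    from math import log

    def information(p_before, p_after=1, K=1/log(2)):
        """Brillouin-style information gain: I = K * ln(P0 / P1).

        With K = 1/ln 2 the result is in bits; p_after = 1 is the
        "complete information" case of a single remaining answer.
        """
        return K * log(p_before / p_after)

    # Narrowing 8 equally possible answers down to 1:
    print(information(8))       # 3.0 bits
    # Partial information, narrowing 8 answers down to 2:
    print(information(8, 2))    # 2.0 bits

The ratio form makes the text's point visible: what counts is the reduction in the number of possible answers, not their absolute count.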



In 1929, Leo Szilard discovered a connection between information and entropy, and it has since been demonstrated that information corresponds to negative entropy.



The similarity between the two formulas of entropy 



S = k ln P



and information 



I₁ = K ln P₀



is obvious. If the Boltzmann constant is used, information is measured in entropy units:

I₁ = 1.38 × 10⁻¹⁶ ln P₀
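
As a quick check on the size of these units (the binary-choice example here is mine, not the text's), a single yes-or-no selection, P₀ = 2, carries k ln 2 of negentropy:

    from math import log

    k = 1.38e-16      # Boltzmann constant, erg per degree (CGS), as in the text

    # One bit of information, expressed in entropy units:
    print(k * log(2))  # ~0.96e-16, i.e. roughly 10^-16 entropy units per bit

This is why even very large amounts of information correspond to minute amounts of entropy.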



The information of any specific organization can thus be expressed in entropy units. For example, the negentropy of a telephone network with 10⁸ subscribers can be calculated to be about 4 × 10⁻¹⁵ entropy units.
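
The text does not show the counting behind this figure. One plausible reconstruction, on the assumption that P₀ is the number of distinct pairs of subscribers that could be connected, lands at the same order of magnitude:

    from math import log

    k = 1.38e-16                 # erg per degree
    subscribers = 10**8

    # Assumed model: P0 counts the possible subscriber-to-subscriber
    # connections, i.e. unordered pairs.
    P0 = subscribers * (subscribers - 1) // 2

    print(k * log(P0))           # ~5e-15 entropy units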






