298 THE LANGUAGE AND CONCEPTS OF CONTROL 



Information and Entropy 



The broad use of the term "entropy" as a quantitative measure of the amount of disorder in a system, or subsystem, was introduced in Chapter 7. Now we carry the concept one step further. For communication, which requires a description of a system in words or codings, the simpler the system the simpler the information needed to describe it. Four sticks standing fixed in a row (||||) form a very simple system, A, easily described; but the same four sticks comprise an infinitely complex system, B, if the four sticks are thrown off a roof-top and each stick is allowed to assume any position and degree of rotation during the fall. The information required to describe A unambiguously is small; likewise its entropy or disorder is low. By contrast, the information required to describe B unambiguously is relatively very large; its entropy or disorder is high.



Therefore, a measure of the quantity of information needed to describe something is the entropy of the system being described.
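The stick example can be made numerical with a small sketch (an illustration of the principle, not from the text): if a system can be in any of W equally likely configurations, singling out one configuration takes log₂ W bits, so the disordered system B demands far more descriptive information than the fixed arrangement A.

```python
import math

def description_bits(num_configurations: int) -> float:
    """Bits needed to single out one configuration among equally likely ones."""
    return math.log2(num_configurations)

# System A: four sticks fixed upright in a row -- only one configuration.
bits_a = description_bits(1)          # 0.0 bits: nothing to specify

# System B: each falling stick's orientation coarsely quantized into
# 360 possible angles (an assumed, illustrative discretization).
bits_b = description_bits(360 ** 4)   # ~34 bits for four independent sticks

print(bits_a, bits_b)
```

The granularity (360 angles per stick) is arbitrary; the point is only that B's description grows with the number of configurations while A's stays at zero.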



It follows that if the information, S, put into a computational system such as man becomes distorted for one reason or another, the changed information is now S + ΔS, where ΔS is the distortion. ΔS is always positive, increasing the entropy.
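A rough sketch of why distortion raises entropy (the probabilities below are hypothetical): a stored symbol that is perfectly determined before distortion becomes merely probable after a noisy channel flips it some fraction of the time, so its Shannon entropy rises from zero.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A faithful record: the stored symbol is always '0'.
clean = [1.0, 0.0]

# Distortion: each recorded symbol is flipped with probability 0.1
# (an assumed noise level), so the record is now uncertain.
p_flip = 0.1
distorted = [1 - p_flip, p_flip]

print(shannon_entropy(clean))      # 0.0 bits
print(shannon_entropy(distorted))  # ~0.469 bits: entropy has increased
```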



However, if two inputs, S₁ and S₂, are faithfully recorded and analyzed, and if from the two pieces of information a third piece of information, a synthesis of the two, emerges, then the total information needed to describe S₁ and S₂ is less than the sum S₁ + S₂, and the total entropy has thereby been decreased .... One's information is now better organized. One remembers now a simple principle which describes both systems 1 and 2.
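This saving is what information theory calls mutual information: the joint description of two correlated inputs needs fewer bits than two separate descriptions, H(S₁, S₂) ≤ H(S₁) + H(S₂). A minimal sketch, with an invented joint distribution standing in for the two recorded inputs:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two correlated binary inputs S1, S2: joint distribution p(s1, s2).
# (The numbers are illustrative assumptions, not from the text.)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_s1 = [sum(v for (a, _), v in joint.items() if a == s) for s in (0, 1)]
p_s2 = [sum(v for (_, b), v in joint.items() if b == s) for s in (0, 1)]

h1 = entropy(p_s1)                    # bits to describe S1 alone
h2 = entropy(p_s2)                    # bits to describe S2 alone
h12 = entropy(list(joint.values()))   # bits to describe both together

# The "synthesis" saving is the mutual information I(S1; S2) >= 0.
print(h1 + h2 - h12)
```

When the two inputs are independent the saving is zero; the more strongly one predicts the other, the shorter the combined description becomes.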



Measurement 



Measurement implies a reference. What is measured is a difference between two quantities, one of which is taken as the reference, against which many similar quantities are measured. The fact that no two physical beings are in all respects identical implies variation. Variation in turn introduces uncertainty.



There is an inherent uncertainty in all measurement, a principle first propounded by Heisenberg. The formal statement of this is known as his "uncertainty principle." It takes various forms; a simple statement is the following. To make a physical measurement, energy must be transferred between the object and the measuring device, for otherwise there is nothing to detect. This transfer introduces uncertainty, because the object is no longer the same as it was before the energy was transferred. The smaller the object, the more difficult it is to measure its properties.
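A back-of-the-envelope calculation (the masses and tolerances below are illustrative assumptions) shows why the scale of the object matters. One common form of the bound is Δx · Δp ≥ ħ/2, so fixing the momentum uncertainty sets a floor on the position uncertainty:

```python
# Illustrative sketch of the bound dx * dp >= hbar / 2.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_position_uncertainty(mass_kg, velocity_uncertainty_ms):
    """Smallest position uncertainty allowed, given dp = m * dv."""
    return HBAR / (2 * mass_kg * velocity_uncertainty_ms)

# An electron whose speed is known to within 1 m/s:
dx_electron = min_position_uncertainty(9.109e-31, 1.0)   # ~5.8e-5 m

# A 0.1 kg ball whose speed is known to within 1 mm/s:
dx_ball = min_position_uncertainty(0.1, 1e-3)            # ~5.3e-31 m

print(dx_electron, dx_ball)
```

For the electron the forced uncertainty is of laboratory scale; for the ball it is some twenty-five orders of magnitude below anything measurable, which is the point of the paragraph that follows.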



However, in the macroscopic physical world, objects are big enough that this uncertainty is far smaller than gross errors in measurement,



