What is Information Theory?




DISCUSSION OF CHAPTER I 



Heather D. Mayor (Houston, Texas): In case one wishes to draw analogies from physics rather than from thermodynamics, can you clarify something? Could we equate your conditional entropy with, say, the Heisenberg uncertainty principle and your noise ratio with the perturbations introduced in measuring the system?



Bernard Saltzberg (Santa Monica, California): Yes; in these terms the information measure (uncertainty, or conditional entropy) and the noise which gives rise to the uncertainty (i.e., equivocation) are aspects of essentially equivalent ideas.
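
The equivalence Saltzberg describes can be sketched in Shannon's standard notation (the symbols below are the conventional ones for entropy and equivocation, not notation taken from the discussion itself):

```latex
% Equivocation: the conditional entropy of the transmitted
% symbol X given the received symbol Y.
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log p(x \mid y)

% On a noiseless channel, Y determines X and H(X|Y) = 0.
% Noise raises H(X|Y), and the information actually conveyed is
I(X;Y) = H(X) - H(X \mid Y)
```

Thus the noise and the residual uncertainty it creates are two views of the same quantity, the equivocation H(X|Y).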



Mayor: And the Bohr generalized complementarity principle in biological systems — would that fit, too, with your intrinsic concepts? For example, if we can find the exact position of a microorganism, it is difficult at the same time to establish with certainty another parameter, such as its size. In a biological system, would this approach fit with your generalized entropy concept?



