CHAPTER I

WHAT IS INFORMATION THEORY?

Bernard Saltzberg



INTRODUCTION



THE purpose of this paper is to describe the principles underlying the quantitative aspects of the storage and communication of information, so that a better understanding may be gained of the nature of efficient information storage, with its attendant implications in coding and control processes, including neural control. Insofar as possible, the discussion will avoid abstract mathematical arguments and will be directed to those with little or no previous acquaintance with probability or information theory.



INFORMATION MEASURE 



Although information theory is an essentially mathematical subject, a basic understanding of the underlying principles can be acquired without resorting to complex mathematical arguments. In simple qualitative terms, information, as defined by Shannon,* is merely a measure of how much uncertainty has



*The formal development of information theory originated in the work of Claude E. Shannon of Bell Telephone Laboratories, who published his fundamental paper, "The Mathematical Theory of Communication," in 1948. In this paper he set up a mathematical scheme in which the concepts of the production and transmission of information could be defined quantitatively. Historically, however, Shannon's work stems from certain early basic observations in theoretical physics concerning entropy. Boltzmann (1894) observed that entropy is related to "missing information," inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. Leo Szilard (1925) extended this idea to a general discussion of information in physics, and von Neumann (1932) treated information in quantum mechanics and particle physics. Information theory, as developed by Shannon, con-


