Henry Quastler 



I. INTRODUCTION 



There appears to be a gap in the literature on information theory. One can find several articles explaining what information theory is; several of those are understandable to a reader with little knowledge of mathematics; in fact, some are at the general magazine level. One can also find a number of books and articles explaining how information theory is to be used, but all of these are on a highly technical level. I am not aware of any presentation which is not highly technical and rigorous, yet sufficiently explicit and pragmatic to enable a reader to make some practical use of information theory. This paper is intended to serve as a stopgap to fill a current need. It is designed to have some of the aspects of an elementary textbook, including a few exercises. The examples are largely drawn from communication engineering and engineering psychology, these being the most convenient ways of interpreting information theory; however, the whole theory could be expounded without any reference to conscious communication.



Information theory is based on the concept that information is measurable. This idea is not new. In physics, the notion of a measurable relation between information and degree of orderliness (entropy) dates back to Boltzmann's work in 1872 and its development in 1929 by Szilard (1). In 1918, the statistician R. A. Fisher (2) needed a criterion to assess the degree to which the information contained in experimental data is utilized by a given statistical procedure; he worked out a measure of information which has been used in statistics ever since. Later, the need arose for a measure of information-carrying potential as a consequence of the tremendous development of telecommunication, and in 1928, R. V. L. Hartley (3) published such a measure. In 1948, Wiener (4) observed that a measure of information content is a basic ingredient of the study of communication, which itself is a basic ingredient of the study of control in its broadest sense. In the same year, the communication engineer C. E. Shannon (5) published an article on the mathematical theory of communication which in several respects went beyond previous studies. This article is highly technical; it is very difficult reading; it appeared in a specialized journal (The Bell System Technical Journal) and it pertained to no field other than telecommunication. It certainly did not look like an article destined to reach wide popularity among psychologists, linguists, mathematicians, biologists, economists, estheticists, historians, physicists ... yet this is what happened. In 1949, the University of Illinois Press issued a book (6) which consisted of a reprint of Shannon's earlier article and a paper by Warren Weaver; in this paper, the generality of the concept of 'amount of information' was forcefully expounded. The literature on 'information' has been increasing ever since at an almost explosive rate.



What are the reasons for these startling consequences of such a highly specialized article? One reason is, of course, that it is a very good article. The other is that the concept of a measure of information fulfills a general and deep need of our time. The sheer bulk of the information now available increases at a rapid rate. Accordingly, the representation of information becomes a more and more critical problem, and information theory offers general principles concerning representation. Also, we are developing organizations


