What is Information Theory?



transfer rates in biological systems as well as for the neurophysiological aspects of information transfer, which are to be treated in later papers at this symposium. In order to discuss this problem, it is necessary to define a few terms, namely:



B = the bandwidth of the communication channel (this defines the range of frequencies which can pass through a system)

S = received effective signal power

N = received effective noise power.



In any communication system, the message from which the recipient derives information is a combination of signal plus noise. It can be shown (not without some mathematical difficulty, however) that the maximum rate at which information can be sent through a channel that is 1) limited to signal power S, and 2) disturbed by random noise of power N, is given by

R = B log(1 + S/N).

In other words, the maximum information that can be sent in a time T is RT, or

I = BT log(1 + S/N).
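The formula is easy to check numerically. A minimal sketch in Python, using log base 2 so that the result is in bits per second (the function name and the 3 kHz, 30 dB example values are illustrative, not taken from the text):

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Maximum rate R = B log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

# Example: a 3 kHz telephone-grade channel with S/N = 1000 (30 dB).
R = channel_capacity(3000, 1000)

# Maximum information sent in T seconds is then I = R * T.
T = 10
I = R * T
```

Note that S/N enters only through the logarithm, while B enters linearly; this asymmetry is what makes the bandwidth tradeoff discussed next worthwhile.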

The important implication of these formulae in the design of communication systems resides in the fact that S/N, the signal-to-noise ratio, is a function of B, the bandwidth of the channel. Therefore, if one determines the dependence of signal-to-noise ratio on bandwidth, it is possible to achieve a tradeoff between S/N and B which optimizes the information handling capacity of the system.
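The tradeoff can be made concrete under one common assumption that the text does not state explicitly: noise with constant power density N0 across the band, so that N = N0 B and widening the band lowers S/N. A short sketch (the function and parameter names are illustrative):

```python
import math

def capacity_white_noise(bandwidth_hz, signal_power, noise_density):
    """Capacity when noise power grows with bandwidth: N = N0 * B."""
    snr = signal_power / (noise_density * bandwidth_hz)
    return bandwidth_hz * math.log2(1 + snr)

# Widening the band lowers S/N yet raises total capacity, which
# approaches the finite limit (S / N0) * log2(e) as B grows.
for b in (1_000, 10_000, 100_000):
    print(b, capacity_white_noise(b, signal_power=1.0, noise_density=1e-4))
```

Under this assumption, capacity rises monotonically with bandwidth but saturates, which is why a designer can trade a wide, noisy band against a narrow, clean one.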



EQUIVOCATION 



This leads us to the more involved concepts of equivocation and channel capacity and to Shannon's basic theorems on error correction. The previously mentioned maximum rate at which information can be sent through a channel, usually referred to as the channel capacity C, is intimately related to these ideas and, therefore, requires some elaboration and clarification.



As Shannon has stated, it may seem surprising that we should define a definite capacity C for a noisy channel, since we can never


