MATHEMATICAL THEORY OF COMMUNICATION



PART IV: THE CONTINUOUS CHANNEL 



23. The Capacity of a Continuous Channel 



In a continuous channel the input or transmitted signals will be continuous functions of time $f(t)$ belonging to a certain set, and the output or received signals will be perturbed versions of these. We will consider only the case where both transmitted and received signals are limited to a certain band $W$. They can then be specified, for a time $T$, by $2TW$ numbers, and their statistical structure by finite dimensional distribution functions.
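The statement that a signal of band $W$ observed for time $T$ is specified by $2TW$ numbers can be illustrated with sinc interpolation. The bandwidth, duration, and test signal below are illustrative choices, not values from the paper, and the reconstruction is only exact in the limit of an infinite observation window:

```python
import numpy as np

W = 4.0                  # bandwidth in Hz (illustrative choice)
T = 2.0                  # observation time in s (illustrative choice)
n = int(2 * T * W)       # 2TW = 16 numbers specify the signal

# A signal band-limited below W (components at 1 Hz and 3 Hz).
ts = np.arange(n) / (2 * W)              # sample instants, spacing 1/(2W)
samples = np.sin(2*np.pi*1.0*ts) + 0.5*np.cos(2*np.pi*3.0*ts)

# Sinc interpolation rebuilds f(t) from the 2TW samples;
# np.sinc is the normalized sinc, sin(pi x)/(pi x).
t = np.linspace(0, T, 1000, endpoint=False)
recon = sum(s * np.sinc(2*W*(t - tk)) for s, tk in zip(samples, ts))
```

Evaluating the interpolation at the sample instants returns the samples exactly; between them it approximates the original signal, with some edge error because the window is finite.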

 Thus the statistics of the transmitted signal will be determined by 



$$P(x_1, \ldots, x_n) = P(x)$$



and those of the noise by the conditional probability distribution 



$$P_{x_1, \ldots, x_n}(y_1, \ldots, y_n) = P_x(y).$$



The rate of transmission of information for a continuous channel is defined 

 in a way analogous to that for a discrete channel, namely 



$$R = H(x) - H_y(x)$$



where $H(x)$ is the entropy of the input and $H_y(x)$ the equivocation. The channel capacity $C$ is defined as the maximum of $R$ when we vary the input over all possible ensembles. This means that in a finite dimensional approximation we must vary $P(x) = P(x_1, \ldots, x_n)$ and maximize



$$-\int P(x) \log P(x)\, dx + \iint P(x, y) \log \frac{P(x, y)}{P(y)}\, dx\, dy.$$

 This can be written 



$$\iint P(x, y) \log \frac{P(x, y)}{P(x)P(y)}\, dx\, dy$$

using the fact that $\iint P(x, y) \log P(x)\, dx\, dy = \int P(x) \log P(x)\, dx$. The channel capacity is thus expressed



$$C = \lim_{T \to \infty} \max_{P(x)} \frac{1}{T} \iint P(x, y) \log \frac{P(x, y)}{P(x)P(y)}\, dx\, dy.$$
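The maximization defining $C$ can be sketched numerically. The sketch below substitutes a simple discrete channel (a binary symmetric channel with crossover probability 0.1, an assumption not taken from the paper) for the continuous one, and finds the maximum of $R$ over input ensembles by grid search:

```python
import numpy as np

def mutual_info(p, eps):
    """Rate R for a binary symmetric channel: input P(x) = (p, 1-p),
    noise P_x(y) given by crossover probability eps."""
    P_x = np.array([p, 1 - p])
    P_yx = np.array([[1 - eps, eps],
                     [eps, 1 - eps]])
    P_xy = P_x[:, None] * P_yx                     # joint P(x, y)
    P_y = P_xy.sum(axis=0)                         # output marginal P(y)
    # R as the double sum of P(x,y) log [P(x,y) / (P(x)P(y))], in bits
    return np.sum(P_xy * np.log2(P_xy / (P_x[:, None] * P_y[None, :])))

# Vary the input ensemble and take the maximum, as in the definition
# of C (a grid search stands in for the variational argument).
ps = np.linspace(0.01, 0.99, 99)
C = max(mutual_info(p, 0.1) for p in ps)
```

The maximum falls at the uniform input $p = 1/2$, giving $1 - H(0.1)$ bits per symbol for this channel.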



It is obvious in this form that $R$ and $C$ are independent of the coordinate system, since the numerator and denominator in $\log \frac{P(x, y)}{P(x)P(y)}$ will be multiplied by the same factors when $x$ and $y$ are transformed in any one-to-one way. This integral expression for $C$ is more general than $H(x) - H_y(x)$.
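This invariance can be checked directly in the finite dimensional approximation, where a one-to-one transformation of $x$ and $y$ amounts to relabeling the discretization bins. The $4 \times 4$ joint distribution below is a random illustrative example, not anything from the paper:

```python
import numpy as np

def mutual_info(P_xy):
    """Double sum of P(x,y) log [P(x,y) / (P(x)P(y))] over the bins."""
    P_x = P_xy.sum(axis=1, keepdims=True)
    P_y = P_xy.sum(axis=0, keepdims=True)
    return np.sum(P_xy * np.log2(P_xy / (P_x * P_y)))

rng = np.random.default_rng(0)
P = rng.random((4, 4))
P /= P.sum()                      # a random joint distribution P(x, y)

# A one-to-one transformation of x and of y: permute the bin labels.
P_transformed = P[rng.permutation(4)][:, rng.permutation(4)]

# mutual_info(P) and mutual_info(P_transformed) agree, since the
# numerator and denominator of the log pick up the same relabeling.
```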

Properly interpreted (see Appendix 7) it will always exist while $H(x) - H_y(x)$



