MATHEMATICAL THEORY OF COMMUNICATION 651



A partial solution of the general maximizing problem for determining the 

 rate of a source can be given. Using Lagrange's method we consider 



$$\iint \left[ P(x, y) \log \frac{P(x, y)}{P(x)\,P(y)} + \mu\, P(x, y)\,\rho(x, y) + \nu(x)\,P(x, y) \right] dx\, dy$$



The variational equation (when we take the first variation on P(x, y)) 

 leads to 



$$P_y(x) = B(x)\, e^{-\lambda \rho(x, y)}$$



where $\lambda$ is determined to give the required fidelity and $B(x)$ is chosen to satisfy



$$\int B(x)\, e^{-\lambda \rho(x, y)}\, dx = 1.$$
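The intermediate variational step is not shown in the text; the following sketch fills it in, with the simplifying assumption (mine, not stated in the source) that the marginals $P(x)$ and $P(y)$ are treated as fixed when the variation is taken, the constraint term $\nu(x)P(x, y)$ enforcing consistency with $P(x)$. Setting the first variation with respect to $P(x, y)$ to zero gives

```latex
\log \frac{P(x, y)}{P(x)\,P(y)} + 1 + \mu\,\rho(x, y) + \nu(x) = 0,
```

so that

```latex
P(x, y) = P(x)\,P(y)\,e^{-1-\nu(x)}\,e^{-\mu\,\rho(x, y)},
```

and dividing through by $P(y)$ yields the stated form with $\lambda = \mu$ and $B(x) = P(x)\,e^{-1-\nu(x)}$:

```latex
P_y(x) = B(x)\,e^{-\lambda \rho(x, y)}.
```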



This shows that, with best encoding, the conditional probability of a certain cause for various received $y$, $P_y(x)$, will decline exponentially with the distance function $\rho(x, y)$ between the $x$ and $y$ in question.



In the special case where the distance function $\rho(x, y)$ depends only on the (vector) difference between $x$ and $y$,



$$\rho(x, y) = \rho(x - y)$$



we have 



$$\int B(x)\, e^{-\lambda \rho(x - y)}\, dx = 1.$$



Hence $B(x)$ is constant, say $\alpha$, and



$$P_y(x) = \alpha\, e^{-\lambda \rho(x - y)}.$$
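A quick numerical sketch of this special case, for the mean-square distance $\rho(x - y) = (x - y)^2$ (the value of $\lambda$ below is illustrative, not from the source): the normalization condition forces $\alpha = \sqrt{\lambda/\pi}$ by the Gaussian integral, and the resulting $P_y(x)$ integrates to one for any received $y$.

```python
import math

# For rho(x - y) = (x - y)^2, the condition  ∫ α exp(-λ ρ(x - y)) dx = 1
# is a Gaussian integral, giving α = sqrt(λ / π).  Verify by quadrature.

lam = 2.0                          # λ, set by the required fidelity (illustrative)
alpha = math.sqrt(lam / math.pi)   # normalizing constant α

def p_y(x, y):
    """Conditional density P_y(x) = α exp(-λ (x - y)^2)."""
    return alpha * math.exp(-lam * (x - y) ** 2)

# Riemann sum of P_y(x) over x on [-10, 10] for a fixed received y.
y = 0.7
dx = 0.001
n = 20000
total = sum(p_y(-10 + i * dx, y) for i in range(n)) * dx
print(f"integral of P_y over x: {total:.6f}")  # ≈ 1.0
```

The same check works for any shift-invariant $\rho$: because the integrand depends on $x$ only through $x - y$, the normalizing integral is independent of $y$, which is exactly why $B(x)$ collapses to a constant.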



Unfortunately these formal solutions are difficult to evaluate in particular 

 cases and seem to be of little value. In fact, the actual calculation of rates 

 has been carried out in only a few very simple cases. 



If the distance function $\rho(x, y)$ is the mean square discrepancy between $x$ and $y$ and the message ensemble is white noise, the rate can be determined. In that case we have



$$R = \min \left[ H(x) - H_y(x) \right] = H(x) - \max H_y(x)$$



with $N = \overline{(x - y)^2}$. But the $\max H_y(x)$ occurs when $y - x$ is a white noise, and is equal to $W_1 \log 2\pi e N$ where $W_1$ is the bandwidth of the message ensemble. Therefore



$$R = W_1 \log 2\pi e Q - W_1 \log 2\pi e N$$



where $Q$ is the average message power. This proves the following:
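Since the $2\pi e$ terms cancel, the rate reduces algebraically to $W_1 \log (Q/N)$. A minimal numeric check of that cancellation, with illustrative values for the bandwidth and powers (not taken from the source):

```python
import math

# Rate of a white-noise message ensemble under mean-square fidelity N:
#   R = W1 log(2πeQ) - W1 log(2πeN),
# which should equal W1 log(Q/N) after the 2πe terms cancel.

W1 = 1000.0   # bandwidth of the message ensemble (illustrative)
Q = 4.0       # average message power (illustrative)
N = 1.0       # allowed mean-square discrepancy (illustrative)

R_difference = (W1 * math.log(2 * math.pi * math.e * Q)
                - W1 * math.log(2 * math.pi * math.e * N))
R_ratio = W1 * math.log(Q / N)

print(abs(R_difference - R_ratio) < 1e-9)  # the two forms agree: True
```

Note that $R$ depends on the message statistics only through the ratio $Q/N$, so halving the allowed discrepancy $N$ raises the rate by the same amount as doubling the message power $Q$.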



