MATHEMATICAL THEORY OF COMMUNICATION 403 



$$-\frac{1}{N}\sum p_s \log p_s \le H' < \frac{1}{N} - \frac{1}{N}\sum p_s \log p_s.$$

As $N$ increases $-\frac{1}{N}\sum p_s \log p_s$ approaches $H$, the entropy of the source, and $H'$ approaches $H$.

We see from this that the inefficiency in coding, when only a finite delay of $N$ symbols is used, need not be greater than $\frac{1}{N}$ plus the difference between the true entropy $H$ and the entropy $G_N$ calculated for sequences of length $N$.

The per cent excess time needed over the ideal is therefore less than

$$\frac{G_N}{H} + \frac{1}{HN} - 1.$$
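This bound can be checked numerically. The following is a minimal sketch in Python, assuming an i.i.d. source so that the block probabilities can be enumerated directly; the function names are illustrative, not from the paper:

```python
import math
from itertools import product

def entropy(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def block_probs(probs, N):
    """Probabilities of all length-N sequences from an i.i.d. source."""
    return [math.prod(seq) for seq in product(probs, repeat=N)]

def excess_bound(probs, N):
    """Upper bound G_N/H + 1/(H*N) - 1 on the fractional excess time."""
    H = entropy(probs)                        # true entropy of the source
    G_N = entropy(block_probs(probs, N)) / N  # per-symbol entropy of N-blocks
    return G_N / H + 1 / (H * N) - 1

print(excess_bound([0.9, 0.1], N=4))
```

For an i.i.d. source $G_N = H$ exactly, so the bound reduces to $1/(HN)$ and shrinks as the coding delay $N$ grows, in line with the statement above.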



This method of encoding is substantially the same as one found independently by R. M. Fano.¹ His method is to arrange the messages of length N in order of decreasing probability. Divide this series into two groups of as nearly equal probability as possible. If the message is in the first group its first binary digit will be 0, otherwise 1. The groups are similarly divided into subsets of nearly equal probability and the particular subset determines the second binary digit. This process is continued until each subset contains only one message. It is easily seen that apart from minor differences (generally in the last digit) this amounts to the same thing as the arithmetic process described above.
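Fano's procedure translates directly into a short recursion. The sketch below follows the steps just described; the balancing rule (minimizing the difference between the two groups' probabilities) is one reasonable reading of "as nearly equal probability as possible":

```python
def shannon_fano(messages):
    """Assign binary codes by Fano's method: sort messages by decreasing
    probability, split into two groups of nearly equal total probability,
    give the first group digit 0 and the second digit 1, and recurse."""
    items = sorted(messages.items(), key=lambda kv: -kv[1])
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            codes[group[0][0]] = prefix or "0"  # degenerate one-message case
            return
        total = sum(p for _, p in group)
        running, cut, best = 0.0, 1, None
        # choose the cut that makes the two halves' probabilities closest
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if best is None or diff < best:
                best, cut = diff, i
        split(group[:cut], prefix + "0")
        split(group[cut:], prefix + "1")

    split(items, "")
    return codes

print(shannon_fano({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}))
# → {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

The more probable a message, the shorter its code, so the average code length approaches the block entropy as in the arithmetic construction.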



10. Discussion 



In order to obtain the maximum power transfer from a generator to a load a transformer must in general be introduced so that the generator as seen from the load has the load resistance. The situation here is roughly analogous. The transducer which does the encoding should match the source to the channel in a statistical sense. The source as seen from the channel through the transducer should have the same statistical structure as the source which maximizes the entropy in the channel. The content of Theorem 9 is that, although an exact match is not in general possible, we can approximate it as closely as desired. The ratio of the actual rate of transmission to the capacity C may be called the efficiency of the coding system. This is of course equal to the ratio of the actual entropy of the channel symbols to the maximum possible entropy.
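For a binary channel the maximum entropy per channel digit is one bit, so the efficiency is the source entropy divided by the average number of binary digits used per message. A minimal sketch, using as an example the prefix code 0, 10, 110, 111 that Fano's procedure yields for probabilities 0.4, 0.3, 0.2, 0.1 (the function name is illustrative):

```python
import math

def coding_efficiency(probs, code_lengths):
    """Efficiency of a binary code: source entropy in bits divided by
    the average code length, i.e. the actual entropy per channel digit
    over the maximum possible (1 bit for binary symbols)."""
    H = -sum(p * math.log2(p) for p in probs)
    avg_len = sum(p * l for p, l in zip(probs, code_lengths))
    return H / avg_len

print(coding_efficiency([0.4, 0.3, 0.2, 0.1], [1, 2, 3, 3]))
```

Here the average length is 1.9 digits against an entropy of about 1.85 bits, an efficiency a little above 97 per cent, illustrating how a good code approaches but does not reach capacity.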



In general, ideal or nearly ideal encoding requires a long delay in the transmitter and receiver. In the noiseless case which we have been considering, the main function of this delay is to allow reasonably good



¹ Technical Report No. 65, The Research Laboratory of Electronics, M. I. T.



