406 BELL SYSTEM TECHNICAL JOURNAL 



PART II: THE DISCRETE CHANNEL WITH NOISE 

 11. Representation of a Noisy Discrete Channel 



We now consider the case where the signal is perturbed by noise during transmission or at one or the other of the terminals. This means that the received signal is not necessarily the same as that sent out by the transmitter. Two cases may be distinguished. If a particular transmitted signal always produces the same received signal, i.e. the received signal is a definite function of the transmitted signal, then the effect may be called distortion. If this function has an inverse — no two transmitted signals producing the same received signal — distortion may be corrected, at least in principle, by merely performing the inverse functional operation on the received signal.
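The inverse-function argument can be made concrete. The sketch below (an invented three-symbol alphabet, not from the text) models distortion as a fixed one-to-one mapping of symbols; since no two transmitted symbols produce the same received symbol, applying the inverse mapping recovers the transmission exactly.

```python
# Hypothetical example: distortion as an invertible function of the input.
distortion = {"A": "C", "B": "A", "C": "B"}  # each input gives one definite output

# Because the mapping is one-to-one, it has an inverse.
inverse = {received: sent for sent, received in distortion.items()}

sent = ["A", "B", "B", "C"]
received = [distortion[s] for s in sent]    # what the channel delivers
corrected = [inverse[r] for r in received]  # the inverse functional operation
assert corrected == sent                    # distortion fully undone
```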



The case of interest here is that in which the signal does not always undergo the same change in transmission. In this case we may assume the received signal E to be a function of the transmitted signal S and a second variable, the noise N.



E = f(S, N) 
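One concrete reading of this equation, offered only as an illustration: for binary signals, the noise may be taken as a second binary sequence added modulo 2, so that E differs from S exactly where N is 1. The noise rate below is an assumption, not a value from the text.

```python
import random

random.seed(1)  # reproducible noise for the illustration

def f(S, N):
    # received symbol: the transmitted symbol, flipped wherever the noise is 1
    return [s ^ n for s, n in zip(S, N)]

S = [random.randint(0, 1) for _ in range(12)]        # transmitted signal
N = [int(random.random() < 0.2) for _ in range(12)]  # noise, a chance variable
E = f(S, N)

# E is not a definite function of S alone: the same S with a different N
# gives a different E, which is what distinguishes noise from distortion.
```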



The noise is considered to be a chance variable just as the message was above. In general it may be represented by a suitable stochastic process. The most general type of noisy discrete channel we shall consider is a generalization of the finite state noise free channel described previously. We assume a finite number of states and a set of probabilities

p_{α, i}(β, j)

This is the probability, if the channel is in state α and symbol i is transmitted, that symbol j will be received and the channel left in state β. Thus α and β range over the possible states, i over the possible transmitted signals and j over the possible received signals. In the case where successive symbols are independently perturbed by the noise there is only one state, and the channel is described by the set of transition probabilities p_i(j), the probability of transmitted symbol i being received as j.
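For this memoryless (one-state) case, a channel is completely specified by its matrix of transition probabilities p_i(j). The sketch below uses an invented binary symmetric channel with error probability 0.1; the numbers are assumptions for illustration, not values from the text.

```python
import random

random.seed(0)  # reproducible draws for the illustration

# p[i][j] = p_i(j): probability that transmitted symbol i is received as j.
p = {
    0: {0: 0.9, 1: 0.1},
    1: {0: 0.1, 1: 0.9},
}

def transmit(i):
    # draw the received symbol j with probability p_i(j)
    outcomes = list(p[i])
    weights = [p[i][j] for j in outcomes]
    return random.choices(outcomes, weights=weights)[0]

sent = [0, 1, 1, 0, 1, 0, 0, 1]
received = [transmit(i) for i in sent]  # each symbol perturbed independently
```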



If a noisy channel is fed by a source there are two statistical processes at work: the source and the noise. Thus there are a number of entropies that can be calculated. First there is the entropy H(x) of the source or of the input to the channel (these will be equal if the transmitter is non-singular). The entropy of the output of the channel, i.e. the received signal, will be denoted by H(y). In the noiseless case H(y) = H(x). The joint entropy of input and output will be H(x, y). Finally there are two conditional entropies H_x(y) and H_y(x), the entropy of the output when the input is known and conversely. Among these quantities we have the relations



H(x, y) = H(x) + H_x(y) = H(y) + H_y(x)
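These relations can be checked numerically. The joint distribution below is an invented two-symbol example (not from the text); entropies are computed in bits.

```python
from math import log2

# Assumed joint probabilities P(x, y) for binary input x and output y.
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distributions of the input and the output.
Px = {0: P[0, 0] + P[0, 1], 1: P[1, 0] + P[1, 1]}
Py = {0: P[0, 0] + P[1, 0], 1: P[0, 1] + P[1, 1]}

def H(dist):
    # entropy of a distribution given as a dict of probabilities
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Conditional entropies: H_x(y) averages -log2 P(y|x) over the joint,
# and similarly for H_y(x).
Hx_y = sum(q * log2(Px[x] / q) for (x, y), q in P.items())  # H_x(y)
Hy_x = sum(q * log2(Py[y] / q) for (x, y), q in P.items())  # H_y(x)

assert abs(H(P) - (H(Px) + Hx_y)) < 1e-9  # H(x, y) = H(x) + H_x(y)
assert abs(H(P) - (H(Py) + Hy_x)) < 1e-9  # H(x, y) = H(y) + H_y(x)
```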



