MATHEMATICAL THEORY OF COMMUNICATION 






If each input symbol has the same set of probabilities on the lines emerging from it, and the same is true of each output symbol, the capacity can be easily calculated. Examples are shown in Fig. 12. In such a case $H_x(y)$ is independent of the distribution of probabilities on the input symbols, and is given by $-\sum p_i \log p_i$ where the $p_i$ are the values of the transition probabilities from any input symbol. The channel capacity is



$$\operatorname{Max}\,\bigl[H(y) - H_x(y)\bigr] = \operatorname{Max} H(y) + \sum p_i \log p_i.$$



The maximum of $H(y)$ is clearly $\log m$ where $m$ is the number of output



[Figure 12 here: three channels, panels (a), (b), (c).]
Fig. 12 — Examples of discrete channels with the same transition probabilities for each input and for each output.



symbols, since it is possible to make them all equally probable by making 

 the input symbols equally probable. The channel capacity is therefore 



$$C = \log m + \sum p_i \log p_i.$$



In Fig. 12a it would be 



$$C = \log 4 - \log 2 = \log 2.$$



This could be achieved by using only the 1st and 3d symbols. In Fig. 12b 
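The capacity formula above lends itself to a direct numerical check; a minimal sketch in Python (base-2 logarithms, so $C$ comes out in bits per symbol; the function and its name are illustrative, not from the paper):

```python
import math

def channel_capacity(m, p):
    """C = log m + sum(p_i log p_i) for a channel in which every
    input symbol has the same set p of transition probabilities
    and the same holds for every output symbol.
    Logs are base 2, so the result is in bits per symbol."""
    assert abs(sum(p) - 1.0) < 1e-9  # p must be a distribution
    return math.log2(m) + sum(pi * math.log2(pi) for pi in p)

# Fig. 12a: 4 output symbols, each input reaching two of them
# with probability 1/2 each -> C = log 4 - log 2 = log 2 = 1 bit.
print(channel_capacity(4, [0.5, 0.5]))  # 1.0
```

Using only the 1st and 3d input symbols, each with probability 1/2, makes all four outputs equiprobable, which is how this maximum is attained.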

$$
\begin{aligned}
C &= \log 4 - \tfrac{2}{3}\log 3 - \tfrac{1}{3}\log 6\\
  &= \log 4 - \log 3 - \tfrac{1}{3}\log 2\\
  &= \log \frac{2^{5/3}}{3}.
\end{aligned}
$$

 In Fig. 12c we have 



$$C = \log 3 - \tfrac{1}{2}\log 2 - \tfrac{1}{3}\log 3 - \tfrac{1}{6}\log 6 = \log \frac{3}{2^{1/2}\,3^{1/3}\,6^{1/6}}.$$
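The two closed forms for Figs. 12b and 12c can be checked numerically; a minimal Python sketch (base-2 logs, so capacities are in bits; the transition-probability sets 1/3, 1/3, 1/6, 1/6 and 1/2, 1/3, 1/6 are read off from the $-\sum p_i \log p_i$ terms above):

```python
import math

# Fig. 12b: m = 4 outputs, transition probabilities 1/3, 1/3, 1/6, 1/6.
p_b = [1/3, 1/3, 1/6, 1/6]
C_b = math.log2(4) + sum(p * math.log2(p) for p in p_b)
# Closed form from the derivation: log(2^(5/3) / 3).
assert abs(C_b - math.log2(2**(5/3) / 3)) < 1e-12

# Fig. 12c: m = 3 outputs, transition probabilities 1/2, 1/3, 1/6.
p_c = [1/2, 1/3, 1/6]
C_c = math.log2(3) + sum(p * math.log2(p) for p in p_c)
# Closed form: log(3 / (2^(1/2) * 3^(1/3) * 6^(1/6))).
assert abs(C_c - math.log2(3 / (2**0.5 * 3**(1/3) * 6**(1/6)))) < 1e-12

print(C_b, C_c)
```

Both capacities are small fractions of a bit, reflecting how noisy these example channels are.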



