REQUISITE VARIETY 11/8 



the previous section has proved that V_O cannot be less, numerically, than the value of V_D − V_R. Thus V_O's minimum is V_D − V_R.

If V_D is given and fixed, V_D − V_R can be lessened only by a corresponding increase in V_R. Thus the variety in the outcomes, if minimal, can be decreased further only by a corresponding increase in that of R. (A more general statement is given in S.11/9.)
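As a concrete sketch of this bound in its count form (distinct outcomes cannot number fewer than distinct disturbances divided by distinct regulator moves), the strategies open to R can be searched exhaustively over a small table. The table below is invented for illustration and is not the one from S.11/5; it merely satisfies the same condition that no element occurs twice in a column.

```python
from itertools import product

# Hypothetical outcome table: 9 disturbances (rows) x 3 regulator moves
# (columns).  For a fixed move r, distinct disturbances give distinct
# outcomes, so no element occurs twice in a column (condition of S.11/5).
table = {(d, r): (d + r) % 9 for d in range(9) for r in range(3)}

def outcome_variety(strategy):
    """Number of distinct outcomes when R answers disturbance d with strategy[d]."""
    return len({table[d, strategy[d]] for d in range(9)})

# Try every deterministic strategy: a map from D's 9 values to one of R's 3 moves.
best = min(outcome_variety(s) for s in product(range(3), repeat=9))

# Count form of the law: best >= 9 / 3; in logarithmic measure, V_O >= V_D - V_R.
print(best)  # -> 3: the bound is attained but cannot be beaten
```

The search confirms that however cleverly R's move is chosen as a function of D, the outcome variety cannot be forced below 3.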



This is the law of Requisite Variety. To put it more picturesquely: only variety in R can force down the variety due to D; only variety can destroy variety.



This thesis is so fundamental in the general theory of regulation that I shall give some further illustrations and proofs before turning to consider its actual application.



11/8. (This section can be omitted at first reading.) The law is of very general applicability, and by no means just a trivial outcome of the tabular form. To show that this is so, what is essentially the same theorem will be proved in the case when the variety is spread out in time and the fluctuation incessant, the case specially considered by Shannon. (The notation and concepts in this section are those of Shannon's book.)



Let D, R, and E be three variables, such that each is an information source, though "source" here is not to imply that they are acting independently. Without any regard for how they are related causally, a variety of entropies can be calculated, or measured empirically. There is H(D,R,E), the entropy of the vector that has the three as components; there is H_D(E), the uncertainty in E when D's state is known; there is H_{E,D}(R), the uncertainty in R when both E and D are known; and so on.
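All of these quantities can be computed from an empirical joint distribution. A minimal sketch, with an invented record of (d, r, e) triples standing in for a measured one, and the conditional entropies obtained from Shannon's identity H_X(Y) = H(X,Y) − H(X):

```python
import math
from collections import Counter

# Invented empirical counts of the triple (d, r, e); purely illustrative.
counts = Counter({(0, 0, 0): 4, (0, 1, 1): 2, (1, 0, 1): 3, (1, 1, 0): 1})
total = sum(counts.values())
p = {k: n / total for k, n in counts.items()}

def H(*axes):
    """Joint entropy, in bits, of the selected components (0=D, 1=R, 2=E)."""
    m = Counter()
    for k, q in p.items():
        m[tuple(k[a] for a in axes)] += q
    return -sum(q * math.log2(q) for q in m.values() if q > 0)

H_DRE = H(0, 1, 2)                # H(D,R,E), entropy of the whole vector
H_D_E = H(0, 2) - H(0)            # H_D(E), uncertainty in E given D
H_ED_R = H(0, 1, 2) - H(0, 2)     # H_{E,D}(R), uncertainty in R given E and D
```

Nothing here depends on how D, R, and E are causally related; the entropies are properties of the joint distribution alone.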



The condition introduced in S.11/5 (that no element shall occur twice in a column) here corresponds to the condition that if R is fixed, or given, the entropy of E (corresponding to that of the outcome) is not to be less than that of D, i.e.

H_R(E) ≥ H_R(D).



Now whatever the causal or other relations between D, R and E, algebraic necessity requires that their entropies must be related so that

H(D) + H_D(R) = H(R) + H_R(D),



for each side of the equation equals H(R,D). Substitute H_R(E) for H_R(D), and we get

H(D) + H_D(R) ≤ H(R) + H_R(E)

≤ H(R,E).
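The algebraic identity that starts the derivation can be checked numerically. In the sketch below the joint distribution over (d, r) is invented, and each conditional entropy is computed directly from its definition, as an average over the conditioning variable, rather than from the identity itself:

```python
import math
from collections import Counter

# Invented joint distribution over (d, r); any distribution would do.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    """Entropy, in bits, of a distribution given as value -> probability."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

def marginal(axis):
    m = Counter()
    for k, q in p.items():
        m[k[axis]] += q
    return m

def cond(target, given):
    """H_given(target): entropy of `target` averaged over the values of `given`."""
    total = 0.0
    for g, pg in marginal(given).items():
        slice_ = Counter()
        for k, q in p.items():
            if k[given] == g:
                slice_[k[target]] += q / pg
        total += pg * H(slice_)
    return total

lhs = H(marginal(0)) + cond(1, 0)   # H(D) + H_D(R)
rhs = H(marginal(1)) + cond(0, 1)   # H(R) + H_R(D)
# Both groupings recover the joint entropy H(R,D), as algebraic necessity requires.
```

Whatever distribution is substituted for the invented one, `lhs`, `rhs`, and the joint entropy coincide.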








