SECT. 20.] Theory of the Average. 485 



pose, for instance, that m errors were first taken and averaged, and then n similarly taken and averaged. These averages will be nearly, but not quite, equal. Their sum or difference (these, of course, are indistinguishable in the end, since positive and negative errors are supposed to be equal and opposite) will itself be an error, every magnitude of which will have a certain assignable probability or facility of occurrence. What we do is to assign the modulus of these errors. The actual result again is simple. If c had been the modulus of the single errors, that of the sum or difference of the averages of m and n of them will be

c √(1/m + 1/n).


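The rule just stated can be checked by simulation: taking "modulus c" in the classical sense, so that a single error is Gaussian with standard deviation c/√2, the difference between an average of m errors and an average of n errors should have modulus c √(1/m + 1/n). A minimal sketch (the function name and the choice of m, n, and trial count are ours, for illustration only):

```python
import math
import random
import statistics

def modulus_of_difference(c, m, n, trials=20000, seed=1):
    """Empirically estimate the modulus of the difference between an
    average of m errors and an average of n errors, each single error
    having modulus c (equivalently, standard deviation c / sqrt(2))."""
    rng = random.Random(seed)
    sigma = c / math.sqrt(2)      # modulus c  <->  sigma = c / sqrt(2)
    diffs = []
    for _ in range(trials):
        avg_m = sum(rng.gauss(0, sigma) for _ in range(m)) / m
        avg_n = sum(rng.gauss(0, sigma) for _ in range(n)) / n
        diffs.append(avg_m - avg_n)
    # the modulus of the combined error is sqrt(2) times its std. dev.
    return math.sqrt(2) * statistics.pstdev(diffs)

c, m, n = 1.0, 9, 16
predicted = c * math.sqrt(1 / m + 1 / n)
observed = modulus_of_difference(c, m, n)
```

With enough trials the observed value settles close to the predicted c √(1/m + 1/n), which is just the quadrature rule: each average has modulus c/√m and c/√n respectively, and independent moduli combine as the square root of the sum of squares.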

20. So far, the problem under investigation has been of a direct kind. We have supposed that the ultimate mean value or central position has been given to us; either a priori (as in many games of chance), or from more immediate physical considerations (as in aiming at a mark), or from extensive statistics (as in tables of human stature). In all such cases therefore the main desideratum is already taken for granted, and it may reasonably be asked what remains to be done. The answers are various. For one thing we may want to estimate the value of an average of many when compared with an average of a few. Suppose that one man has collected statistics including 1000 instances, and another has collected 4000 similar instances. Common sense can recognize that the latter are better than the former; but it has no idea how much better they are. Here, as elsewhere, quantitative precision is the privilege of science. The answer we receive from this quarter is that, in the long run, the modulus, and with this the probable error, the mean error, and the error of mean square, which all vary in proportion,


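The 1000-versus-4000 comparison can be made quantitative from the same rule: the modulus of an average of n single errors of modulus c is c/√n, so the precision of a collection improves as the square root of its size, and quadrupling the instances halves the error. A minimal sketch (the helper name is ours, for illustration):

```python
import math

def error_ratio(n_few, n_many):
    """How many times smaller the probable error of the larger
    collection is, relative to the smaller, since the modulus of an
    average of n errors varies as 1 / sqrt(n)."""
    return math.sqrt(n_many / n_few)

ratio = error_ratio(1000, 4000)   # four times the instances
```

So the 4000 instances are exactly twice as good as the 1000, in the sense that their probable error (and with it the mean error and the error of mean square, which all vary in proportion) is half as large.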

