ON THE METHOD OF LEAST SQUARES. 19 



in which system there are always the same number of equations 

 as there are unknown quantities to be determined. 



The next investigation of the principle of the method of 

 least squares which I shall attempt to analyze is that of Laplace. 



LAPLACE'S DEMONSTRATION. 



If, in order to determine x from the equations of condition
stated in the last paragraph, we multiply the first by μ₁, the
second by μ₂, &c., and add: (μ₁, μ₂, &c. fulfilling the conditions

Σ(μa) = 1, Σ(μb) = 0, &c. = 0)

we find x = Σ(μV) − Σ(μe);

and if we assume that Σ(μe) is equal to zero, then the resulting
value of x is Σ(μV): the error of this determination being the
quantity Σ(μe), which we have assumed to be equal to zero,
without knowing whether it really is so or not.
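The elimination just described may be sketched in modern notation as follows, assuming (as the conditions above suggest) equations of condition of the form aᵢx + bᵢy + &c. + eᵢ = Vᵢ; the notation is illustrative, not the author's own:

```latex
% Multiply the i-th equation of condition, a_i x + b_i y + \cdots + e_i = V_i,
% by \mu_i and sum over the observations:
\[
  x \sum_i \mu_i a_i \;+\; y \sum_i \mu_i b_i \;+\; \cdots \;+\; \sum_i \mu_i e_i
  \;=\; \sum_i \mu_i V_i .
\]
% Under the stated conditions \(\sum_i \mu_i a_i = 1,\ \sum_i \mu_i b_i = 0,\ \&c.\),
% every unknown except x is eliminated, leaving
\[
  x \;=\; \sum_i \mu_i V_i \;-\; \sum_i \mu_i e_i ,
\]
% so that adopting x = \sum_i \mu_i V_i as the determination commits
% precisely the error \sum_i \mu_i e_i assumed to vanish in the text.
```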



Now supposing there are n equations of condition, and p
quantities to be determined, and that n is greater than p, then
we see that there are n factors μ₁, μ₂ ... μₙ, and p conditions for
them to fulfil. They may therefore be subjected to n − p additional
conditions.



This being premised, let us consider the probability that the
quantity Σ(μe) will not be less than α, or greater than β, α and β
being any quantities whatever. The law of probability of error

 at each observation being given, the question is evidently analo- 

 gous to the common problem of finding the chance, that with a 

 given set of dice the number of points thrown shall not be less 

 than one given number or greater than another. 
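The dice problem invoked here can be made concrete by direct enumeration; the function and its parameters below are illustrative, not taken from the text:

```python
from itertools import product
from fractions import Fraction

def prob_sum_between(n_dice, faces, lo, hi):
    """Probability that the total shown by n_dice fair dice, each with
    the given number of faces, is not less than lo nor greater than hi,
    found by enumerating every equally likely throw."""
    throws = list(product(range(1, faces + 1), repeat=n_dice))
    hits = sum(1 for throw in throws if lo <= sum(throw) <= hi)
    return Fraction(hits, len(throws))

# Two six-sided dice: totals of 3 or 4 occur in 2 + 3 = 5
# of the 36 equally likely throws.
print(prob_sum_between(2, 6, 3, 4))  # → 5/36
```

Laplace's question about Σ(μe) is the continuous analogue of this finite computation, with the law of error at each observation playing the part of the die.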



We may therefore suppose that the probability in question
has been determined: call it P. Suppose also that we have
taken α = −l and β = l, l being any positive quantity.



Then P is a function of l, and of μ₁ ... μₙ.



Let us now so determine μ₁ ... μₙ, (subject to the conditions
already specified,) that P may be a maximum. When this is

 done, it follows that there is a greater probability that the error 






