ON THE METHOD OF LEAST SQUARES. 25 



The system $(\beta')$ contains as many equations as there are unknown quantities $x$, $y$, &c. I proceed to show that if $x$ be determined from this system, its value will be the same as if it had been obtained from the most advantageous system of factors, namely, that which is determined by means of $(B)$ and $(C)$.

In order to prove this, we multiply equations $(\beta')$ by $\lambda_1$, $\lambda_2$, &c., and add the results. Then, in virtue of $(C)$,

$$x = (\lambda_1 a_1 + \lambda_2 b_1 + \&c.)\,\frac{F_1}{k_1} + (\lambda_1 a_2 + \lambda_2 b_2 + \&c.)\,\frac{F_2}{k_2} + \&c.;$$

that is to say, as is seen on referring to $(B)$,



as before; which proves that the system $(\beta')$ gives the same value for $x$ as the most advantageous system of factors. Moreover, as $(\beta')$ is symmetrical in $x$ and $a$, $y$ and $b$, &c., it is clear that it will also give the most advantageous values for $y$ and the other unknown quantities.
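The identity just proved can be checked numerically. The following sketch is not in the original: the data are invented, and my reading of the systems is an assumption, namely that $(\beta')$ is the set of weighted normal equations and that the conditions $(C)$ for $x$ read $\lambda_1[aa/k] + \lambda_2[ab/k] = 1$, $\lambda_1[ab/k] + \lambda_2[bb/k] = 0$.

```python
import numpy as np

# Hypothetical check: x computed from the weighted normal equations equals
# x computed from the "most advantageous" factors mu_i = (l1*a_i + l2*b_i)/k_i.
A = np.array([[1.0, 2.0],
              [2.0, -1.0],
              [3.0, 1.0],
              [1.0, -2.0]])          # columns: coefficients a_i and b_i (invented)
F = np.array([3.0, 1.0, 5.0, -1.0])  # observed values F_i (invented)
k = np.array([1.0, 4.0, 2.0, 0.5])   # unequal error measures k_i (invented)

N = A.T @ (A / k[:, None])           # matrix of the weighted normal equations
x_normal = np.linalg.solve(N, A.T @ (F / k))[0]   # x from the system (beta')

lam = np.linalg.solve(N, np.array([1.0, 0.0]))    # lambda_1, lambda_2 from (C)
mu = (A @ lam) / k                   # most advantageous factors mu_i
x_factors = mu @ F                   # x = mu_1 F_1 + mu_2 F_2 + &c.

print(np.isclose(x_normal, x_factors))  # the two values of x agree
```

The agreement rests on the symmetry of the normal-equation matrix, exactly as the symmetry argument in the text suggests.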



When the law of probability of error is the same at every observation, $k_1 = k_2 = \&c.$, and $(\beta')$ reduces itself to $(\beta)$, given at p. 18 as the result of the method of least squares. In the general case, it expresses the modification which the method of least squares must undergo when all the observations are not of the same kind, namely, that instead of making the function $\Sigma\,(ax + by + \&c. - F)^2$ a minimum with respect to $x$, $y$, &c., we must substitute for it the function

$$\Sigma\,\frac{(ax + by + \&c. - F)^2}{k},$$

and then proceed as before.
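The modified rule described above is what is now called weighted least squares. A minimal numerical sketch (the data and the NumPy formulation are mine, not the original's): since dividing each residual by $\sqrt{k_i}$ turns the weighted sum into an ordinary sum of squares, one may scale each equation of condition and then "proceed as before."

```python
import numpy as np

# Weighted least squares: minimise  sum_i (a_i x + b_i y - F_i)^2 / k_i.
# All numerical values are invented for illustration.
A = np.array([[1.0, 2.0],
              [2.0, -1.0],
              [3.0, 1.0],
              [1.0, -2.0]])          # coefficients a_i, b_i
F = np.array([3.0, 1.0, 5.0, -1.0])  # observed values F_i
k = np.array([1.0, 4.0, 2.0, 0.5])   # error measures k_i (larger = less precise)

# (ax + by - F)^2 / k = ((ax + by - F)/sqrt(k))^2, so dividing every equation
# by sqrt(k_i) reduces the weighted problem to an ordinary least squares one.
w = 1.0 / np.sqrt(k)
x_y, *_ = np.linalg.lstsq(A * w[:, None], F * w, rcond=None)

# The same result from the weighted normal equations A^T K^{-1} A z = A^T K^{-1} F.
z = np.linalg.solve(A.T @ (A / k[:, None]), A.T @ (F / k))
print(np.allclose(x_y, z))
```

The two routes agree; the scaled-equation form is usually preferred in practice because it avoids forming the normal-equation matrix explicitly.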



Such, in effect, is Laplace's demonstration, except that he 

 supposed the law of error the same at each observation. The 

 form in which I have presented it is wholly unlike his. The 

 introduction of Fourier's theorem enables us to avoid the theory 

 of combinations, and also the use of imaginary symbols. It 

 must be admitted that there are few mathematical investigations 



