


\[
x - n = 0, \qquad x - n' = 0, \qquad x - n'' = 0, \;\ldots
\]

 and each equation of condition is the general expression of the 

 error of the observation in any hypothesis as to x. Conse- 

 quently, if the constant h belong to this set of observations, so 

 that for it 



\[
\varphi(\Delta) = \frac{h}{\sqrt{\pi}} \, e^{-h^2 \Delta^2};
\]



then the general expression for the probability of one error in the 

 first observation will be in every assumption as to x, 



\[
\frac{h}{\sqrt{\pi}} \, e^{-h^2 (x - n)^2};
\]



and the joint probability of the concurrence of m errors in 

 these observations will be 



\[
\frac{h^m}{\pi^{m/2}} \, e^{-h^2 \left\{ (x - n)^2 + (x - n')^2 + (x - n'')^2 + \ldots \right\}}.
\]
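The foregoing expressions may be made concrete by a short computation. In the following Python sketch the modulus h and the three observed values are assumed merely for illustration, and stand for the n, n', n'' of the text:

import math

h = 1.0                     # modulus of precision, assumed for illustration
obs = [4.2, 3.9, 4.1]       # hypothetical observations n, n', n''

def phi(delta):
    # probability law of a single error: (h / sqrt(pi)) * e^(-h^2 * delta^2)
    return (h / math.sqrt(math.pi)) * math.exp(-(h * delta) ** 2)

def joint_probability(x):
    # product of the m single-error probabilities for the errors x - n, x - n', ...
    p = 1.0
    for n in obs:
        p *= phi(x - n)
    return p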



This probability will be greatest when the sum of the squares of the errors remaining according to an adopted hypothesis is the least possible; and consequently we have



Proposition II. — That hypothesis as to x in which the sum of 

 the squares of the remaining errors is an absolute minimum, is the 

 most probable of all possible hypotheses. 



This minimum may be obtained either by the differential cal- 

 culus, by which 



\[
2(x - n) + 2(x - n') + 2(x - n'') + \ldots = 0,
\]



or 



\[
x = \frac{n + n' + n'' + \ldots}{m};
\]



thus the arithmetical mean, as was before laid down, is the most probable value in equally good observations.
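That the differential condition resolves into the arithmetical mean may be confirmed in a few lines; the observations are again assumed for illustration only:

obs = [4.2, 3.9, 4.1]       # hypothetical observations, as before
m = len(obs)
x = sum(obs) / m            # x = (n + n' + n'' + ...) / m
# the condition 2(x - n) + 2(x - n') + ... = 0 holds at the mean
assert abs(sum(2 * (x - n) for n in obs)) < 1e-9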

But when x is left undetermined, we may give to the sum of the squares of the errors such a quadratic form that both the most probable value of x and the remaining minimum sum of the squares of the errors may at once proceed therefrom. For the sake of brevity we will designate the sum



\[
n + n' + n'' + \ldots \quad \text{by} \quad [n],
\]
\[
n^2 + n'^2 + n''^2 + \ldots \quad \text{by} \quad [nn].
\]
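The bracket sums may be illustrated by anticipating the quadratic form just described. The identity, which is elementary algebra though the sequel may develop it in its own manner, is that the sum of the squares equals m(x - [n]/m)^2 + [nn] - [n]^2/m; the data below are invented, as before:

obs = [4.2, 3.9, 4.1]                 # hypothetical observations, as before
m = len(obs)
bracket_n = sum(obs)                  # [n]  = n + n' + n'' + ...
bracket_nn = sum(n * n for n in obs)  # [nn] = n^2 + n'^2 + n''^2 + ...
x = bracket_n / m                     # the most probable value of x
minimum = bracket_nn - bracket_n ** 2 / m     # the remaining minimum
assert abs(minimum - sum((x - n) ** 2 for n in obs)) < 1e-9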



This mode of designation will be extended in the sequel to any 



