J. F. ENCKE ON THE METHOD OF LEAST SQUARES. 325 



not result are excluded; consequently, in reference to the observed value M, the relative probability of Hyp. I. is

\[ \frac{m}{m + m' + m''}, \]

and that of Hyp. II.

\[ \frac{m'}{m + m' + m''}; \]

or they are to each other as \( m : m' \); and, in consequence of the equation \( m + n = m' + n' \), as

\[ \frac{m}{m + n} : \frac{m'}{m' + n'}, \]

or as \( \varphi\Delta : \varphi\Delta' \). Hence follows the proposition:



II. 



The probabilities of two hypotheses, which are equally probable
before the observation is made, and which exclude each other,
are directly proportional to the probabilities of the errors, or
systems of errors, proceeding from them.
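This proposition can be illustrated with a short numerical sketch (a modern addition, not part of the original text): assuming, purely for illustration, a Gaussian error law for φΔ, the posterior odds of two equally probable, mutually exclusive hypotheses reduce to the ratio φΔ : φΔ' of the error probabilities.

```python
import math

def phi(delta, sigma=1.0):
    """Illustrative error law: Gaussian density of an error delta.
    (The argument above holds for any probability law phi.)"""
    return math.exp(-delta**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Observed value M and the values predicted by Hyp. I and Hyp. II
# (all numbers here are hypothetical, chosen for the sketch).
M = 10.3
V1, V2 = 10.0, 11.0            # predictions under the two hypotheses
d1, d2 = M - V1, M - V2        # the residual errors Delta, Delta'

# Equal prior probability for each hypothesis, as the proposition requires.
prior = 0.5

# Posterior probabilities by the elementary rule of inverse probability.
p1 = prior * phi(d1)
p2 = prior * phi(d2)
total = p1 + p2
post1, post2 = p1 / total, p2 / total

# The posterior odds equal the likelihood ratio phi(Delta) : phi(Delta').
assert abs(post1 / post2 - phi(d1) / phi(d2)) < 1e-12
```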



Consequently, if the magnitudes M are found by a kind of
observation of which it is by other means known what errors
may occur in it, and in what proportion, or for which the law
of the probability of the errors \( \varphi\Delta \) is known, (which is in-
dependent of the use to be afterwards made of these observa-
tions for determining one or more unknown values,) then the
probability of each hypothesis as to x, y, z, is proportional to
the product

\[ \varphi\Delta \cdot \varphi\Delta' \cdot \varphi\Delta'' \cdot \varphi\Delta''' = \Omega, \qquad (3.) \]



where \( \Delta, \Delta', \Delta'', \Delta''' \) are the errors which remain over in each
hypothesis. The most probable hypothesis will be that in
which \( \Omega \) is a maximum, or in which, in differentiating, \( d\Omega \)
becomes \( = 0 \). On account of the mutual independence of the
quantities x, y, z, this equation divides itself into the separate
equations

\[ \frac{d\Omega}{dx} = 0, \qquad \frac{d\Omega}{dy} = 0, \qquad \frac{d\Omega}{dz} = 0. \]
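As a numerical illustration of this maximum condition (again a modern sketch; the Gaussian error law, the single unknown x with V = x for every observation, and the observed values are all assumptions, not drawn from the text), the value of x that makes Ω a maximum turns out to be the arithmetic mean of the observations, which is the least-squares result:

```python
import math

def phi(delta, sigma=1.0):
    # Gaussian error law, an assumption of this sketch.
    return math.exp(-delta**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def omega(x, observations):
    """Omega = product of phi(Delta) over all residuals Delta = M - x."""
    prod = 1.0
    for M in observations:
        prod *= phi(M - x)
    return prod

obs = [10.1, 9.8, 10.4, 10.0]          # hypothetical observed values M, M', ...

# Coarse grid search for the x that makes Omega a maximum.
candidates = [i / 1000 for i in range(9000, 11000)]
best = max(candidates, key=lambda x: omega(x, obs))

mean = sum(obs) / len(obs)             # the least-squares solution
assert abs(best - mean) < 1e-3
```

The grid search stands in for setting dΩ/dx = 0; with a Gaussian φ that condition has the mean as its exact solution.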



Generally, each 



\[ \Delta = M - V. \]



If consequently, before the substitution of a numerical value for
x, y, z, the functions M − V be designated by v, so that

\[ M - V = v, \quad M' - V' = v', \quad M'' - V'' = v'', \ \&c.; \]

 and if, for the sake of easier differentiation, we make 



\[ \log \Omega = \log \varphi\Delta + \log \varphi\Delta' + \log \varphi\Delta'' + \log \varphi\Delta''' \]
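The convenience of the logarithm can be seen in a small sketch (a modern addition; the Gaussian error law and the numbers are assumptions): the product Ω becomes a sum, and for the Gaussian law maximizing log Ω is the same as minimizing the sum of the squared errors.

```python
import math

def phi(delta, sigma=1.0):
    # Gaussian error law, assumed only for illustration.
    return math.exp(-delta**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

residuals = [0.1, -0.2, 0.05]          # hypothetical errors Delta, Delta', Delta''

# Omega as a product, and log Omega as the corresponding sum.
omega = math.prod(phi(d) for d in residuals)
log_omega = sum(math.log(phi(d)) for d in residuals)
assert abs(math.log(omega) - log_omega) < 1e-12

# For the Gaussian law (sigma = 1), log Omega differs from
# -(sum of squares)/2 only by a constant, so maximizing log Omega
# is minimizing the sum of the squared errors.
const = -len(residuals) * math.log(math.sqrt(2 * math.pi))
sum_sq = sum(d**2 for d in residuals)
assert abs(log_omega - (const - sum_sq / 2)) < 1e-12
```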



