324 J. F. ENCKE ON THE METHOD OF LEAST SQUARES. 



bination of two $\Delta$ and one $\Delta'$ in any otherwise arbitrary arrangement

$$= (\varphi\Delta)^2\,\varphi\Delta',$$

whence the above formula is deducible.
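As a numerical check of the product rule just stated, the sketch below multiplies the probabilities of independent errors; the concrete error law `phi` (a normal density with precision `h`) and the particular error values are assumptions chosen purely for illustration, since the argument holds for any probability law $\varphi$.

```python
import math

def phi(delta, h=1.0):
    # An assumed error law, purely illustrative: a normal density
    # with precision h. Any probability law would serve equally well.
    return (h / math.sqrt(math.pi)) * math.exp(-(h * delta) ** 2)

def combination_prob(errors):
    # The probability of a combination of independent errors is the
    # product of the individual probabilities, whatever the arrangement.
    p = 1.0
    for d in errors:
        p *= phi(d)
    return p

d, d_prime = 0.3, 0.7
# Two errors d and one d_prime, in any order, give (phi d)^2 * (phi d').
for arrangement in [(d, d, d_prime), (d, d_prime, d), (d_prime, d, d)]:
    assert math.isclose(combination_prob(arrangement),
                        phi(d) ** 2 * phi(d_prime), rel_tol=1e-12)
```

The loop confirms that the arrangement of the errors is indifferent: only the multiset of errors determines the probability of the combination.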



To obtain the second proposition, let us consider the case in 

 which any observation has given the value of M. Now com- 

 pare together two hypotheses as to x, y, z. Let 



$$\text{Hyp. I.}\quad x = p,\; y = q,\; z = r;$$
$$\text{Hyp. II.}\quad x = p',\; y = q',\; z = r'.$$



Before M is observed, we have no measure of the relative 

probabilities of these two hypotheses, or of any others; therefore, before the observation they must be regarded as equally

probable. But after M has been found, Hyp. I. will give the error $\Delta$ with the probability $\varphi\Delta$, and Hyp. II. will give the error $\Delta'$ with the probability $\varphi\Delta'$. If we denote by m

 the number of cases in which, assuming Hyp. I., M will pro- 

 ceed from it, and by n the number of cases in which, by the 

 same supposition, M will not be obtained, then will 



$$\varphi\Delta = \frac{m}{m+n}.$$



Let m' and n' have the same signification in Hyp. II.; then



$$\varphi\Delta' = \frac{m'}{m'+n'}.$$



But besides these two suppositions, of either Hyp. I. or Hyp. 

II. being the true one, there are also cases in which neither is

 true, and amongst these there may be some which, in certain 

cases, give M. Let the signification of m'' and n'' for all other hypotheses be the same as above; then the number of all possible cases will be = m + n + m' + n' + m'' + n''; therefore the probability of Hyp. I., before the observation is made,



$$= \frac{m+n}{m+n+m'+n'+m''+n''},$$



and that of Hyp. II., before the observation is made, 



$$= \frac{m'+n'}{m+n+m'+n'+m''+n''};$$



these two values must be considered equal, whence it follows 

 that 



$$m + n = m' + n'.$$
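The counting argument can be mirrored in a small sketch using exact rational arithmetic; the case counts below are invented solely for illustration. Equal prior probabilities force the case totals of the two hypotheses to agree, and each $\varphi\Delta$ is then recovered as the fraction of a hypothesis's cases that actually yield M.

```python
from fractions import Fraction

# Hypothetical case counts, invented for illustration:
# m1 cases yield M under Hyp. I, n1 do not; likewise for Hyp. II
# and for all remaining hypotheses taken together.
m1, n1 = 3, 9       # Hyp. I
m2, n2 = 4, 8       # Hyp. II (chosen so that m1 + n1 == m2 + n2)
m3, n3 = 5, 7       # all other hypotheses combined

total = m1 + n1 + m2 + n2 + m3 + n3

# Prior probabilities, before M is observed:
prior_I = Fraction(m1 + n1, total)
prior_II = Fraction(m2 + n2, total)
assert prior_I == prior_II          # equal priors ...
assert m1 + n1 == m2 + n2           # ... force equal case totals

# phi(Delta) and phi(Delta') as fractions of favourable cases:
phi_delta = Fraction(m1, m1 + n1)
phi_delta_prime = Fraction(m2, m2 + n2)
```

`Fraction` keeps the ratios exact, so the identities above hold by arithmetic rather than within floating-point tolerance.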



But after M has actually been found, the cases where it does 



