84 Mr. Ivory on the Method of the Least Squares. 



vantageous, or the most probable, solution is when the sum of the squares of the errors is a minimum. Hence

\[ \psi = e^2 + e'^2 + e''^2 + \&c. ; \]



consequently

\[ \varphi(e^2) = k\,c^{-h^2 e^2}, \]

c being the base of the hyperbolic logarithms. It has been



shown that the integral

\[ \int de\,\varphi(e^2) = k \int de\,c^{-h^2 e^2}, \]

taken between the limits \(\pm\infty\), must be equal to unity: and



hence

\[ k \int de\,c^{-h^2 e^2} = \frac{k}{h}\sqrt{\pi} = 1; \]

wherefore \( k = \frac{h}{\sqrt{\pi}} \), and

\[ \varphi(e^2) = \frac{h}{\sqrt{\pi}}\,c^{-h^2 e^2}. \]
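The normalization just derived can be checked numerically. The following sketch (in modern notation, with an arbitrary assumed value for \(h\)) integrates \( \frac{h}{\sqrt{\pi}}\,c^{-h^2 e^2} \) over all errors by simple quadrature and confirms the result is unity:

```python
import math

def phi(e, h):
    # Ivory's error law: phi(e^2) = (h / sqrt(pi)) * c^(-h^2 e^2),
    # c being the base of the hyperbolic (natural) logarithms
    return (h / math.sqrt(math.pi)) * math.exp(-(h * e) ** 2)

def integrate(f, a, b, n=100_000):
    # plain midpoint rule; adequate for this smooth, rapidly decaying integrand
    w = (b - a) / n
    return w * sum(f(a + (i + 0.5) * w) for i in range(n))

h = 1.7  # an assumed precision constant, purely for illustration
# the tails beyond |e| = 10/h are negligibly small
total = integrate(lambda e: phi(e, h), -10 / h, 10 / h)
print(total)  # approximately 1: the probabilities of all errors sum to unity
```

The same check holds for any positive \(h\), since the factor \(h/\sqrt{\pi}\) exactly cancels the value \(\sqrt{\pi}/h\) of the Gaussian integral.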



In order to determine h we must employ another considera- 

 tion. The integral 



\[ \int e^2\,de\,\varphi(e^2) = \frac{h}{\sqrt{\pi}} \int e^2\,de\,c^{-h^2 e^2}, \]



taken between the limits \(\pm\infty\), is equal to the sum of the squares of the errors multiplied by their respective probabilities; and it is therefore the limit to which the mean of the sum of the squares of the errors converges as the number of the observations increases. Now the integral is equal to



\(\frac{1}{2h^2}\); and, if we denote by \(e, e', e''\), &c., the errors of the most advantageous solution, or those of which the sum of the squares is a minimum, we shall have very nearly, when the number of observations is great,

\[ \frac{1}{2h^2} = \frac{e^2 + e'^2 + e''^2 + \&c.}{n}, \]



n being the number of the errors. Hence, employing the summatorial prefix \(S\), we get

\[ h = \sqrt{\frac{n}{2\,S e^2}}. \]
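Both conclusions can be checked by simulation. In modern terms, an error law of precision \(h\) is a normal distribution of standard deviation \(1/(h\sqrt{2})\); the sketch below (with an assumed value of \(h\), chosen only for illustration) draws a large number of errors, confirms that the mean of their squares approaches \(1/(2h^2)\), and recovers \(h\) from the formula \(h = \sqrt{n/(2\,S e^2)}\):

```python
import math
import random

random.seed(1)

h_true = 1.7                             # assumed precision constant
sigma = 1.0 / (h_true * math.sqrt(2.0))  # equivalent normal standard deviation

n = 200_000
errors = [random.gauss(0.0, sigma) for _ in range(n)]

# S e^2: the sum of the squared errors (S the summatorial prefix)
S_e2 = sum(e * e for e in errors)

# the mean of the squared errors converges on 1 / (2 h^2)
print(S_e2 / n, 1.0 / (2.0 * h_true ** 2))

# wherefore h = sqrt(n / (2 S e^2)) estimates the precision of the observations
h_est = math.sqrt(n / (2.0 * S_e2))
print(h_est)  # close to h_true
```

The estimate sharpens as \(n\) grows, in keeping with the text's remark that the relation holds "very nearly when the number of observations is great".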



Thus every thing is known in the function expressing the 

 probability of an error. The quantity h may be considered 



as measuring the precision of the observations. For \(\frac{1}{h^2}\) is proportional to \(\frac{S e^2}{n}\); and as the latter quantity is independent



of 



