394 Hypergeometrical Series and Frequency Distributions.



then be the sum of two quantities, of which the one is proportional to S and the other to r - S, thus being a linear function of S.



Note II. — We have shown above that the condition for linear regression is that the following relations between the moments are valid:



\[ \mu_{\alpha,1}\,\mu_{20} = \mu_{\alpha+1,0}\,\mu_{11}, \qquad \mu_{1,\beta}\,\mu_{02} = \mu_{0,\beta+1}\,\mu_{11}. \]
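These moment relations can be checked numerically. The following is a minimal sketch in Python on an illustrative discrete distribution (not from the paper) in which the regression of y on x is exactly linear: x uniform on {0, 1, 2} and, given x, y equal to x - 1 or x + 1 with equal probability, so that E[y | x] = x.

```python
from itertools import product

# Six equally likely points (x, y) with E[y | x] = x, i.e. the
# regression of y on x is exactly linear.  Illustrative only.
points = [(x, x + d) for x, d in product((0, 1, 2), (-1, 1))]
prob = 1.0 / len(points)

mx = sum(prob * x for x, y in points)  # mean of x
my = sum(prob * y for x, y in points)  # mean of y

def mu(a, b):
    """Central product-moment mu_{a,b} = E[(x - mx)^a (y - my)^b]."""
    return sum(prob * (x - mx) ** a * (y - my) ** b for x, y in points)

# Check mu_{a,1} mu_{20} = mu_{a+1,0} mu_{11} for several orders a.
for a in (1, 2, 3):
    lhs = mu(a, 1) * mu(2, 0)
    rhs = mu(a + 1, 0) * mu(1, 1)
    print(a, lhs, rhs, abs(lhs - rhs) < 1e-12)
```

Each relation holds exactly here, since E[y | x] = a + bx gives mu_{a,1} = b mu_{a+1,0} and mu_{11} = b mu_{20}, from which the relation follows for every a.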



Introducing the notations 



\[ \sigma_1^2 = \mu_{20}, \qquad \sigma_2^2 = \mu_{02}, \qquad \rho = \frac{\mu_{11}}{\sigma_1 \sigma_2}, \qquad \rho_{\alpha\beta} = \frac{\mu_{\alpha\beta}}{\sigma_1^{\alpha}\,\sigma_2^{\beta}}, \]

 we may write the conditions thus 



\[ p_{\alpha} = \rho_{\alpha,1} - \rho\,\rho_{\alpha+1,0} = 0, \qquad p'_{\beta} = \rho_{1,\beta} - \rho\,\rho_{0,\beta+1} = 0. \]



When the regression is not linear the quantities p_i and p'_i, or some of them, will not disappear. In another place I shall soon demonstrate that the equations to the curves of regression, when the correlation is only moderately skew, may be expressed in a very convenient form with the aid of the coefficients p_i and p'_i.
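The failure of these quantities to vanish can be illustrated numerically. The sketch below assumes the standardized moments ρ_{αβ} = μ_{αβ}/(σ1^α σ2^β) (without the numerical factors the author alludes to) and takes p_α = ρ_{α,1} − ρ ρ_{α+1,0} and p'_β = ρ_{1,β} − ρ ρ_{0,β+1} as a reconstruction of the text's p_i and p'_i; the joint distribution is an illustrative one whose y-on-x regression is linear while its x-on-y regression is not.

```python
from math import sqrt
from itertools import product

# Illustrative joint distribution: E[y | x] = x is linear in x,
# but E[x | y] is NOT linear in y.
points = [(x, x + d) for x, d in product((0, 1, 2), (-1, 1))]
prob = 1.0 / len(points)
mx = sum(prob * x for x, y in points)
my = sum(prob * y for x, y in points)

def mu(a, b):
    """Central product-moment mu_{a,b}."""
    return sum(prob * (x - mx) ** a * (y - my) ** b for x, y in points)

s1, s2 = sqrt(mu(2, 0)), sqrt(mu(0, 2))  # standard deviations

def rho(a, b):
    # Standardized moment, without the paper's numerical factors.
    return mu(a, b) / (s1 ** a * s2 ** b)

r = rho(1, 1)  # the ordinary coefficient of correlation

for a in (2, 3):
    p_a = rho(a, 1) - r * rho(a + 1, 0)   # vanishes: y-on-x linear
    pp_a = rho(1, a) - r * rho(0, a + 1)  # p'_a: need not vanish
    print(a, round(p_a, 12), round(pp_a, 6))
```

Here p_2 and p_3 come out zero, as the linear y-on-x regression requires; p'_2 also vanishes (the distribution happens to be symmetric enough), but p'_3 does not, reflecting the non-linear regression of x on y.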



By Pearson's definition there is no correlation when the regression is linear and parallel to the axes. Though this definition seems to me not quite sufficient, as it does not necessarily coincide with the definition required from the standpoint of the theory of probability, i. e. that the variates should be separated in the correlation function, it is at any rate the best one to have recourse to when we have no adequate correlation function available. In the sense of Pearson's definition the variates will be independent of each other if all the coefficients ρ, ρ_{30}, ρ_{03}, ρ_{40}, ρ_{04}, &c., are zero.

ρ is the usual coefficient of correlation; as the quantities ρ_{30}, ρ_{03}, ρ_{40}, ρ_{04}, &c., are abstract numbers, independent of any units, I propose that they be called the coefficients of correlation of higher order. The numerical factors are inserted for purposes of which I hope soon to give the explanation.



