L. ISSERLIS 
Putting $t = z$ we deduce from (17)
$$\text{Mean value of } dp_{xy}\,dp_{z^2} = (p_{xyz^2} - p_{xy}\,p_{z^2})/N \qquad (18),$$
and putting $y = x$ in this result,
$$\text{Mean value of } dp_{x^2}\,dp_{z^2} = (p_{x^2z^2} - p_{x^2}\,p_{z^2})/N \qquad (19).$$
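Formula (18) is easy to check by simulation. The sketch below is ours, not the paper's: it uses an arbitrary trivariate normal population (the covariances 0.5, 0.3, 0.4 are illustrative choices), takes the product-moments about the true mean as in the text's footnote, and compares the mean of $dp_{xy}\,dp_{z^2}$ over many samples with $(p_{xyz^2}-p_{xy}p_{z^2})/N$. For a normal population $p_{xyz^2} = p_{xy}p_{z^2} + 2p_{xz}p_{yz}$, so the right-hand side can be written down exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: trivariate normal with unit variances
# (our choice, not the paper's); origin taken at the mean.
M, N = 200_000, 10                       # M samples, each of N observations
cov = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])
data = rng.multivariate_normal(np.zeros(3), cov, size=(M, N))
x, y, z = data[..., 0], data[..., 1], data[..., 2]

# Sample product-moments p'_xy and p'_{z^2}, one value per sample
p_xy_s = (x * y).mean(axis=1)
p_z2_s = (z * z).mean(axis=1)

# Population moments: p_xy and p_{z^2} from cov; for a normal population
# the fourth-order moment is p_{xyz^2} = p_xy p_{z^2} + 2 p_xz p_yz.
p_xy, p_z2 = cov[0, 1], cov[2, 2]
p_xyz2 = p_xy * p_z2 + 2 * cov[0, 2] * cov[1, 2]

lhs = ((p_xy_s - p_xy) * (p_z2_s - p_z2)).mean()   # mean of dp_xy dp_z^2
rhs = (p_xyz2 - p_xy * p_z2) / N                   # formula (18)
print(lhs, rhs)                                    # agree to sampling error
```

The agreement here is exact in expectation (not merely first-order) because the moments are taken about the population mean, so $p'_{xy}$ and $p'_{z^2}$ are plain sample means of $xy$ and $z^2$.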
If we multiply (10) and (11), sum for all samples and divide by the number of samples we deduce
$$
\frac{N\,\overline{dr_{xy}\,dr_{zt}}}{r_{xy}\,r_{zt}}
= \frac{p_{xyzt}-p_{xy}p_{zt}}{p_{xy}p_{zt}}
- \frac{1}{2}\,\frac{p_{xyz^2}-p_{xy}p_{z^2}}{p_{xy}p_{z^2}}
- \frac{1}{2}\,\frac{p_{xyt^2}-p_{xy}p_{t^2}}{p_{xy}p_{t^2}}
- \frac{1}{2}\,\frac{p_{x^2zt}-p_{x^2}p_{zt}}{p_{x^2}p_{zt}}
- \frac{1}{2}\,\frac{p_{y^2zt}-p_{y^2}p_{zt}}{p_{y^2}p_{zt}}
+ \frac{1}{4}\,\frac{p_{x^2z^2}-p_{x^2}p_{z^2}}{p_{x^2}p_{z^2}}
+ \frac{1}{4}\,\frac{p_{x^2t^2}-p_{x^2}p_{t^2}}{p_{x^2}p_{t^2}}
+ \frac{1}{4}\,\frac{p_{y^2z^2}-p_{y^2}p_{z^2}}{p_{y^2}p_{z^2}}
+ \frac{1}{4}\,\frac{p_{y^2t^2}-p_{y^2}p_{t^2}}{p_{y^2}p_{t^2}}
\qquad (20).
$$
This result, like Sheppard's formula for $\sigma_r$, is much simpler when expressed in reduced moments. Let us write
$$q_{x^l y^m z^n t^p} = \frac{p_{x^l y^m z^n t^p}}{\sigma_x^{\,l}\,\sigma_y^{\,m}\,\sigma_z^{\,n}\,\sigma_t^{\,p}},$$
so that $q_{x^2}$ is unity and $q_{xy} = r_{xy}$. The numerical term in (20) is
$$-1 + \tfrac{1}{2}(4) + \tfrac{1}{4}(-4),$$
or zero, hence
$$\frac{N\,\overline{dr_{xy}\,dr_{zt}}}{r_{xy}\,r_{zt}}
= \frac{q_{xyzt}}{q_{xy}\,q_{zt}}
- \frac{1}{2}\left(\frac{q_{xyz^2} + q_{xyt^2}}{q_{xy}} + \frac{q_{x^2zt} + q_{y^2zt}}{q_{zt}}\right)
+ \frac{1}{4}\left(q_{x^2z^2} + q_{x^2t^2} + q_{y^2z^2} + q_{y^2t^2}\right)
\qquad (21).$$
In the same notation Sheppard's formula becomes
$$\frac{N\sigma_r^2}{r_{xy}^2}
= \frac{q_{x^2y^2}}{q_{xy}^2}
- \frac{q_{x^3y} + q_{xy^3}}{q_{xy}}
+ \frac{1}{4}\left(q_{x^4} + q_{y^4} + 2q_{x^2y^2}\right)
\qquad (22)^*.$$

To find the correlation between $r_{xy}$ and $r_{xz}$ we have only to replace $t$ by $x$ in (21), thus
$$\frac{N\,\overline{dr_{xy}\,dr_{xz}}}{r_{xy}\,r_{xz}}
= \frac{q_{x^2yz}}{q_{xy}\,q_{xz}}
- \frac{1}{2}\left(\frac{q_{xyz^2} + q_{x^3y}}{q_{xy}} + \frac{q_{x^3z} + q_{xy^2z}}{q_{xz}}\right)
+ \frac{1}{4}\left(q_{x^4} + q_{x^2y^2} + q_{x^2z^2} + q_{y^2z^2}\right)
\qquad (23).$$
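Sheppard's formula for $\sigma_r$ admits a direct numerical check in the normal case. For a bivariate normal population with unit variances and correlation $\rho$, the reduced moments are $q_{x^2y^2} = 1 + 2\rho^2$, $q_{x^3y} = q_{xy^3} = 3\rho$, $q_{x^4} = q_{y^4} = 3$, and the right-hand side collapses to $(1-\rho^2)^2/\rho^2$, i.e. $\sigma_r = (1-\rho^2)/\sqrt{N}$, the familiar first-order result. The following sketch (the values of $\rho$, $N$ and the sample count are our illustrative choices) compares this with the Monte Carlo variance of the sample correlation coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, N, M = 0.5, 100, 40_000             # our choices, not the paper's

# Reduced moments of a bivariate normal population with unit variances
q_xy   = rho
q_x2y2 = 1 + 2 * rho**2
q_x3y  = q_xy3 = 3 * rho
q_x4   = q_y4 = 3.0

# Right-hand side of Sheppard's formula, i.e. N var(r) / r^2
sheppard = (q_x2y2 / q_xy**2
            - (q_x3y + q_xy3) / q_xy
            + 0.25 * (q_x4 + q_y4 + 2 * q_x2y2))
var_theory = sheppard * rho**2 / N       # reduces to (1 - rho^2)^2 / N

# Monte Carlo: variance of r over M samples of size N
cov = np.array([[1.0, rho], [rho, 1.0]])
data = rng.multivariate_normal(np.zeros(2), cov, size=(M, N))
xc = data[..., 0] - data[..., 0].mean(axis=1, keepdims=True)
yc = data[..., 1] - data[..., 1].mean(axis=1, keepdims=True)
r = (xc * yc).sum(axis=1) / np.sqrt((xc**2).sum(axis=1) * (yc**2).sum(axis=1))
print(var_theory, r.var())               # close for large N
```

Since the formula is a first-order (large-$N$) result, the Monte Carlo variance differs from it by terms of order $1/N^2$, visible only at the percent level here.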
(3) These correlation coefficients will simplify if the regression be linear and 
simplify to a considerable extent if at the same time the distribution be normal. 
For with linear regression
$$N p_{x^2yz} = \sum_x \sum_y \sum_z n_{xyz}\,x^2 y z\,\dagger = \sum_x \sum_y n_{xy}\,x^2 y\,\bar{z}_{xy},$$
where $\bar{z}_{xy}$ is the mean value of $z$ for given values of $x$ and $y$.
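The reduction this affords can be sketched as follows. If the linear regression of $z$ on $x$ and $y$ is written $\bar{z}_{xy} = b_1 x + b_2 y$ (the coefficients $b_1$, $b_2$ are our notation, not symbols from the text), substitution in the double sum gives
$$N p_{x^2yz} = \sum_x \sum_y n_{xy}\,x^2 y\,(b_1 x + b_2 y) = b_1\,N p_{x^3y} + b_2\,N p_{x^2y^2},$$
so that $p_{x^2yz} = b_1 p_{x^3y} + b_2 p_{x^2y^2}$: the fourth-order moment involving $z$ reduces to product-moments of $x$ and $y$ alone.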
* For the denominator of the left-hand side, cf. Biometrika, Vol. ix. p. 4.
† The origin being taken at the mean.
