PROFESSOR K. PEARSON AND MISS A. LEE ON THE DISTRIBUTION 
of rise and fall at correlated stations are independent of the special relations between
the coefficients which we have selected to illustrate them; they are really a deduction
from the sign of the regression coefficient, or of the coefficient of partial correlation,
which has the same sign.
14. Case (iv.)—Prediction from any Number of Correlated Stations.
The general formulae are given, p. 302 of the memoir above cited, namely:

$$\frac{x_1}{\sigma_1} = -\frac{R_{12}}{R_{11}}\,\frac{x_2}{\sigma_2} - \frac{R_{13}}{R_{11}}\,\frac{x_3}{\sigma_3} - \cdots - \frac{R_{1n}}{R_{11}}\,\frac{x_n}{\sigma_n},$$

with a probable deviation of $0.6745\,\sigma_1\sqrt{R/R_{11}}$, where $R$ is the determinant below

$$R = \begin{vmatrix}
1 & r_{12} & r_{13} & \cdots & r_{1n} \\
r_{21} & 1 & r_{23} & \cdots & r_{2n} \\
r_{31} & r_{32} & 1 & \cdots & r_{3n} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
r_{n1} & r_{n2} & r_{n3} & \cdots & 1
\end{vmatrix}$$

and $R_{11}$ is the minor formed by leaving out the first column and row.
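The prediction formula can be checked numerically. The sketch below uses an illustrative four-station correlation matrix (values assumed for the example, not data from the memoir): the regression coefficients come from the signed cofactors $R_{1j}$ of the correlation determinant, and the probable deviation is $0.6745\,\sigma_1\sqrt{R/R_{11}}$.

```python
import numpy as np

# Illustrative correlation matrix for four stations (assumed values).
r = np.array([
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.3, 0.2],
    [0.5, 0.3, 1.0, 0.3],
    [0.4, 0.2, 0.3, 1.0],
])

def minor(a, i, j):
    """Determinant of a with row i and column j struck out."""
    return np.linalg.det(np.delete(np.delete(a, i, axis=0), j, axis=1))

R = np.linalg.det(r)   # the full determinant R
R11 = minor(r, 0, 0)   # the minor R_11 (first row and column removed)

# Signed cofactors R_1j give the regression coefficients of
#   x_1/sigma_1 = -(R_12/R_11) x_2/sigma_2 - ... - (R_1n/R_11) x_n/sigma_n.
n = r.shape[0]
coeffs = [-((-1) ** j) * minor(r, 0, j) / R11 for j in range(1, n)]

# Probable deviation of the prediction, in units of sigma_1.
prob_dev = 0.6745 * np.sqrt(R / R11)
```

As a cross-check, these cofactor-ratio coefficients agree with the modern normal-equations form: solving the predictors' correlation matrix against their correlations with the predicted station gives the same numbers.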
In order to obtain close prediction, it might be supposed that all that is necessary 
is to take a sufficiency of correlated stations. This is very far from being the case. 
The true test of closeness of prediction is the smallness of $\sqrt{R/R_{11}}$, and this can
often be obtained by a few well-selected stations better than a great number. In 
order to roughly illustrate this, suppose the correlation coefficients of the stations to 
be all of the same order of magnitude, i.e., about $\epsilon$, then the order of $\sqrt{R/R_{11}}$ for
1, 2, 3, 4 ... $n$ stations is given by

$$\sqrt{(1+\epsilon)(1-\epsilon)},\quad \sqrt{\Big(1+\frac{\epsilon}{1+\epsilon}\Big)(1-\epsilon)},\quad \sqrt{\Big(1+\frac{\epsilon}{1+2\epsilon}\Big)(1-\epsilon)},\ \ldots\ \sqrt{\Big(1+\frac{\epsilon}{1+(n-1)\epsilon}\Big)(1-\epsilon)},$$

or the prediction is only increased in certitude in the ratio of $\sqrt{1+\epsilon}$ to 1 by taking
an indefinitely great number of stations, or, since $\epsilon$ cannot be greater than unity, it
can only be increased in the ratio of $\sqrt{2}$ to 1. Our object should accordingly be to
make $\sqrt{R/R_{11}}$ as small as possible* by a fit selection of comparatively few stations
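The order-of-magnitude claim admits a direct numerical check. The toy sketch below (assuming every pairwise correlation equals $\epsilon$, as in the text) computes $\sqrt{R/R_{11}}$ for a growing number of equally correlated stations and shows it falling from $\sqrt{(1+\epsilon)(1-\epsilon)}$ toward the limit $\sqrt{1-\epsilon}$.

```python
import numpy as np

def sqrt_R_over_R11(n_stations, eps):
    """sqrt(R/R_11) when one station is predicted from n_stations others,
    all pairwise correlations equal to eps (toy equicorrelated setting)."""
    m = n_stations + 1            # total number of stations
    c = np.full((m, m), eps)
    np.fill_diagonal(c, 1.0)
    R = np.linalg.det(c)          # full correlation determinant
    R11 = np.linalg.det(c[1:, 1:])  # minor with first row/column removed
    return np.sqrt(R / R11)

eps = 0.5
vals = [sqrt_R_over_R11(k, eps) for k in (1, 2, 3, 50)]
# One station gives sqrt((1+eps)(1-eps)); many stations approach
# sqrt(1-eps), so the gain in certitude is bounded by sqrt(1+eps) <= sqrt(2).
```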
* It is easy to illustrate its vanishing by taking one station equally correlated with $(n-1)$ others ($\rho$),
which are equally correlated among themselves ($r$). In this case we have
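The footnote's configuration can be illustrated numerically. In the sketch below, one station is correlated $\rho$ with each of $n-1$ others that are equally correlated $r$ among themselves; the critical value of $\rho$ at which $\sqrt{R/R_{11}}$ vanishes, namely $\rho^2 = (1+(n-2)r)/(n-1)$, is my own derivation for this setting, not a formula from the text.

```python
import numpy as np

def sqrt_R_over_R11(n, rho, r):
    """sqrt(R/R_11) for one station correlated rho with (n-1) others,
    the others being equally correlated r among themselves."""
    c = np.full((n, n), r)
    np.fill_diagonal(c, 1.0)
    c[0, 1:] = rho
    c[1:, 0] = rho
    R = np.linalg.det(c)
    R11 = np.linalg.det(c[1:, 1:])
    return np.sqrt(max(R, 0.0) / R11)   # clip tiny negative round-off

n, r = 5, 0.5
# rho at which the prediction becomes exact (derived, assumed value):
rho_crit = np.sqrt((1 + (n - 2) * r) / (n - 1))
```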
