This is borne out by the next stage, in which variables X1 and X4 are
consistently present, but now the added contributions by the other three
variables do not cover a very wide range. Thus, the three strongest
combinations differ by only about 1%. Similarly, combinations of four
Xs at a time show a limited range of contributions, with perhaps a sug-
gestion that variables X2 (wave period) and X10 (water depth) are slightly
stronger than angle of wave approach, X9.



In considering the implications of the analysis, we may estimate
the relative importance of the several variables in the following way:
select X1 first, with 63.1% of the total sum of squares of Y attributed
to it. Then, because the gap between the pair (X1, X4) and the next
competitor (X1, X10) is about 4%, choose X4 as the second strongest
variable. Its contribution, in the presence of X1, is (74.1 - 63.1) =
11.0%. From here on the choice is less clear, but if we tentatively
accept wave period (X2) as the third strongest, we obtain a contribution
of (75.9 - 74.1) = 1.8% from it. Similarly, if we tentatively accept
water depth (X10) as the fourth strongest, we obtain (78.1 - 75.9) =
2.2% from it. Lastly, the contribution of angle of wave approach (X9)
is found by the relation (78.7 - 78.1) = 0.6%. The relatively small
contributions attributable to X2, X9, and X10 suggest that, except for
the two strongest variables, there is little to choose from among the
other three, which add on the average about 1.5% each, in contrast to
63% by mean grain size and 11% by wave height.
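The bookkeeping in the paragraph above can be written out as a short sketch. The cumulative percent-SS figures are those quoted in the text; differencing successive entries gives each variable's added contribution in the presence of the variables chosen before it. (This is an illustrative Python rendering, not the programs used in the study.)

```python
# Cumulative percent of the sum of squares of Y, in the order the
# variables are selected in the text above.
cumulative = [
    ("X1 (mean grain size)",        63.1),
    ("X4 (wave height)",            74.1),
    ("X2 (wave period)",            75.9),
    ("X10 (water depth)",           78.1),
    ("X9 (angle of wave approach)", 78.7),
]

# Each variable's added contribution is the difference between
# successive cumulative figures.
prev = 0.0
for name, cum in cumulative:
    print(f"{name}: adds {cum - prev:.1f}%")
    prev = cum
```

The printed increments (63.1, 11.0, 1.8, 2.2, 0.6) reproduce the figures quoted above, including the roughly 1.5% average for the three weakest variables.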



Examination of the weakest variable in the set is also illuminating.
As table C shows, wave period "accounts for" only 1.1% of the sum of
squares of Y. Yet on the strictly least-squares basis of choice used
here, it becomes of about equal rank with X9 and X10, which individually
contribute something more than 3% when taken alone.



The IBM 1620 and 709 programs used in this study compute the linear
coefficients and the sums-of-squares reduction for all combinations of Xs,
as stated, and the output lists the Xs involved and the corresponding per-
cent SS reduction. If intermediate output (such as the coefficients) is
desired, the program produces this by way of a control card. Details are
given in Krumbein, Benson, and Hempkins (1964).
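The all-combinations computation those programs perform can be sketched in modern terms: for every subset of predictors, fit a least-squares linear model and report the percent reduction in the sum of squares of Y. The data below are invented purely for illustration; only the procedure corresponds to the text.

```python
from itertools import combinations
import numpy as np

def percent_ss_reduction(X, y, subset):
    """Percent of the total (corrected) sum of squares of y removed
    by a least-squares linear fit on the columns of X in `subset`."""
    # Design matrix: intercept column plus the chosen predictors.
    A = np.column_stack([np.ones(len(y))] + [X[:, j] for j in subset])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residual_ss = np.sum((y - A @ coef) ** 2)
    total_ss = np.sum((y - y.mean()) ** 2)
    return 100.0 * (1.0 - residual_ss / total_ss)

# Invented data: four X variables, with y driven mainly by the first two.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=30)

# List every combination of one and two Xs with its percent SS reduction,
# analogous to the tabulated program output described above.
for k in (1, 2):
    for subset in combinations(range(4), k):
        print(subset, round(percent_ss_reduction(X, y, subset), 1))
```

Because adding a predictor can never increase the residual sum of squares of a least-squares fit, the percent reduction for a pair is always at least that of either member taken alone, which is why the strongest single variable heads every strong combination in the tables discussed earlier.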



Implications of Linear Regression Analysis 



An important aspect of the present method of analysis is that
the general linear model, as used here, examines only the linear re-
lations among the variables, although the model itself can be extended
to some non-linear cases, provided that the b's always remain linear.
It is sometimes instructive to examine the matrix of linear correlation
coefficients along with the multiple regression output, to see whether
additional light is shed on the regression by the linear relations
(interlock) among the various Xs used in the study.
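The companion check suggested above can be sketched as follows: compute the matrix of Pearson product-moment correlations among the Xs, whose off-diagonal entries expose the interlock (intercorrelation) among predictors. The data are invented, with one pair deliberately interlocked.

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=50)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=50)   # strongly interlocked with x1
x3 = rng.normal(size=50)                     # essentially independent

# Pearson product-moment correlation matrix; rowvar=False treats
# each column as a variable and each row as an observation.
R = np.corrcoef(np.column_stack([x1, x2, x3]), rowvar=False)
print(np.round(R, 2))
```

A high off-diagonal entry (here between x1 and x2) warns that the two variables carry overlapping information, so the second of the pair will add little to the percent SS reduction once the first is in the regression.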



Table D is the upper right half of the Pearson product-moment- 



