FISHERY BULLETIN: VOL. 77, NO. 4





\[
= -\frac{N}{2\sigma^{2}} + \frac{S(l_\infty, K, t_0)}{2\sigma^{4}} = 0,
\qquad
\hat{\sigma}^{2} = S(\hat{l}_\infty, \hat{K}, \hat{t}_0)/N.
\]



Thus the problem of ML estimation for the von Bertalanffy curve reduces to finding LS estimates of (l∞, K, t₀). Note that this is a general property of the normal error model, since no special properties of the von Bertalanffy curve have been used.
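The reduction can be made explicit by concentrating the likelihood: substituting σ̂² = S(l∞, K, t₀)/N back into the log likelihood gives (a standard step, sketched here rather than quoted from the original)

\[
\ln L(l_\infty, K, t_0, \hat{\sigma}^{2})
= -\frac{N}{2}\ln\!\left(\frac{2\pi\, S(l_\infty, K, t_0)}{N}\right) - \frac{N}{2},
\]

which is a decreasing function of S(l∞, K, t₀) alone, so maximizing the likelihood over (l∞, K, t₀) is equivalent to minimizing the sum of squares.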



The normal equations for finding ML estimates are obtained by taking the partial derivatives of S(l∞, K, t₀) with respect to the unknown parameters and setting the results equal to zero (i.e., ∂S/∂l∞ = 0; ∂S/∂K = 0; ∂S/∂t₀ = 0). Because these equations do not allow a simple solution, the graphical Ford-Walford method (Ricker 1975; actually the regression of l_{t+1} on l_t) has been widely used. The Ford-Walford plot, in addition to a plot of average length at age, should be adequate for determining the age range following the von Bertalanffy curve.
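As an illustration, the Ford-Walford regression is easily computed with ordinary LS. The mean lengths at age below are hypothetical, and the parameter recovery (K = −ln b, l∞ = a/(1 − b) from slope b and intercept a) assumes ages spaced one unit apart:

```python
import numpy as np

def ford_walford(ages, mean_len):
    """Ford-Walford estimates of von Bertalanffy parameters.

    Regresses mean length at age t+1 on mean length at age t;
    ages are assumed equally spaced, one unit apart.
    """
    lt = np.asarray(mean_len[:-1], dtype=float)
    lt1 = np.asarray(mean_len[1:], dtype=float)
    # slope b = exp(-K); intercept a = l_inf * (1 - exp(-K))
    b, a = np.polyfit(lt, lt1, 1)
    K = -np.log(b)
    l_inf = a / (1.0 - b)
    # t0 from rearranging mu(t) = l_inf * (1 - exp(-K * (t - t0))),
    # averaged over ages
    t = np.asarray(ages, dtype=float)
    t0 = np.mean(t + np.log(1.0 - np.asarray(mean_len) / l_inf) / K)
    return l_inf, K, t0

# hypothetical mean lengths at ages 1..6
ages = [1, 2, 3, 4, 5, 6]
mean_len = [25.0, 42.0, 55.0, 64.0, 71.0, 76.0]
l_inf, K, t0 = ford_walford(ages, mean_len)
```

For these made-up data the estimates come out near l∞ ≈ 90, K ≈ 0.3, t₀ ≈ 0, which is the sort of rough starting point the method is meant to provide.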



For the von Bertalanffy curve, proper ML estimates can only be found using iterative algorithms. A number of authors (Stevens 1951; Tomlinson and Abramson 1961; Allen 1966) have suggested specialized algorithms. Although these algorithms may have advantages when computers are not available, the easiest way to obtain ML estimates is to use any of the general purpose nonlinear LS computer programs available in BMD (Dixon 1976), BMDP (Dixon 1977), or SPSS (Nie et al. 1975). These programs have the flexibility of allowing complicated curves to be fit to data sets, which is especially useful if differences in growth curves among different populations are to be tested statistically. For example, it might be necessary to fit different growth curves to several populations, but with the constraint that the t₀'s be equal.
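The shared-t₀ constraint mentioned above can be imposed in a modern nonlinear LS routine simply by giving both populations a single t₀ parameter. The data and `scipy.optimize.least_squares` setup below are a hypothetical sketch, not one of the programs cited in the text:

```python
import numpy as np
from scipy.optimize import least_squares

def vb(t, l_inf, K, t0):
    """von Bertalanffy mean length at age t."""
    return l_inf * (1.0 - np.exp(-K * (t - t0)))

# hypothetical age/length data for two populations
t1 = np.array([1, 2, 3, 4, 5, 6], dtype=float)
l1 = np.array([24.0, 41.0, 54.0, 65.0, 70.0, 77.0])
t2 = np.array([1, 2, 3, 4, 5, 6], dtype=float)
l2 = np.array([30.0, 48.0, 60.0, 68.0, 73.0, 77.0])

def residuals(p):
    # p = (l_inf1, K1, l_inf2, K2, t0): separate l_inf and K
    # for each population, but one t0 shared by both
    l_inf1, K1, l_inf2, K2, t0 = p
    return np.concatenate([l1 - vb(t1, l_inf1, K1, t0),
                           l2 - vb(t2, l_inf2, K2, t0)])

fit = least_squares(residuals, x0=[90.0, 0.3, 90.0, 0.4, 0.0])
l_inf1, K1, l_inf2, K2, t0 = fit.x
```

Dropping the constraint (a separate t₀ for each population) and comparing residual sums of squares is the usual route to testing such differences statistically.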



It should be remembered that LS solutions obtained iteratively may be local rather than global minimizations of S(l∞, K, t₀). With this in mind, initial values provided to any iterative procedure must represent the best available information. I recommend that the Ford-Walford method be used to calculate initial values. This guarantees that any LS solution which is obtained has a smaller residual sum of squares than the Ford-Walford estimates.



LEAST SQUARES METHODS OF ESTIMATION



Under differing assumptions on the error variance, four different LS methods of estimation are appropriate. When these assumptions are met, each method provides ML estimates under the likelihood model (Equation (1)).



Let l_ij be the length of the jth individual of age t_i, and let l̄_i and s_i² be the sample mean and sample variance of the lengths of individuals of age t_i, based on a sample size n_i. For each method, the assumption on the error variance and the appropriate sum of squares to be minimized when this assumption is correct is given below.



(a) All l_ij have constant variance:

\[
\sum_i \sum_j \left(l_{ij} - \mu(t_i)\right)^2.
\]

(b) All l̄_i have constant variance:

\[
\sum_i \left(\bar{l}_i - \mu(t_i)\right)^2.
\]



(c) The variance of l̄_i varies with t_i, and at age t_i is equal to σ_i²:

\[
\sum_i (n_i/s_i^{2})\left(\bar{l}_i - \mu(t_i)\right)^2.
\]



(d) All l_ij have constant variance (i.e., the same assumption as (a) above):

\[
\sum_i n_i\left(\bar{l}_i - \mu(t_i)\right)^2.
\]
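To make the four criteria concrete, here is a sketch with hypothetical grouped length-at-age data and a trial parameter vector; the within-group decomposition ss_a = ss_d + Σ(n_i − 1)s_i² serves as a check that (a) and (d) differ only by a term not involving the parameters:

```python
import numpy as np

def vb(t, l_inf, K, t0):
    """von Bertalanffy mean length at age t."""
    return l_inf * (1.0 - np.exp(-K * (t - t0)))

# hypothetical samples of individual lengths, grouped by age
ages = np.array([1.0, 2.0, 3.0])
samples = [np.array([23.0, 26.0, 25.5]),          # age 1
           np.array([40.0, 43.0, 42.5, 41.0]),    # age 2
           np.array([54.0, 56.0, 53.0])]          # age 3

n = np.array([len(s) for s in samples])           # n_i
lbar = np.array([s.mean() for s in samples])      # sample means
s2 = np.array([s.var(ddof=1) for s in samples])   # sample variances

theta = (90.0, 0.3, 0.0)                          # trial (l_inf, K, t0)
mu = vb(ages, *theta)

ss_a = sum(((s - m) ** 2).sum() for s, m in zip(samples, mu))  # (a) individuals
ss_b = ((lbar - mu) ** 2).sum()                                # (b) unweighted means
ss_c = ((n / s2) * (lbar - mu) ** 2).sum()                     # (c) variance-weighted
ss_d = (n * (lbar - mu) ** 2).sum()                            # (d) n-weighted means
```

Because the within-group scatter Σ(n_i − 1)s_i² does not depend on (l∞, K, t₀), minimizing (a) over the parameters is equivalent to minimizing (d), which is why the two share the same variance assumption.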



The dependent variables in methods (a) and (b) are formally of the form described by the likelihood model (Equation (1)). By this it is meant that they have constant variance, and assuming normality and independence, their likelihood is described by Equation (1). The dependent variables in methods (c) and (d) can be transformed into the form of Equation (1). This can be done by placing the weights w_i = (n_i/s_i²) (method (c)) and w_i = n_i (method (d)) within the squared expressions. Doing so for method (c) gives the pseudoobservation y_i = (√n_i/s_i) l̄_i with expectation E(y_i) = (√n_i/s_i) μ(t_i) and variance asymptotically equal to unity (as s_i → σ_i). Assuming normality and independence, the asymptotic likelihood of these y_i is described by Equation (1). Similarly for method (d), the pseudoobservation becomes y_i = (√n_i) l̄_i with expectation E(y_i) = (√n_i) μ(t_i) and variance σ². Again assuming normality and inde-





