
μ₀ is close to μ and τ₀² is small. We show later the practical implications of choosing an estimator for which the data either do (σ̂²_(2)) or do not (σ̂²_(1)) eventually dominate the prior.



An alternative approach that does not involve conditioning on μ is a fully Bayesian analysis where both μ and σ² are random variables. This approach requires the specification of a joint prior on μ and σ². One such prior is the indifference or "reference" prior (Jeffreys, 1961)

p(μ, σ²) ∝ 1/σ².



The resulting marginal posterior mean of σ² is the best Bayesian estimator of this parameter and is given by

σ̂²_(3) = (1/(n − 3)) Σ_{i=1}^{n} (X_i − X̄)².    (3)



To investigate the difference between the MCA estimate in Equation 1 and the ad hoc "Bayes" approach in Equation 2, we performed some simple simulations. For each of nine combinations of μ₀ and τ₀, a random sample of size n = 1000 was drawn from a N(μ = 50, σ² = 100) distribution. In each case, σ̂²_(1), σ̂²_(2), σ̂²_(3), and σ̂²_(4) were determined. The results are shown in Table 1.
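As a rough illustration, a short Python sketch of one cell of such a simulation is given below. The exact forms of Equations 1 and 2 appear earlier in the paper; the sketch simply assumes that the MCA estimator averages the conditional MLE of σ² over draws of μ from its N(μ₀, τ₀²) prior and that the ad hoc "Bayes" estimator does the same over draws from an approximate, plug-in posterior for μ. The values of μ₀, τ₀, and the number of Monte Carlo draws are illustrative only, not those used for Table 1.

import numpy as np

rng = np.random.default_rng(0)
n, mu_true, sigma2_true = 1000, 50.0, 100.0
x = rng.normal(mu_true, np.sqrt(sigma2_true), size=n)

mu0, tau0 = 40.0, 5.0      # one (mu0, tau0) combination (illustrative values)
n_draws = 10_000

def cond_mle(mu):
    # Conditional MLE of sigma^2 for a fixed mu: mean of (x_i - mu)^2.
    return np.mean((x - mu) ** 2)

# sigma2_(1), MCA: average the conditional MLE over prior draws of mu.
mu_prior = rng.normal(mu0, tau0, size=n_draws)
sigma2_1 = np.mean([cond_mle(m) for m in mu_prior])

# sigma2_(2), ad hoc "Bayes": average over posterior draws of mu.
# Assumption of this sketch: plug the sample variance into the conjugate
# normal posterior for mu.
s2 = np.var(x, ddof=1)
post_prec = 1.0 / tau0**2 + n / s2
post_mean = (mu0 / tau0**2 + n * x.mean() / s2) / post_prec
mu_post = rng.normal(post_mean, np.sqrt(1.0 / post_prec), size=n_draws)
sigma2_2 = np.mean([cond_mle(m) for m in mu_post])

# sigma2_(3): marginal posterior mean under the reference prior (Equation 3).
sigma2_3 = np.sum((x - x.mean()) ** 2) / (n - 3)

print(sigma2_1, sigma2_2, sigma2_3)

When μ₀ is far from μ or τ₀ is large, σ̂²_(1) is inflated by the prior spread in μ, whereas σ̂²_(2) and σ̂²_(3) stay close to the true value; this is consistent with the qualitative pattern described next.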



The results show very poor performance for σ̂²_(1), and good, similar performances for σ̂²_(2), σ̂²_(3), and σ̂²_(4). In this simple case, the analyst would presumably never choose MCA or the ad hoc "Bayes" method over optimal estimators, such as σ̂²_(3) and σ̂²_(4). The key point of this example is that if conditioning on nuisance parameters is to be used, the strategy presented in Equation 2 appears to be preferable to the MCA estimate in Equation 1. As mentioned earlier, the use of Equation 2 has a severe limitation of its own; we therefore do not regard it as a viable alternative approach.



There is considerable literature on the role of conditioning in inference. Reid (1995) has presented a review of recent developments.



Confidence interval estimation 



The example in the section "A simple point estimation example" illustrates poor performance for MCA with regard to point estimation. Similar problems occur when constructing confidence intervals.



Consider the following example: let X_i ~ U(γ − θγ, γ + θγ), for i = 1, …, 100, denote a random sample from a uniform distribution with bounds specified by the given functions of θ and γ. Let γ be the nuisance parameter. The unconditional MLEs are γ̂ = (max X_i + min X_i)/2 and θ̂ = (max X_i − min X_i)/(max X_i + min X_i). The conditional MLE of θ given γ is θ̂_γ = (max X_i − min X_i)/(2γ).
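Written out as code (Python is used here purely for concreteness), these estimators are direct transcriptions of the formulas above; they are not code from the original analysis.

import numpy as np

def unconditional_mles(x):
    # Unconditional MLEs of gamma and theta for X_i ~ U(gamma - theta*gamma, gamma + theta*gamma).
    lo, hi = np.min(x), np.max(x)
    gamma_hat = (hi + lo) / 2.0
    theta_hat = (hi - lo) / (hi + lo)
    return gamma_hat, theta_hat

def conditional_mle_theta(x, gamma):
    # Conditional MLE of theta when gamma is held fixed.
    return (np.max(x) - np.min(x)) / (2.0 * gamma)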



Suppose the nuisance parameter, γ, has a U(a, b) prior, 0 < a < b. MCA would proceed as follows:



1. Sample γ′ ~ U(a, b).

2. Use the parametric bootstrap (e.g. Efron and Gong, 1983) to sample X_i′ ~ U(γ′ − θ̂_γ′γ′, γ′ + θ̂_γ′γ′), i = 1, …, 100.

3. Find the conditional MLE, θ̂_γ′, using the bootstrap data, and conditioning on the current γ′.

4. Store θ̂_γ′ and go to step 1. Use the collection of θ̂_γ′'s to obtain a confidence interval with the quantile method. (These steps are sketched in code below.)
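The following Python sketch strings steps 1-4 together for this example. The value of θ used to generate each bootstrap sample (taken here to be the conditional MLE from the observed data given the sampled γ′), the number of bootstrap replicates, and the interval level are assumptions of the sketch rather than choices stated in the text.

import numpy as np

rng = np.random.default_rng(1)

def conditional_mle_theta(x, gamma):
    return (np.max(x) - np.min(x)) / (2.0 * gamma)

def mca_interval(x, a, b, n_rep=2000, level=0.90):
    draws = []
    for _ in range(n_rep):
        gamma_p = rng.uniform(a, b)                    # step 1: gamma' ~ U(a, b)
        theta_p = conditional_mle_theta(x, gamma_p)    # generating value for the bootstrap (assumed)
        x_boot = rng.uniform(gamma_p - theta_p * gamma_p,
                             gamma_p + theta_p * gamma_p,
                             size=len(x))              # step 2: parametric bootstrap sample
        draws.append(conditional_mle_theta(x_boot, gamma_p))  # step 3: conditional MLE given gamma'
    # Step 4: quantile method on the collected conditional MLEs.
    return tuple(np.quantile(draws, [(1 - level) / 2, (1 + level) / 2]))

# Illustrative data with gamma = 2, theta = 0.5, and an illustrative U(1, 3) prior on gamma:
x = rng.uniform(2.0 - 0.5 * 2.0, 2.0 + 0.5 * 2.0, size=100)
print(mca_interval(x, a=1.0, b=3.0))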



The ad hoc "Bayes" method, which relies on sampling from the posterior, proceeds with the same steps except that step 1 is replaced by

1. Sample γ′ from its posterior distribution.



Here, if we think of the likelihood as a function of γ only, then the posterior for γ is proportional to

∏_{i=1}^{100} [1/(2θ̂γ)] I(γ − θ̂γ < X_i < γ + θ̂γ) · I(a < γ < b),



where I is an indicator function given by I(x) = 1 if x is true and I(x) = 0 if x is false.
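For the modified step 1, the displayed posterior can be transcribed directly. In the Python sketch below, theta_hat stands for whatever fixed value of θ is plugged in when the likelihood is treated as a function of γ only; passing it as an explicit argument is a convenience of the sketch.

import numpy as np

def gamma_posterior_unnorm(gamma, x, theta_hat, a, b):
    # Unnormalized posterior density of gamma: product of uniform likelihood
    # terms (theta fixed at theta_hat) times the U(a, b) prior indicator.
    half_width = theta_hat * gamma
    inside_all = np.all((gamma - half_width < x) & (x < gamma + half_width))  # product of indicators
    in_prior = (a < gamma < b)
    if not (inside_all and in_prior):
        return 0.0
    return (1.0 / (2.0 * half_width)) ** len(x)

Draws of γ′ for the modified step 1 can then be obtained by, for example, grid or rejection sampling against this unnormalized density.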



