2. Correct statistical and modeling techniques should be used to develop estimates of 

 the parameters of the basic components. 



3. Where possible, the estimated parameters and model forms of these components should 

 be validated with a data source independent from that first used to estimate the parameters. 



4. Like its components, the validation of the simulator should also be made on a data 

 set independent of those used to derive the simulator. This is necessary only if the simulator 

 is to be applied to studies other than those used in its development (which is the case most 



of the time in forestry applications). This or these independent data set(s) should also have 

 a form compatible with the underlying structure of the simulator and should cover as wide a 

 range of conditions as possible. 



5. Validation should occur on the basic attributes that the simulator produces. For 

 example, if the simulator is an individual tree/distance dependent one, then validation should 

 occur on the tree level attributes rather than on stand level attributes. Aggregation tends 

 to cover up inconsistencies. 



6. Validation should occur over a long enough time span to allow valid tests (or comparisons) 

 of long- as well as short-term responses and of subtle changes that take a long time to 

 manifest themselves. 



7. Where possible, some type of "statistical" test should be used to aid in quantifying 

 validity. 



Applying the Rules 



The first three rules are normally applied during the model building process, while the 

 last four rules are applicable to the final simulator. To incorporate the first two rules in 

 this study, the following steps were taken in the model building process: (1) an exhaustive 

 examination of the literature to identify major components of the simulator, the factors 

 influencing these components, and applicable model forms; (2) an insistence that the various 

 equations used to model these components meet the expected model forms and behave reasonably 

 across the expected input data range; (3) identification of the correct statistical tools to 

 model the components; and (4) where possible, a test of the assumptions of both the models and 

 statistical tools. 



I decided, however, to develop a validation process that combined rule 3 with the last 

 four rules so I could evaluate each component by how it influences the prediction of future 

 diameter distributions, and to save time and conserve monetary resources. 

 To combine rule 3 with the last four rules, it was necessary to make four different validation 

 runs. The first run used actual values for all components except upgrowth; that is, only 

 upgrowth was predicted. The second run used predicted upgrowth and mortality. The third run 

 used predicted upgrowth, mortality, and conversion. The final run predicted all components 

 (upgrowth, mortality, ingrowth, and conversion). In this way, the effect of adding each 

 predicted component to the simulator could be evaluated by how the simulator predicts the 

 basic attribute of interest, the future diameter distributions (rule 5). 
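The four-run design described above can be sketched in code. This is a hypothetical illustration of the run structure only, not the original simulator; the component names follow the text, and the order in which components switch from actual to predicted values (upgrowth, then mortality, then conversion, then ingrowth) matches the runs as described.

```python
# Hypothetical sketch of the four validation runs: each run predicts one
# more component, and any component not yet predicted is supplied from the
# actual (observed) data, per the design described in the text.
COMPONENTS = ["upgrowth", "mortality", "conversion", "ingrowth"]

# Run 1 predicts only upgrowth; run 2 adds mortality; run 3 adds
# conversion; run 4 predicts all four components.
runs = [COMPONENTS[:k] for k in range(1, len(COMPONENTS) + 1)]

for i, predicted in enumerate(runs, start=1):
    actual = [c for c in COMPONENTS if c not in predicted]
    print(f"Run {i}: predict {predicted}; use actual values for {actual}")
```

Comparing each run's predicted diameter distributions against the next isolates the error contributed by each newly predicted component.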



Because two blackjack pine equations were developed (one with the even-aged data and one 

 without), both equations were tested on the first set of validation runs. The equation that 

 performed best was the final blackjack pine growth equation. A similar approach was used on 

 the second set of runs to evaluate various mortality models. 



Validation runs were made on the data reserved for that purpose (rule 4). Additional 

 work was needed to make diameter distribution predictions past 1940. The difference between 

 the pre- and post-1940 periods was the number of diameter classes that could be compared (for 

 pre-1940, all diameter classes 4 inches and greater could be compared while, after 1940, only 

 diameter classes 8 inches and greater could be compared). 
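The class-threshold rule described above can be expressed as a small filter. This is an illustrative sketch, not code from the study; the function name and the treatment of the year 1940 itself as the boundary are assumptions.

```python
# Hypothetical illustration of restricting which diameter classes can be
# compared: all classes 4 inches and greater before 1940, but only classes
# 8 inches and greater afterward (as stated in the text).
def comparable_classes(diameter_classes, year):
    threshold = 4 if year < 1940 else 8  # boundary-year handling assumed
    return [d for d in diameter_classes if d >= threshold]

classes = [2, 4, 6, 8, 10, 12]
print(comparable_classes(classes, 1935))  # [4, 6, 8, 10, 12]
print(comparable_classes(classes, 1945))  # [8, 10, 12]
```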



The choice of what statistics and tests were appropriate for comparing predicted to the 

 actual diameter distributions (rule 7) proved to be difficult. After reviewing or trying a 






