CHAPTER 8 
CONCLUSION FOR PART II
In Part II, we have discussed the model fitting, or parametric, analysis method for a random process. Comparing this method with the nonparametric analysis methods discussed in Part I, we can draw the following conclusions:
1. The examples in Sections 5.2.1 to 5.2.5 show that an autoregressive (AR), moving average (MA), or autoregressive moving average (ARMA) model of rather low order can represent quite complicated-looking processes with various frequency characteristics.
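To make this concrete, here is a minimal Python sketch, not taken from the text itself; the helper name simulate_ar2 and the particular coefficients are illustrative assumptions. With complex characteristic roots, even this two-parameter model produces a quasi-oscillatory process with a pronounced spectral peak:

```python
import random

def simulate_ar2(a1, a2, n=2000, seed=0):
    """Simulate x[t] = a1*x[t-1] + a2*x[t-2] + e[t] with unit-variance Gaussian noise."""
    rng = random.Random(seed)
    x = [0.0, 0.0]  # zero initial conditions
    for _ in range(n):
        x.append(a1 * x[-1] + a2 * x[-2] + rng.gauss(0.0, 1.0))
    return x[2:]  # drop the startup values

# a1 = 1.0, a2 = -0.5 gives complex characteristic roots,
# i.e. a quasi-oscillatory realization from only two parameters.
x = simulate_ar2(1.0, -0.5)
```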
2. For a given process, we can generally fit an AR, MA, or mixed ARMA model of some finite order. Once the order of the model has been determined, the finite number of parameters of the model and the variance of the residual errors can be estimated from the sample autocovariance function at rather low lag numbers.
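The required sample quantity is easy to compute. A Python sketch of the biased sample autocovariance at low lags (the function name is an illustrative assumption):

```python
def sample_autocovariance(x, max_lag):
    """Biased sample autocovariances c[0..max_lag] (divide by n, not n - k),
    the usual choice for model fitting."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    return [sum(d[t] * d[t + k] for t in range(n - k)) / n
            for k in range(max_lag + 1)]
```

Only the first few lags enter the fit of a low-order model, which is the point made above.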
3. The characteristics of a discrete-parameter model are closely related to those of its correlation function. To find the finite number of parameters that express the process, the correlation function up to a finite lag is sufficient, as mentioned in item (2). Sample correlations at low lag numbers are more reliable than those at higher lag numbers, and there is no need to be concerned about lag windows as there is in nonparametric analysis (the so-called Blackman-Tukey method), where, ideally, we need the correlation function over lags from −∞ to +∞. This concern about lag windows and consistent estimates resulted in many efforts to find good windows. Avoiding the lag window is also one reason why the model fitting technique gives better results even from rather short records.
4. The order of the AR(k), MA(l), or ARMA(k, l) model that best fits a sample process, from a statistical point of view, can be determined by Akaike's information criterion (AIC) as the combination of orders (k, l) that minimizes the AIC.
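For the AR case this order selection can be sketched in a few lines of Python. The Levinson-Durbin recursion yields the residual variance of every AR(k) fit in one pass, and one common form of the AIC for an AR(k) model is n·ln(σ²_k) + 2k; the function names here are illustrative assumptions:

```python
import math

def levinson_durbin(c, kmax):
    """From autocovariances c[0..kmax], return the residual variances
    sigma2[k] of the Yule-Walker AR(k) fits for k = 0..kmax."""
    sigma2 = [c[0]]
    a = []  # AR coefficients of the current order
    for k in range(1, kmax + 1):
        acc = c[k] - sum(a[j] * c[k - 1 - j] for j in range(k - 1))
        refl = acc / sigma2[-1]  # reflection (partial correlation) coefficient
        a = [a[j] - refl * a[k - 2 - j] for j in range(k - 1)] + [refl]
        sigma2.append(sigma2[-1] * (1.0 - refl * refl))
    return sigma2

def aic_best_order(c, n, kmax):
    """Order k minimizing AIC(k) = n*ln(sigma2_k) + 2k."""
    sigma2 = levinson_durbin(c, kmax)
    return min(range(kmax + 1), key=lambda k: n * math.log(sigma2[k]) + 2 * k)
```

The penalty term 2k is what stops the criterion from always preferring the highest order, since the residual variance alone can only decrease as k grows.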
5. Estimation of the parameters of the AR(k) model is easily performed by solving the Yule-Walker equations, which are linear in the parameters. The Yule-Walker solution is a good approximation to the solutions by the maximum likelihood and least squares methods. In contrast, for the MA and ARMA models, estimation of the parameters is more troublesome and involves solving nonlinear least squares equations.
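For a low order the Yule-Walker system can even be solved by hand. A Python sketch for AR(2), using Cramer's rule on the 2×2 system (the function name is an illustrative assumption):

```python
def yule_walker_ar2(c):
    """Solve the Yule-Walker equations for AR(2),
        c[1] = a1*c[0] + a2*c[1]
        c[2] = a1*c[1] + a2*c[0],
    by Cramer's rule; returns (a1, a2)."""
    c0, c1, c2 = c[0], c[1], c[2]
    det = c0 * c0 - c1 * c1  # nonzero for a nondegenerate process
    a1 = (c1 * c0 - c2 * c1) / det
    a2 = (c2 * c0 - c1 * c1) / det
    return a1, a2
```

For an AR(1)-like autocovariance c[k] = 0.5**k this recovers a1 = 0.5, a2 = 0, as it should; nothing comparably linear is available for the MA and ARMA parameters.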
6. As long as stability and invertibility are fulfilled, the AR and ARMA processes of finite order can be inverted to an MA process of infinite order and, conversely, the MA and ARMA processes of finite order can be inverted to an AR process of infinite order. Usually, however, in this approximation the required order increases from the original MA or ARMA order. The order of the AR model that best approximates the original MA or ARMA process is obtained through the AIC criterion. In this approximation, estimation of the parameters becomes much easier than for the original MA or ARMA process, as mentioned in item (5). Accordingly, AR model fitting is the approach usually used in practice for general processes.
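The inversion is easy to exhibit for the simplest case. For an invertible MA(1), x[t] = e[t] + b*e[t-1] with |b| < 1, expanding 1/(1 + b z) as a power series gives the AR(∞) coefficients; a Python sketch (the function name is an illustrative assumption):

```python
def ma1_as_ar(b, order):
    """First `order` AR coefficients of the invertible MA(1)
    x[t] = e[t] + b*e[t-1], |b| < 1.  From e[t] = sum_j (-b)**j * x[t-j],
    the AR form is x[t] = sum_{j>=1} -(-b)**j * x[t-j] + e[t]."""
    return [-((-b) ** j) for j in range(1, order + 1)]

# The coefficients decay geometrically, so a modest finite AR order
# (chosen by the AIC in practice) already approximates the MA(1) well.
coeffs = ma1_as_ar(0.5, 6)
```

For b = 0.5 the coefficients are 0.5, −0.25, 0.125, …, so truncating at a finite order discards only geometrically small terms; this is the sense in which a finite AR model approximates the MA or ARMA process.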
