5.4.4 Estimation of Parameters $a_1, \ldots, a_n$ and $\mu$
In this section, we assume that the order $n$ of the autoregressive process AR($n$) is
already known. Determination of the order $n$ is discussed in the next section. Here the
statistical considerations for the estimation are treated in some detail, compared with the
case for AR(2) in Section 5.2.3. As the most general type of AR($n$) process, we assume
the nonzero-mean process
$$(X_t - \mu) + a_1(X_{t-1} - \mu) + a_2(X_{t-2} - \mu) + \cdots + a_n(X_{t-n} - \mu) = \varepsilon_t. \tag{5.226}$$
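As a concrete illustration of Eq. (5.226), the process can be simulated by solving the equation for $X_t$ at each step. This is a minimal sketch assuming NumPy; the function name `simulate_ar` and the particular parameter values are illustrative, not from the text.

```python
import numpy as np

def simulate_ar(a, mu, sigma_eps, N, seed=0):
    """Simulate the nonzero-mean AR(n) process of Eq. (5.226):
    (X_t - mu) + a_1 (X_{t-1} - mu) + ... + a_n (X_{t-n} - mu) = eps_t,
    solved for X_t at each step.  Illustrative sketch, not the text's code.
    """
    rng = np.random.default_rng(seed)
    n = len(a)
    x = np.full(N + n, float(mu))   # start the first n values at the mean
    for t in range(n, N + n):
        # deviation term a_1 (X_{t-1} - mu) + ... + a_n (X_{t-n} - mu)
        dev = sum(a[i] * (x[t - 1 - i] - mu) for i in range(n))
        x[t] = mu - dev + rng.normal(0.0, sigma_eps)
    return x[n:]
```

For stationary coefficient choices, a long simulated path fluctuates around the mean $\mu$, as the model requires.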
By analogy with the multivariate regression of $X_t$ on $X_{t-1}, \ldots, X_{t-n}$, with regression
coefficients $a_1, a_2, \ldots, a_n$ and residual error $\varepsilon_t$, we can use the least squares method to
estimate $a_1, \ldots, a_n$, $\mu$, and $\varepsilon_t$, that is, to minimize
$$Q(\mu, a_1, \ldots, a_n) = \sum_{t=n+1}^{N} \varepsilon_t^2 = \sum_{t=n+1}^{N} \left\{ (X_t - \mu) + a_1(X_{t-1} - \mu) + \cdots + a_n(X_{t-n} - \mu) \right\}^2. \tag{5.227}$$
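The sum of squares $Q$ of Eq. (5.227) and its least squares minimization can be sketched as follows, assuming NumPy. The helper names `Q` and `fit_ar_ls` are illustrative; the fit exploits the fact that the minimization is an ordinary regression of $X_t$ on $1, X_{t-1}, \ldots, X_{t-n}$, with $b_i = -a_i$ and intercept $c = \mu(1 + a_1 + \cdots + a_n)$.

```python
import numpy as np

def Q(x, mu, a):
    """Sum of squared residuals Q(mu, a_1, ..., a_n) of Eq. (5.227)."""
    n = len(a)
    d = x - mu                       # deviations X_t - mu
    # residual at t: (X_t - mu) + a_1 (X_{t-1} - mu) + ... + a_n (X_{t-n} - mu)
    resid = d[n:] + sum(a[i] * d[n - 1 - i: len(x) - 1 - i] for i in range(n))
    return float(np.sum(resid ** 2))

def fit_ar_ls(x, n):
    """Least squares estimates of a_1, ..., a_n and mu, obtained by
    regressing X_t on 1, X_{t-1}, ..., X_{t-n} (illustrative sketch).
    Regression coefficients satisfy b_i = -a_i and
    intercept c = mu * (1 + a_1 + ... + a_n)."""
    N = len(x)
    y = x[n:]
    cols = [np.ones(N - n)] + [x[n - 1 - i: N - 1 - i] for i in range(n)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    c, b = coef[0], coef[1:]
    a = -b
    mu = c / (1.0 + a.sum())
    return a, mu
```

Because the fit minimizes exactly the sum in Eq. (5.227), $Q$ evaluated at the estimates is no larger than $Q$ evaluated at the true parameters.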
The reason for the lower limit $n+1$ is that we do not have the observations $X_0,
X_{-1}, \ldots, X_{-(n-1)}$ from which to derive $\varepsilon_1, \ldots, \varepsilon_n$. $N$ is the total number of observations
of $X_t$. The residual errors $\varepsilon_t$ are assumed to be Gaussian and independent of each other.
Accordingly, the joint probability distribution of $\varepsilon_{n+1}, \ldots, \varepsilon_N$ is
$$p(\varepsilon_{n+1}, \varepsilon_{n+2}, \ldots, \varepsilon_N) = \prod_{t=n+1}^{N} \frac{1}{\sqrt{2\pi\sigma_\varepsilon^2}} \exp\left( -\frac{\varepsilon_t^2}{2\sigma_\varepsilon^2} \right)$$
$$= \left( \frac{1}{2\pi\sigma_\varepsilon^2} \right)^{(N-n)/2} \exp\left( -\frac{1}{2\sigma_\varepsilon^2} \sum_{t=n+1}^{N} \varepsilon_t^2 \right) \tag{5.228}$$
$$= \left( \frac{1}{2\pi\sigma_\varepsilon^2} \right)^{(N-n)/2} \exp\left( -\frac{1}{2\sigma_\varepsilon^2} Q(\mu, a_1, a_2, \ldots, a_n) \right). \tag{5.229}$$
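Taking the logarithm of the density in Eqs. (5.228)-(5.229) gives the conditional Gaussian log-likelihood $-\frac{N-n}{2}\log(2\pi\sigma_\varepsilon^2) - Q/(2\sigma_\varepsilon^2)$, which is maximized over $\sigma_\varepsilon^2$ at $\hat{\sigma}_\varepsilon^2 = Q/(N-n)$. A minimal sketch assuming NumPy; the function name `log_likelihood` is illustrative.

```python
import numpy as np

def log_likelihood(x, mu, a, sigma2):
    """Conditional Gaussian log-likelihood from Eqs. (5.228)-(5.229):
    log p = -((N - n)/2) log(2 pi sigma2) - Q(mu, a_1, ..., a_n) / (2 sigma2).
    Illustrative sketch, not the text's code."""
    n, N = len(a), len(x)
    d = x - mu
    # residuals eps_t of Eq. (5.226) for t = n+1, ..., N
    resid = d[n:] + sum(a[i] * d[n - 1 - i: N - 1 - i] for i in range(n))
    q = np.sum(resid ** 2)
    return -0.5 * (N - n) * np.log(2 * np.pi * sigma2) - q / (2 * sigma2)
```

Evaluating this at $\sigma^2 = Q/(N-n)$ gives a larger value than at any other choice of $\sigma^2$, consistent with the maximum likelihood estimate of the residual variance.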
Here the variables are transformed from $(\varepsilon_{n+1}, \varepsilon_{n+2}, \ldots, \varepsilon_N)$ to $(X_{n+1}, X_{n+2}, \ldots, X_N)$
using
