\[
\begin{aligned}
\varepsilon_{n+1} &= (X_{n+1}-\mu) + a_1(X_n-\mu) + a_2(X_{n-1}-\mu) + \cdots + a_n(X_1-\mu) \\
\varepsilon_{n+2} &= (X_{n+2}-\mu) + a_1(X_{n+1}-\mu) + a_2(X_n-\mu) + \cdots + a_n(X_2-\mu) \\
&\;\;\vdots \\
\varepsilon_N &= (X_N-\mu) + a_1(X_{N-1}-\mu) + a_2(X_{N-2}-\mu) + \cdots + a_n(X_{N-n}-\mu).
\end{aligned}
\qquad (5.230)
\]
From Eq. 5.230, each $\varepsilon_t$ depends on $X_t$ with coefficient 1 and otherwise only on earlier observations, so the transformation is triangular with a unit diagonal; the Jacobian of the transformation from $p(\varepsilon_{n+1}, \varepsilon_{n+2}, \ldots, \varepsilon_N)$ to $p(X_{n+1}, X_{n+2}, \ldots, X_N)$ is therefore
\[
J = \frac{\partial(\varepsilon_{n+1}, \varepsilon_{n+2}, \ldots, \varepsilon_N)}{\partial(X_{n+1}, X_{n+2}, \ldots, X_N)} = 1. \qquad (5.231)
\]
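The unit-determinant claim in Eq. 5.231 can be checked numerically by building the Jacobian matrix explicitly for a small case. A minimal sketch, where the order $n$, sample size $N$, and coefficient values are illustrative choices of mine, not from the text:

```python
import numpy as np

# Illustrative AR(2) case: n = 2, N = 6, arbitrary coefficients a_1, a_2.
n, N = 2, 6
a = np.array([0.5, -0.3])

# Jacobian d(eps_{n+1}..eps_N) / d(X_{n+1}..X_N).
# Row i is eps_{n+1+i}: the partial w.r.t. X_{n+1+i} is 1, and the
# partial w.r.t. X_{n+1+i-k} is a_k (only when that X lies in the block).
m = N - n
J = np.eye(m)
for i in range(m):
    for k in range(1, n + 1):
        if i - k >= 0:
            J[i, i - k] = a[k - 1]

# Lower triangular with unit diagonal, so the determinant is 1.
print(np.linalg.det(J))
```

The partials with respect to the fixed initial observations $X_1, \ldots, X_n$ never enter, which is exactly why the conditional argument below is needed.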
Therefore, from Eq. 5.229,
\[
p(X_{n+1}, X_{n+2}, \ldots, X_N) = \left(\frac{1}{\sqrt{2\pi}\,\sigma_\varepsilon}\right)^{N-n}
\exp\!\left[-\frac{1}{2\sigma_\varepsilon^2} \sum_{t=n+1}^{N}
\left\{(X_t-\mu) + a_1(X_{t-1}-\mu) + \cdots + a_n(X_{t-n}-\mu)\right\}^2\right]. \qquad (5.232)
\]
This is actually the conditional joint distribution, given that the initial observations
$X_1, \ldots, X_n$ remain fixed at their realized values $x_1, \ldots, x_n$. Here $n$ is assumed to be
small compared with the number of observations $N$. We can then use this
$p(X_{n+1}, \ldots, X_N)$ as an adequate approximation to the likelihood function of
$\mu, a_1, a_2, \ldots, a_n$, given the observations $X_1, X_2, \ldots, X_N$. The log-likelihood
function $L$ is then
\[
\begin{aligned}
L(\mu, a_1, a_2, \ldots, a_n) &= \log p(X_{n+1}, X_{n+2}, \ldots, X_N) \\
&= -\frac{N-n}{2}\log(2\pi\sigma_\varepsilon^2)
- \frac{1}{2\sigma_\varepsilon^2}\sum_{t=n+1}^{N}
\left\{(X_t-\mu) + a_1(X_{t-1}-\mu) + \cdots + a_n(X_{t-n}-\mu)\right\}^2.
\end{aligned}
\qquad (5.233)
\]
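The conditional log-likelihood of Eq. 5.233 can be evaluated directly from the data. A minimal sketch, assuming the sign convention of Eq. 5.230 (all function and variable names here are my own):

```python
import numpy as np

def ar_conditional_loglik(x, mu, a, sigma):
    """Conditional log-likelihood of Eq. 5.233 for an AR(n) series.

    x     : observations X_1..X_N (1-D array)
    mu    : process mean
    a     : AR coefficients a_1..a_n (sign convention of Eq. 5.230)
    sigma : innovation standard deviation sigma_eps
    """
    x = np.asarray(x, dtype=float)
    n, N = len(a), len(x)
    d = x - mu
    # Residuals eps_t = (X_t - mu) + sum_k a_k (X_{t-k} - mu), t = n+1..N.
    eps = d[n:] + sum(a[k] * d[n - 1 - k:N - 1 - k] for k in range(n))
    return (-(N - n) / 2 * np.log(2 * np.pi * sigma**2)
            - np.sum(eps**2) / (2 * sigma**2))
```

Note that the sum runs over only $N-n$ terms, which is why the normalizing constant carries the factor $(N-n)/2$ rather than $N/2$.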
Therefore the maximum likelihood estimates of $\mu, a_1, \ldots, a_n$ are obtained by maximizing $L$, i.e., by
minimizing
\[
Q(\mu, a_1, a_2, \ldots, a_n) = \sum_{t=n+1}^{N}
\left\{(X_t-\mu) + a_1(X_{t-1}-\mu) + \cdots + a_n(X_{t-n}-\mu)\right\}^2. \qquad (5.234)
\]
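Since each term of $Q$ in Eq. 5.234 is linear in the coefficients $a_1, \ldots, a_n$, minimizing $Q$ reduces to an ordinary least-squares problem once $\mu$ is fixed. A sketch assuming the common simplification of replacing $\mu$ by the sample mean (names are my own; the exact joint minimization over $\mu$ differs slightly):

```python
import numpy as np

def ar_fit_cls(x, n):
    """Estimate a_1..a_n by minimizing Q of Eq. 5.234 (conditional least squares).

    mu is approximated by the sample mean; the a_k then solve an
    ordinary least-squares problem, since each residual in Q is linear in them.
    """
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    d = x - mu
    N = len(x)
    # Design matrix: column k holds (X_{t-1-k} - mu) for t = n+1..N.
    A = np.column_stack([d[n - 1 - k:N - 1 - k] for k in range(n)])
    # Q = sum ((X_t - mu) + A @ a)^2  ->  least-squares solve A a = -(X_t - mu).
    a, *_ = np.linalg.lstsq(A, -d[n:], rcond=None)
    return mu, a
```

With the sign convention of Eq. 5.230, a series generated as $X_t - \mu = \phi(X_{t-1} - \mu) + \varepsilon_t$ yields $a_1 \approx -\phi$.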