\mathrm{AIC}(n) = (-2)\left[-\frac{N-n}{2}\log\hat{\sigma}^2\right] + 2(n+1) = (N-n)\log\hat{\sigma}^2 + 2(n+1).   (5.256)
Fig. 5.46. AIC(n) vs. n, showing the minimum.
The AIC(n) is plotted against n as in Fig. 5.46, which shows where AIC(n) has
a minimum value. This minimum value is called the MAIC, the minimum AIC, and the
method for obtaining the MAIC is called the MAIC method. Akaike made clear that,
based on information theory, AIC is a measure of the closeness of fit of the
approximation, that is, a measure of the difference between the statistical model
and the true model.
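As a concrete illustration of the MAIC method, the following minimal sketch fits AR(n) models by least squares and evaluates Eq. (5.256) for each order, adopting the order with the smallest AIC. The simulated AR(2) test series, the ceiling n_max, and the least-squares estimator of the residual variance are illustrative assumptions, not part of the text.

    import numpy as np

    def aic_ar(x, n_max):
        # AIC(n) = (N - n) log(sigma_hat^2) + 2(n + 1), Eq. (5.256)
        N = len(x)
        aics = {}
        for n in range(1, n_max + 1):
            # Least-squares fit of x[t] = a_1 x[t-1] + ... + a_n x[t-n] + e[t]
            X = np.column_stack([x[n - k - 1:N - k - 1] for k in range(n)])
            y = x[n:]
            a, *_ = np.linalg.lstsq(X, y, rcond=None)
            sigma2 = np.mean((y - X @ a) ** 2)  # residual-variance estimate
            aics[n] = (N - n) * np.log(sigma2) + 2 * (n + 1)
        return aics

    # Usage: simulate an AR(2) series, then pick the minimum-AIC order (the MAIC).
    rng = np.random.default_rng(0)
    e = rng.standard_normal(500)
    x = np.zeros(500)
    for t in range(2, 500):
        x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + e[t]
    aics = aic_ar(x, n_max=10)
    n_maic = min(aics, key=aics.get)  # order at which AIC(n) is minimum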
So if we express the statistical distribution of the occurrence of the data for the
model as p(x), and the statistical distribution of the true data, derived from the
true structure of the data, as q(x), then to minimize AIC is to minimize

I = \int p(x)\log\frac{p(x)}{q(x)}\,dx,
the so-called Kullback information criterion. The information entropy S(q, p) is
defined as S(q, p) = -I, so to minimize AIC is to maximize the entropy S(q, p).
Accordingly, the minimum AIC criterion is also called the maximum entropy
criterion.
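A small numerical sketch of this quantity follows; the two Gaussian densities and the evaluation grid are purely illustrative assumptions. It shows that I is non-negative and vanishes only when p = q, so the entropy S(q, p) = -I is maximized exactly when the model distribution matches the true one.

    import numpy as np

    def kullback_information(p, q, dx):
        # Numerical I = integral of p(x) log[p(x)/q(x)] dx on a grid of spacing dx
        return np.sum(p * np.log(p / q)) * dx

    t = np.linspace(-10.0, 10.0, 4001)
    dx = t[1] - t[0]
    gauss = lambda t, mu, s: np.exp(-(t - mu) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

    p = gauss(t, 0.0, 1.0)  # model distribution p(x) (hypothetical)
    q = gauss(t, 0.5, 1.2)  # "true" distribution q(x) (hypothetical)

    I = kullback_information(p, q, dx)  # about 0.116 for these parameters; I >= 0
    S = -I                              # entropy S(q, p) <= 0, equal to 0 when p = q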
FPE was introduced to find the order of the AR model. AIC, however, is a very
general criterion and can be used for ARMA or MA(m) models as well. For the
ARMA(n, m) model, we can apply the criterion by setting

\mathrm{AIC}(n, m) = (N-m-n)\log\hat{\sigma}^2 + 2(n+m+1).   (5.257)

Plotting AIC(n, m) over an appropriate grid of n and m, we can adopt the pair
(n, m) that minimizes AIC(n, m).
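A sketch of this grid search follows. The text prescribes no estimation routine, so the ARMA(n, m) fit is delegated here to statsmodels' ARIMA model (an assumption), and sigma_hat^2 is taken as the mean squared residual.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    def maic_arma(x, n_max=4, m_max=4):
        # AIC(n, m) = (N - m - n) log(sigma_hat^2) + 2(n + m + 1), Eq. (5.257)
        N = len(x)
        best_pair, best_aic = None, np.inf
        for n in range(1, n_max + 1):
            for m in range(0, m_max + 1):
                res = ARIMA(x, order=(n, 0, m)).fit()  # ML fit of ARMA(n, m)
                sigma2 = np.mean(res.resid ** 2)       # residual-variance estimate
                aic = (N - m - n) * np.log(sigma2) + 2 * (n + m + 1)
                if aic < best_aic:
                    best_pair, best_aic = (n, m), aic
        return best_pair, best_aic

    # Usage: (n, m), value = maic_arma(x)  # adopt the (n, m) with minimum AIC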
