examined in section 2. Comments on a few 
iterative techniques and some experimental work 
conclude the paper. 
1. A representation of the joint distribution. Let X denote the set of all points x = (x_1, x_2, ..., x_N) with each x_i = 0 or 1. Since there are 2^N points in X, any parametric description of an arbitrary probability distribution will, in general, require 2^N - 1 independent parameters. The particular parametric representation considered here is due to Bahadur.
Using E_p to denote the expected value when the underlying distribution is p(x), define for each i = 1, 2, ..., N,

    m_i = P{x_i = 1} = E_p(x_i),   0 < m_i < 1;
    z_i = (x_i - m_i) / sqrt(m_i (1 - m_i));
    r_ij = E_p(z_i z_j),   i < j;
    r_ijk = E_p(z_i z_j z_k),   i < j < k;
    ...
    r_12...N = E_p(z_1 z_2 ... z_N).
Further define

    p_1(x_1, x_2, ..., x_N) = prod_{i=1}^{N} m_i^{x_i} (1 - m_i)^{1 - x_i},

so that p_1(x_1, x_2, ..., x_N) denotes the joint probability distribution of the x_i when (1) the x_i are independently distributed and (2) they have the same marginal distributions as under the distribution p(x). It is shown by Bahadur that for every x = (x_1, x_2, ..., x_N) in X,

    p(x) = p_1(x) f(x),

where

    f(x) = 1 + sum_{i<j} r_ij z_i z_j + sum_{i<j<k} r_ijk z_i z_j z_k
             + ... + r_12...N z_1 z_2 ... z_N.
The 2^N - N - 1 correlations and the N marginal frequencies m_i are the parameters which determine the probability distribution p(x). In order that an arbitrary set of 2^N - N - 1 real numbers r serve as the correlation parameters of a probability distribution p(x) for any set of numbers m_i with 0 < m_i < 1, it is necessary and sufficient that f(x) be non-negative for each x.
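As a concrete check, the representation above can be verified numerically for a small N. The following sketch (the random test distribution and all variable names beyond m, z, r, f are illustrative, not from the paper) builds an arbitrary strictly positive distribution on {0,1}^3, computes the marginals m_i, the standardized variables z_i, and the correlations r_S, and confirms that p(x) = p_1(x) f(x) holds at every point:

```python
import itertools
import math
import random

# Numerical check of Bahadur's representation for N = 3.
N = 3
points = list(itertools.product([0, 1], repeat=N))

# An arbitrary strictly positive distribution p(x) on the 2^N points.
random.seed(0)
raw = [random.random() for _ in points]
total = sum(raw)
p = {x: w / total for x, w in zip(points, raw)}

# Marginal frequencies m_i = P{x_i = 1}.
m = [sum(p[x] for x in points if x[i] == 1) for i in range(N)]

# Standardized variables z_i(x) = (x_i - m_i)/sqrt(m_i(1 - m_i)).
def z(i, x):
    return (x[i] - m[i]) / math.sqrt(m[i] * (1 - m[i]))

# Correlation parameters r_S = E_p(prod of z_i for i in S), |S| >= 2.
def r(S):
    return sum(p[x] * math.prod(z(i, x) for i in S) for x in points)

# Independence distribution p_1 and the correction factor f.
def p1(x):
    return math.prod(m[i] ** x[i] * (1 - m[i]) ** (1 - x[i]) for i in range(N))

subsets = [S for k in range(2, N + 1) for S in itertools.combinations(range(N), k)]

def f(x):
    return 1 + sum(r(S) * math.prod(z(i, x) for i in S) for S in subsets)

# The representation p(x) = p_1(x) f(x) holds exactly (up to rounding).
for x in points:
    assert abs(p[x] - p1(x) * f(x)) < 1e-12
```

The identity is exact because the functions z_S form an orthonormal system under p_1, so the expansion of p/p_1 in that system has precisely the coefficients r_S.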
The distribution p(x) can now be approximated by distributions of lower order. Thus p_1(x) is a first order approximation to p(x),

    p_2(x) = p_1(x) [1 + sum_{i<j} r_ij z_i z_j]

is a second order approximation to p(x), and so on. For 1 <= m <= N, the approximation p_m(x) has the interesting and useful property that it is the only distribution of order not exceeding m under which any set x_{j1}, x_{j2}, ..., x_{jm} of m variables has the same joint distribution as under the given p(x). Of course, approximations to p(x) may also be obtained by retaining various selected terms in the expansion for f(x) and dropping the remaining terms. Because any approximation to p(x) is obtained by dropping terms of f(x), a classification procedure based on it will not do as well as the same procedure when p(x) is used.
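The defining property of the second order approximation can likewise be checked numerically. In the sketch below (an illustration under an assumed random distribution, not a computation from the paper), p_2 is built from the marginals and the pairwise correlations alone, and is seen to reproduce every bivariate joint distribution of the original p:

```python
import itertools
import math
import random

# The second order approximation p_2(x) = p_1(x)[1 + sum_{i<j} r_ij z_i z_j]
# reproduces every pairwise joint distribution of the given p(x); N = 3 here.
N = 3
points = list(itertools.product([0, 1], repeat=N))
random.seed(1)
raw = [random.random() for _ in points]
p = {x: w / sum(raw) for x, w in zip(points, raw)}

m = [sum(p[x] for x in points if x[i] == 1) for i in range(N)]

def z(i, x):
    return (x[i] - m[i]) / math.sqrt(m[i] * (1 - m[i]))

def p1(x):
    return math.prod(m[i] ** x[i] * (1 - m[i]) ** (1 - x[i]) for i in range(N))

# Pairwise correlations r_ij only -- no higher-order terms are used.
pairs = list(itertools.combinations(range(N), 2))
r = {(i, j): sum(p[x] * z(i, x) * z(j, x) for x in points) for i, j in pairs}

def p2(x):
    return p1(x) * (1 + sum(r[i, j] * z(i, x) * z(j, x) for i, j in pairs))

# Any pair (i, j) has the same joint distribution under p_2 as under p.
for i, j in pairs:
    for a, b in itertools.product([0, 1], repeat=2):
        pm = sum(p[x] for x in points if x[i] == a and x[j] == b)
        p2m = sum(p2(x) for x in points if x[i] == a and x[j] == b)
        assert abs(pm - p2m) < 1e-12
```

This is the uniqueness property stated above in action: a bivariate binary distribution is fully determined by its two marginals and one correlation, and p_2 preserves all three for every pair.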
2. Application to the analysis of some pattern recognition networks. The representation for the joint distribution p(x) can now be used to examine the capabilities of various pattern recognition networks which have been proposed. Typically, these are linear summation networks with thresholds (see Fig. 1), which operate on the weighted outputs of selected groups of "retina" elements, the selection being usually random. Variations on this scheme are reported by Hawkins, from whose article Fig. 2 is taken.
Consider Fig. 1 first. Of the total input x = (x_1, x_2, ..., x_N), each summation unit gets some subset, with the x_i's multiplied by variable weights. Let a_ij denote the weight between retina element i and summation unit j, where a_ij can take on the value 0, and let theta_k be the thresholds for the response units. Then for a given response unit, say No. 1, the operation for producing an output is described in general by

    (a_11 x_1 + a_21 x_2 + ... + a_N1 x_N)
      + (a_12 x_1 + a_22 x_2 + ... + a_N2 x_N)
      + ... >= theta_1,

or

    sum_{i=1}^{N} b_i x_i >= theta_1,   where b_i = sum_j a_ij.
Consequently a weighted sum of the input variables is used to perform classification for the type of network described. Rather than obtain the coefficients (b_1, b_2, ..., b_N) from assumptions concerning the functional forms of the probability distributions or from a program of estimation, interest has centered on starting from an arbitrary initial state and using iteration based on experience, i.e., some learning procedure, to go from the initial state to a desired final state.
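The reduction of Fig. 1 to a single weighted sum can be illustrated with a small sketch (the weights, threshold, and sizes below are hypothetical, chosen only for the demonstration): because every stage ahead of the threshold is linear, the explicit two-stage network and the collapsed form with b_i = sum_j a_ij classify every binary input identically.

```python
import itertools
import random

# Hypothetical instance of the Fig. 1 network: several summation units, each
# taking a weighted sum of retina outputs, feeding one thresholded response unit.
N, UNITS = 5, 3
random.seed(2)
# a[i][j]: weight from retina element i to summation unit j (0 = not connected).
a = [[random.choice([0, random.uniform(-1, 1)]) for _ in range(UNITS)]
     for _ in range(N)]
theta = 0.5

def network_response(x):
    # Explicit two-stage computation: summation units, then a single threshold.
    unit_outputs = [sum(a[i][j] * x[i] for i in range(N)) for j in range(UNITS)]
    return sum(unit_outputs) >= theta

# Collapsed coefficients b_i = sum_j a_ij.
b = [sum(a[i][j] for j in range(UNITS)) for i in range(N)]

def collapsed_response(x):
    # Equivalent single weighted sum of the inputs.
    return sum(b[i] * x[i] for i in range(N)) >= theta

# Both formulations classify every binary input the same way.
for x in itertools.product([0, 1], repeat=N):
    assert network_response(x) == collapsed_response(x)
```

This is why the analysis in the text can treat the whole network as the single decision rule sum_i b_i x_i >= theta_1, regardless of how the retina elements are grouped among summation units.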
The evaluation of the classification 
capabilities of the network in its final state 
is considered here. The problem of using ex- 
perience to go from an initial state to a final 
desired state is commented on in the next section. 