


[Figure 6 — panels A and B]



Nevertheless, it is possible to cope with these three kinds of noise — θ, signal, and synapsis — as long as the output of a neuron depends, in some fashion, on its input, by using an anastomotic net to yield an error-free capacity of computation. This is completely impossible with neurons having only two inputs each. The best we can do is to decrease the probability of error. Consider, for example, a net like that of Figure 6 to compute [X], where each neuron is supposed to have θ = 3, but in each the threshold drops independently to 2 with a frequency p. As long as p is less than 0.5, the net improves rapidly as the product of the p's of successive ranks decreases. The trick here is to segregate the errors.
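The claim about successive ranks can be illustrated with a small Monte-Carlo sketch. The topology below is only an assumed reading of Figure 6 — three θ = 3 neurons per intermediate rank, each seeing every output of the rank before, with a single neuron as the final rank — and the names (noisy_neuron, cascade, error_rate, trials) are illustrative, not from the text. Only the threshold noise is modelled: each neuron's θ drops from 3 to 2 with probability p, so on the worst-case input, with exactly two of the three lines active, any firing of the output is an error.

```python
import random

def noisy_neuron(inputs, p, rng):
    # Threshold is nominally 3, but drops to 2 with probability p
    # (the only kind of noise modelled in this sketch).
    threshold = 2 if rng.random() < p else 3
    return int(sum(inputs) >= threshold)

def cascade(signal, ranks, p, rng):
    # Assumed topology: each intermediate rank has three neurons, all seeing
    # every output of the rank before; the final rank is a single neuron.
    layer = list(signal)
    for _ in range(ranks - 1):
        layer = [noisy_neuron(layer, p, rng) for _ in range(3)]
    return noisy_neuron(layer, p, rng)

def error_rate(ranks, p, trials=200_000, seed=0):
    # Worst case for an "all three" computation: exactly two lines active,
    # so the correct output is silence and any firing is an error.
    rng = random.Random(seed)
    bad_input = (1, 1, 0)
    errors = sum(cascade(bad_input, ranks, p, rng) for _ in range(trials))
    return errors / trials

if __name__ == "__main__":
    p = 0.1
    for ranks in (1, 2, 3):
        print(f"ranks={ranks}  estimated error ~ {error_rate(ranks, p):.5f}")
```

Under these assumptions, with p = 0.1 the estimated error falls from about 0.1 for a single neuron to a few parts in a thousand for two ranks, and smaller still for three — roughly the behaviour described above, in which the residual error shrinks like a product of the p's of successive ranks.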



The moment we look at neurons with three inputs, the picture changes completely; but to describe this change we need to increase the complexity of our logical symbols by putting a circle on the X, so that inside the circle is C and outside it, not C, as in Figure 7(a). Now consider a net to compute some function, say, all or else none. We can schematize this, as in Figure 7(b). The dash is a "don't-care" condition; it may be a 1, or 0, or any p that you choose. This net makes no mistakes. Let us suppose that each of the first rank



