Anastomotic Nets Combating Noise 






In the theory of information concerned with communication, there is a theorem, due to Shannon, that, by proper encoding and decoding, if one transmits at something less than the capacity of a noisy channel, one can do so with as small a finite error as one desires by using sufficiently long latencies. Except for things like X and X, no one before Cowan and Winograd was able to show a similar information-theoretic capacity in computation. They have succeeded for any computation and for any depth of net, limited only by the reliability of the output neurons. The trick lay in a diversification of function in a net that was sufficiently richly interconnected. Their fundamental supposition is that with real neurons the probability of error on any one axon does not increase with the complexity of its neuron's connections. The recipients of most connections are the largest and, consequently, the most stable neurons. Again, it is the richest anastomosis that combats noise best.
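The Cowan–Winograd construction is subtler than simple replication, but the underlying idea, that redundancy drives the error of the whole below the error of any part, can be sketched with a majority-vote toy model. Here is a minimal illustration in Python (the function name and parameters are illustrative, not drawn from the original work):

```python
from math import comb

def majority_error(p, n):
    """Probability that a majority of n independent lines,
    each failing with probability p, reports the wrong signal (n odd)."""
    k = n // 2 + 1  # number of erring lines needed to overturn the majority
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# With a per-line error of 0.1, adding redundant lines shrinks the net error:
for n in (1, 3, 9, 21):
    print(n, majority_error(0.1, n))
```

The point of the sketch is only the direction of the effect: as the anastomosis grows richer (larger n), the probability that the aggregate errs falls steeply, even though each individual line remains as noisy as before.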



ACKNOWLEDGMENT 



I wish to acknowledge the contributions of those who have worked with me in this endeavor, namely: Anthony Aldrich, Michael Arbib, Manuel Blum, Jack Cowan, Nello Onesto, Leo Verbeek, Sam Winograd, and Bert Verveen.



