9/5 AN INTRODUCTION TO CYBERNETICS 



the probabilities. Such a table would be, in essence, a summary of actual past behaviour, extracted from the protocol.



Such a sequence of states, in which, over various long stretches, the probability of each transition is the same, is known as a Markov chain, from the name of the mathematician who first made an extensive study of their properties. (Only during the last decade or so has their great importance been recognised. The mathematical books give various types of Markov chain and add various qualifications. The type defined above will give us all we want and will not clash with the other definitions, but an important qualification is mentioned in S.9/7.)
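The defining property can be illustrated numerically: if we generate a long sequence from fixed transition probabilities and then estimate a transition frequency separately over two long stretches, the two estimates nearly agree. The sketch below assumes a hypothetical two-state chain (from A go to B with probability 0.7, from B go to A with probability 0.5); the names and numbers are illustrative, not from the text.

```python
from collections import Counter
import random

random.seed(2)

# Hypothetical two-state Markov chain: from A go to B with
# probability 0.7; from B go to A with probability 0.5.
def step(s):
    if s == "A":
        return "B" if random.random() < 0.7 else "A"
    return "A" if random.random() < 0.5 else "B"

# Generate a long sequence of states.
seq, s = [], "A"
for _ in range(20000):
    seq.append(s)
    s = step(s)

def freq_AB(chunk):
    """Relative frequency of the transition A -> B within one stretch."""
    pairs = Counter(zip(chunk, chunk[1:]))
    from_a = pairs[("A", "A")] + pairs[("A", "B")]
    return pairs[("A", "B")] / from_a

# Over the two halves the estimates nearly coincide, as the
# definition requires.
print(freq_AB(seq[:10000]), freq_AB(seq[10000:]))
```

Both printed values lie close to the 0.7 that generated the data; a sequence whose transition frequencies drifted between stretches would, by the definition above, not be a Markov chain of this type.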



The term "Markov chain" is sometimes applied to a particular trajectory produced by a system (e.g. the trajectory given in Ex. 1) and sometimes to the system (defined by its matrix) which is capable of producing many trajectories. Reference to the context must show which is implied.



Ex. 1: A system of two states gave the protocol (of 50 transitions):

ABABBBABAABABABABBBBABAABABBAAB
ABBABAAABABBAABBABBA.

Draw up an estimate of its matrix of transition probabilities.
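The estimate asked for in Ex. 1 is just a tally: count each pair of successive states, then divide by the number of departures from each state. A minimal sketch (the function name is mine, not the text's):

```python
from collections import Counter

# The protocol of Ex. 1 (51 states, hence 50 transitions).
protocol = ("ABABBBABAABABABABBBBABAABABBAAB"
            "ABBABAAABABBAABBABBA")

def transition_matrix(seq):
    """Estimate transition probabilities from an observed protocol."""
    counts = Counter(zip(seq, seq[1:]))   # tally each pair (state, next state)
    totals = Counter(seq[:-1])            # number of departures from each state
    states = sorted(totals)
    return {(s, t): counts[(s, t)] / totals[s]
            for s in states for t in states}

matrix = transition_matrix(protocol)
for (s, t), p in sorted(matrix.items()):
    print(f"{s} -> {t}: {p:.2f}")
```

For this protocol the tally gives 6 transitions A→A against 17 A→B, and 17 B→A against 10 B→B, so the system mostly alternates between its two states.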

Ex. 2: Use the method of S.9/2 (with the coin) to construct several trajectories, so as to establish that one matrix can give rise to many different trajectories.

Ex. 3: Use a table of random numbers to generate a Markov chain on two states A and B by the rule:

If

Ex. 4: (Continued.) What is its matrix of transition probabilities? 
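Exercises 2 and 3 both amount to the same procedure: at each step, use a random number to choose the next state with the probabilities in the present state's row of the matrix. A sketch of that procedure, with a hypothetical matrix (the probabilities here are not from the text):

```python
import random

# Hypothetical matrix of transition probabilities; each row gives
# Pr(next state | present state) and sums to 1.
P = {"A": {"A": 0.25, "B": 0.75},
     "B": {"A": 0.60, "B": 0.40}}

def trajectory(P, start, n, rng=random):
    """Generate n transitions of a Markov chain governed by matrix P."""
    states = [start]
    for _ in range(n):
        here = states[-1]
        # Draw the next state with the probabilities in row `here`
        # (the coin of S.9/2, or a random-number table, mechanised).
        r = rng.random()
        cum = 0.0
        for nxt, p in P[here].items():
            cum += p
            if r < cum:
                states.append(nxt)
                break
        else:
            states.append(nxt)  # guard against rounding when r is near 1
    return "".join(states)

print(trajectory(P, "A", 50))
print(trajectory(P, "A", 50))  # same matrix, almost surely a different trajectory
```

Running it twice, as in Ex. 2, shows one matrix producing many different trajectories.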



9/5. Ex. 9/4/1 shows how the behaviour of a system specifies its matrix. Conversely, the matrix will yield information about the tendencies of the system, though not the particular details. Thus suppose a scientist, not the original observer, saw the insect's matrix of transition probabilities:






