386 HITCHCOCK. 



The number of terms h on the left is said to denote the extent of the operation θ or of the equation.



It is known that any matrix A is equivalent to a dyadic, Σaa', where a and a' are vectors. Any term A·X·B is therefore equivalent to a sum of terms of the form aa'·X·bb' where a, a', b, and b' are vectors in space of n dimensions and X is a required dyadic. Since by the definition of double dot product we have ab : xy = (a·x)(b·y), and since X might be written Σxy we shall have



aa'·X·bb' = ab' a'b : X        (112)



which is the transformation at the basis of the method to be exhibited. 
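In modern matrix notation the dyadic aa' is the outer product a a'ᵀ, and the double dot a'b : X is the scalar a'ᵀ X b, so (112) can be checked numerically. The sketch below (my notation, not the paper's) verifies the identity for random vectors:

```python
import numpy as np

# Identity (112) in matrix form: the dyadic aa' is the outer product a a'^T,
# and the double dot a'b : X is the scalar a'^T X b, so
#   (a a'^T) X (b b'^T) = (a'^T X b) * (a b'^T).
rng = np.random.default_rng(0)
n = 4
a, ap, b, bp = (rng.standard_normal(n) for _ in range(4))
X = rng.standard_normal((n, n))

lhs = np.outer(a, ap) @ X @ np.outer(b, bp)   # aa'·X·bb'
rhs = (ap @ X @ b) * np.outer(a, bp)          # (a'b : X) ab'
assert np.allclose(lhs, rhs)
```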

 Thus the linear matrix equation is equivalent to 



φ : X = ΣMN : X = C        (113)



where φ is a dyadic whose antecedents and consequents are themselves dyadics, that is, a double dyadic.
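In coordinates, the double dyadic φ acting on X is the same linear map as Σ Bᵢᵀ ⊗ Aᵢ acting on the stacked columns of X (the modern "vec" identification, which is my gloss rather than the paper's notation). A minimal sketch under that identification:

```python
import numpy as np

# Equation (113): Σ A_i X B_i = C becomes Phi vec(X) = vec(C), where
# Phi = Σ kron(B_i^T, A_i) for column-major vec (numpy: X.flatten('F')).
rng = np.random.default_rng(1)
n, h = 3, 2
As = [rng.standard_normal((n, n)) for _ in range(h)]
Bs = [rng.standard_normal((n, n)) for _ in range(h)]
X_true = rng.standard_normal((n, n))
C = sum(A @ X_true @ B for A, B in zip(As, Bs))

# The double dyadic φ as an ordinary n² × n² matrix.
Phi = sum(np.kron(B.T, A) for A, B in zip(As, Bs))
X = np.linalg.solve(Phi, C.flatten('F')).reshape((n, n), order='F')
assert np.allclose(X, X_true)
```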



Since φ : X = θ(X), θ and φ satisfy the same Hamilton-Cayley equation. One method of solving the linear matrix equation is therefore by putting



X = θ⁻¹C = (1/mₙ)[θⁿ⁻¹ − m₁θⁿ⁻² + ⋯ + (−1)ⁿ⁻¹mₙ₋₁I](−1)ⁿ⁻¹C        (114)



where m₁, ⋯, mₙ are the same coefficients as in (28) and where mₙ is assumed not zero.
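A sketch of this Hamilton-Cayley route in the same matrix picture: form φ as an n² × n² array, read off its characteristic coefficients, and apply the polynomial expression for the inverse instead of a direct solve. Note that numpy's `poly` returns coefficients cₖ = (−1)ᵏmₖ of λᴺ + c₁λᴺ⁻¹ + ⋯ + c_N, so the signs of (114) are absorbed into the cₖ; the variable names here are mine.

```python
import numpy as np

rng = np.random.default_rng(2)
n, h = 2, 2
As = [rng.standard_normal((n, n)) for _ in range(h)]
Bs = [rng.standard_normal((n, n)) for _ in range(h)]
X_true = rng.standard_normal((n, n))
C = sum(A @ X_true @ B for A, B in zip(As, Bs))

Phi = sum(np.kron(B.T, A) for A, B in zip(As, Bs))  # φ as an N×N matrix, N = n²
N = Phi.shape[0]

# Characteristic polynomial λ^N + c_1 λ^{N-1} + ... + c_N, with c_k = (-1)^k m_k.
c = np.poly(Phi)                                    # c[0] = 1
# Hamilton-Cayley: Phi^N + c_1 Phi^{N-1} + ... + c_N I = 0, hence (c_N ≠ 0 assumed)
#   Phi^{-1} = -(1/c_N) (Phi^{N-1} + c_1 Phi^{N-2} + ... + c_{N-1} I),
# which is (114) with the alternating signs folded into the c_k.
Phi_inv = sum(c[k] * np.linalg.matrix_power(Phi, N - 1 - k) for k in range(N))
Phi_inv *= -1.0 / c[N]

X = (Phi_inv @ C.flatten('F')).reshape((n, n), order='F')
assert np.allclose(X, X_true)
```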



17. Transformation back to Matrices. 



If the matrices A and B are known in such form that transformation to (113) is convenient, any of the methods exhibited in Part I may be used to calculate the coefficients m. A general solution, however, demands ability to compute them directly from A₁, ⋯, Aₕ and B₁, ⋯, Bₕ without the necessity of first forming φ by the rule (112). In other words we need to transform the scalars (36) from the language of double dyadics to that of matrices, taking account of (112). This may be conveniently accomplished by a partly symbolic notation, omitting subscripts and summation signs which refer to the terms 1, 2, ⋯, h of the matrix equation but occasionally preserving those which refer to the p! terms of the expansion of (35). We thus write symbolically



φ = ab' | a'b = MN        (115)
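As a tiny illustration of what computing such scalars directly from the matrices buys (the Kronecker-product identification is again my modern gloss): the first coefficient m₁, the trace of φ, equals Σ tr(Aᵢ) tr(Bᵢ), and so can be found without ever forming the n² × n² array.

```python
import numpy as np

# m1 = trace(φ). Since trace(kron(B^T, A)) = trace(A) * trace(B), the first
# invariant is computable from the A_i and B_i directly, without forming φ.
rng = np.random.default_rng(3)
n, h = 3, 2
As = [rng.standard_normal((n, n)) for _ in range(h)]
Bs = [rng.standard_normal((n, n)) for _ in range(h)]

Phi = sum(np.kron(B.T, A) for A, B in zip(As, Bs))
m1_direct = sum(np.trace(A) * np.trace(B) for A, B in zip(As, Bs))
assert np.isclose(np.trace(Phi), m1_direct)
```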



