Prof. Donkin on certain Questions relating to



6. For convenience, as well as to avoid the responsibility of 

 introducing new terms, the ordinary language will be generally 

 used in the rest of this paper ; it being understood that by the 

 " probability " of a hj'pothesis is meant the quantity of belief 

 which ought to be given to it by a person in a determinate state 

 of information respecting it. The word hypothesis is used to 

 denote anything, or rather any proposition, which can be believed 

 or disbelieved ; including, of course, even unmeaning propositions, 

 which can be truly believed or disbelieved by a mind which 

 believes that they have, or may have, a meaning. 



7. The solutions of the ordinary direct problems of probabi- 

 lities must always be made to depend, more or less explicitly, 

 upon the theory of combinations ; and the ordinary processes 

 are sufficiently satisfactory in such cases. But in the treatment 

 of inverse problems, which are by far the most important and 

 frequent in practice, I think that considerable advantage might 

 be gained by the introduction of the following preliminary 

 theorem, which, if it ought not rather to be called an axiom, is 

 certainly as evident before as after any proof which can be given 

 of it. 



Theorem. — If there be any number of mutually exclusive hy- 

 potheses, h_1, h_2, h_3 . . . , of which the probabilities relative to a 

 particular state of information are p_1, p_2, p_3 . . . ; and if new in- 

 formation be gained which changes the probabilities of some of 

 them, suppose of h_{m+1} and all that follow, without having other- 

 wise any reference to the rest, then the probabilities of these latter 

 have the same ratios to one another, after the new information, 

 that they had before ; that is, 



p'_1 : p'_2 : p'_3 : . . . : p'_m = p_1 : p_2 : p_3 : . . . : p_m, 



where the accented letters denote the values after the new infor- 

 mation has been acquired. 
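In modern terms the theorem may be sketched numerically. The following is a minimal illustration with assumed numbers (none of them from the paper): four exclusive hypotheses, new information that fixes the combined probability of the last two at a new value while saying nothing about the first two, after which the first two are checked to stand in their old ratio.

```python
# Illustrative sketch of the ratio-preservation theorem (hypothetical numbers).
# Four mutually exclusive, exhaustive hypotheses with prior probabilities:
p = [0.40, 0.20, 0.30, 0.10]   # p_1, p_2, p_3, p_4

# Suppose the new information concerns only h_3 and h_4: it raises their
# combined probability from 0.40 to 0.70, with no bearing on h_1 and h_2.
new_tail_mass = 0.70

# By the theorem, h_1 and h_2 share the remaining mass in their old ratio,
# i.e. each is rescaled by the same factor.
old_head_mass = p[0] + p[1]
scale = (1.0 - new_tail_mass) / old_head_mass
p_new_1 = p[0] * scale
p_new_2 = p[1] * scale

# The mutual ratio p_1 : p_2 is unchanged by the new information.
print(p_new_1, p_new_2)
print(p_new_1 / p_new_2, p[0] / p[1])   # equal (up to rounding)
```

The rescaling factor is common to every hypothesis untouched by the new information, which is exactly why their mutual ratios survive.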



The most important case of this theorem is the following : — 

 If there be n hypotheses h_1, h_2, . . . h_n, which in a certain state 

 of information are believed to be exhaustive, and of which the 

 probabilities are then p_1, p_2, . . . p_n; and if it is afterwards dis- 

 covered either that some of them must be rejected or that others 

 must be admitted, or both, without any further information as 

 to those of the original set which are retained, then these latter 

 have the same mutual ratios after the new information that they 

 had before. 
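This special case is the familiar rule of renormalization after conditioning. A brief sketch, with hypothesis names and numbers assumed for illustration only: if one member of an exhaustive set is rejected outright, the survivors keep their mutual ratios and are simply rescaled to sum to one.

```python
# Renormalization sketch for the special case (hypothetical numbers).
p = {"h1": 0.5, "h2": 0.3, "h3": 0.2}   # exhaustive, exclusive priors

# New information rejects h3 outright, with no further bearing on h1, h2.
retained = {h: q for h, q in p.items() if h != "h3"}

# Each survivor is divided by the same total, so mutual ratios are preserved.
total = sum(retained.values())
posterior = {h: q / total for h, q in retained.items()}

print(posterior)
print(posterior["h1"] / posterior["h2"], p["h1"] / p["h2"])  # same ratio
```

Dividing every retained probability by the same normalizing constant is the only adjustment consistent with the theorem, since any other would alter the mutual ratios.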



By means of these theorems, I think that we may not only 

 shorten processes*, and avoid the necessity of constructions by 



* The following may be taken as an example. Let there be two hypo- 

 theses A, B, which are believed to be entirely independent, and of which 

 the probabilities relative to this state of information are a, b. It is after- 



