BAYES' THEOREM 277 



Case 1. — After a ball was drawn it was replaced and the bag was shaken thoroughly before the next drawing was made;

Case 2. — A drawn ball was not replaced before the next drawing.

These two cases become essentially identical when the total number of balls in the bag is very large compared with the number drawn. Case 1 will serve as an introduction to Bayes' problem; later we will find it highly desirable to consider Case 2.



We are confronted with (M + 1) possible hypotheses or causes before the drawings took place:

1 — the unknown value of x is x_0 = 0/M,
2 — the unknown value of x is x_1 = 1/M,
3 — the unknown value of x is x_2 = 2/M,
. . . . . . . . . .
k + 1 — the unknown value of x is x_k = k/M,
. . . . . . . . . .
M + 1 — the unknown value of x is x_M = M/M = 1.



Let w(x_k) be the a priori existence probability for the kth hypothesis; by this is meant the probability in favor of the kth hypothesis based on whatever information was available regarding the contents of the bag prior to the execution of the drawings.



Let B(r, N, x_k) be the a priori productive probability for the kth hypothesis; by this is meant the probability of obtaining the observed result (r whites in N drawings) when the value of x is k/M.



Then, the a posteriori probability, or probability after the observed event, in favor of the kth hypothesis is²

                w(x_k) B(r, N, x_k)
    p_k = ------------------------------          (1)
            M
            Σ   w(x_t) B(r, N, x_t)
           t=0
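Formula (1) amounts to weighting each hypothesis by its a priori probability times its productive probability, then normalizing so the results sum to one. A minimal sketch in Python (the names `prior`, `likelihood`, and `a_posteriori` are mine, not the text's):

```python
def a_posteriori(prior, likelihood):
    """Formula (1): combine the a priori existence probabilities w(x_k)
    with the productive probabilities B(r, N, x_k), then normalize."""
    products = [w * b for w, b in zip(prior, likelihood)]
    total = sum(products)
    return [p / total for p in products]
```

Note that with a uniform prior the denominator divides out the common factor, so the a posteriori probabilities are proportional to the productive probabilities alone.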



For Case 1 of our bag problem we have

    B(r, N, x_k) = C(N, r) x_k^r (1 − x_k)^(N−r),

where C(N, r) represents the number of combinations of N things taken r at a time.
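Putting the Case 1 productive probability together with formula (1) gives a complete computation. A hedged sketch, in which the uniform a priori probabilities w(x_k) = 1/(M + 1) are an illustrative assumption, not something the text has fixed:

```python
from math import comb

def B(r, N, x):
    # Case 1: probability of r whites in N drawings with replacement,
    # when the fraction of white balls in the bag is x.
    return comb(N, r) * x**r * (1 - x)**(N - r)

def posterior_case1(M, r, N):
    # Hypotheses x_k = k/M for k = 0, ..., M, taken equally likely
    # a priori (an assumption made here for illustration).
    xs = [k / M for k in range(M + 1)]
    num = [B(r, N, x) / (M + 1) for x in xs]
    total = sum(num)
    return [n / total for n in num]
```

For instance, drawing one ball (N = 1) and finding it white (r = 1) from a bag of M = 2 rules out the all-black hypothesis and makes the all-white hypothesis twice as probable as the half-white one.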



² This is the Laplacian generalization of Bayes' formula, although in some textbooks it is referred to as "Bayes' Theorem." A relatively short demonstration of it is given by Poincaré in his Calcul des Probabilités. See also Fry, Probability and Its Engineering Uses, Art. 49.



