Fundamental Principles of Scientific Inquiry.



Now it appears certain that no probability is ever deter- 

 mined from experience alone. It is always influenced to 

 some extent by the knowledge we had before the experience. 

In the notation of our previous paper, let $P(p : q)$ denote the probability of the proposition $p$, given the data $q$. We had the general proposition

$$P(p : q.h)\,P(q : h) = P(q : p.h)\,P(p : h). \qquad (1)$$
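Equation (1) is the symmetric form of the product rule, and it can be checked numerically. The sketch below uses an arbitrary joint distribution over two propositions $p$ and $q$; the particular numbers are illustrative assumptions, not from the paper.

```python
# Check of equation (1): P(p : q.h) P(q : h) = P(q : p.h) P(p : h),
# on a hypothetical joint distribution over propositions p and q given h.
# The weights below are assumed for illustration only.
joint = {
    (True, True): 0.3,
    (True, False): 0.1,
    (False, True): 0.2,
    (False, False): 0.4,
}

def prob(pred):
    """Total probability of the truth-value pairs satisfying pred."""
    return sum(w for tv, w in joint.items() if pred(tv))

P_p = prob(lambda tv: tv[0])             # P(p : h)
P_q = prob(lambda tv: tv[1])             # P(q : h)
P_p_given_q = joint[(True, True)] / P_q  # P(p : q.h)
P_q_given_p = joint[(True, True)] / P_p  # P(q : p.h)

# Both sides of equation (1) equal the joint probability of p and q.
assert abs(P_p_given_q * P_q - P_q_given_p * P_p) < 1e-12
```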



Here let $p$ be the general law under consideration, $q$ the propositions found true by experiment, and $h$ the knowledge we had before the experiment. If $q$ are implied by $p$ and $h$ together, in other words, if the observations satisfy the law, we have

$$P(q : p.h) = 1. \qquad (2)$$



Hence

$$P(p : q.h) = \frac{P(p : h)}{P(q : h)}. \qquad (3)$$



Thus the verification of a consequence of a hypothesis 

 divides its probability by the prior probability of that 

 consequence. 
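A single application of equation (3) can be sketched numerically. The prior $P(p : h)$ and the probability of $q$ when the law is false are assumptions chosen for illustration; since $p$ with $h$ implies $q$, $P(q : p.h) = 1$ as in equation (2).

```python
# Sketch of equation (3): verifying a consequence q of hypothesis p
# divides the prior P(p : h) by P(q : h). Numbers are assumed.
P_p = 0.02             # P(p : h), assumed prior probability of the law
P_q_given_p = 1.0      # P(q : p.h) = 1, by equation (2)
P_q_given_not_p = 0.4  # assumed probability of q if the law is false

# P(q : h) by total probability over p and not-p.
P_q = P_q_given_p * P_p + P_q_given_not_p * (1 - P_p)

posterior = P_p / P_q  # equation (3)
assert posterior > P_p  # each verification raises the law's probability
```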



If $q_1, q_2, \ldots, q_n$ denote successive verified consequences, we find by repeated applications of (3) that

$$P(p : q_1 . q_2 \ldots q_n . h) = \frac{P(p : h)}{P(q_1 : h)\,P(q_2 : q_1 . h) \cdots P(q_n : q_1 . q_2 \ldots q_{n-1} . h)}. \qquad (4)$$
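The repeated application of (3) can be checked against a closed form of (4) under a simple assumed model: each consequence $q_i$ is certain given $p$, and has a fixed probability $r$ given not-$p$, independently of the earlier $q$'s. Both the model and the numbers are illustrative assumptions.

```python
# Chaining equation (3) n times versus the single formula of equation (4),
# under an assumed model: P(q_i : p.h) = 1, P(q_i : not-p . h) = r,
# with the q_i independent given not-p. All numbers are illustrative.
prior = 0.02  # P(p : h)
r = 0.4       # assumed P(q_i : not-p . h)
n = 10

P_law = prior
for _ in range(n):
    # Probability of the next consequence given the evidence so far.
    P_next = 1.0 * P_law + r * (1.0 - P_law)
    P_law = P_law / P_next  # equation (3) applied at this step

# Closed form of equation (4) under the same assumptions: the odds on p
# are multiplied by 1/r at each verification.
closed = prior / (prior + (1.0 - prior) * r**n)
assert abs(P_law - closed) < 1e-9
```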



Now, if a general law ever has a finite probability, and the probabilities $P(q_1 : h), P(q_2 : q_1 . h), \ldots$ are always less than some number finitely less than unity, a sufficient number of verifications would make the probability of the law greater than unity, which is impossible. Hence at least one of the following alternatives must be true:



(1) The general law, however often verified, can never have a probability finitely different from zero.



(2) As the number of verifications of the law increases, the probability that the next verification will be successful approaches arbitrarily near to certainty.
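The reductio behind this dichotomy can be made concrete: with a finite prior $c$ and every factor in the denominator of (4) at most $k < 1$, the "probability" given by (4) is at least $c / k^n$, which exceeds unity for large enough $n$. The values of $c$ and $k$ below are assumed for illustration.

```python
# Numerical form of the reductio: a finite prior c and factors bounded by
# k < 1 force the value of equation (4) above 1 after enough verifications.
# c and k are illustrative assumptions.
import math

c, k = 0.001, 0.9

# Smallest n with c / k**n >= 1, i.e. n >= log(c) / log(k).
n = math.ceil(math.log(c) / math.log(k))

assert c / k**n >= 1.0        # the bound has crossed unity: impossible
assert c / k**(n - 1) < 1.0   # one step earlier it had not yet crossed
```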

The latter alternative evidently agrees perfectly with ordinary scientific inference. It may be noticed, however, that it does not imply that the probability of the law approaches indefinitely near to unity when the number of verifications increases enough, but only that the probability that the next inference from it will be correct does so.
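One standard illustration of this distinction, not from the paper, is Laplace's rule of succession: with a uniform prior on an unknown success rate, the exact universal law "the rate is 1" carries zero prior mass and zero posterior mass, yet after $n$ unbroken successes the probability that the next trial succeeds is $(n+1)/(n+2)$, which tends to certainty.

```python
# Laplace's rule of succession: after n successes in n trials under a
# uniform prior on the success rate, the probability of the next success
# is (n + 1) / (n + 2) -> 1, even though the strict general law
# ("rate exactly 1") keeps probability zero throughout.
def p_next_success(n):
    """Probability that trial n + 1 succeeds after n straight successes."""
    return (n + 1) / (n + 2)

assert p_next_success(0) == 0.5     # no evidence yet: even odds
assert p_next_success(998) > 0.99   # long run of successes: near-certainty
```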



