



the Partition of Energy. 833 



number, in each case attributing to the A's the amount of energy they had at the moment of isolation. Now if these quantities are calculated, it will be found that, to the approximation (4·5), they all have the same value. This value is easiest to find for the maximum number. It is unnecessary to take an assembly of systems of more than one type, as we have seen that the additive property will hold. We have



$$\frac{S}{k} = \log M! \;-\; \sum_r \log a_r! \;+\; \sum_r a_r \log p_r.$$



We must here make the unjustified application of Stirling's theorem to numbers some of which will undoubtedly be small; it should be possible to justify the process, but we shall not do so. Then, making use of (2·1), (2·5), we have



$$\frac{S}{k} = M\left[\log f - \log\vartheta\cdot\vartheta\frac{d}{d\vartheta}\log f\right], \qquad (5\cdot 1)$$

$$\frac{S}{k} = M\log f + E\log(1/\vartheta), \qquad (5\cdot 2)$$

since

$$E = M\,\vartheta\frac{d}{d\vartheta}\log f.$$
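The passage from the combinatory formula to (5·1) can be made explicit. As a sketch (assuming, as reconstructed here, that the most probable values are $a_r = M p_r \vartheta^{\varepsilon_r}/f(\vartheta)$ with $f(\vartheta) = \sum_r p_r \vartheta^{\varepsilon_r}$, and dropping the small Stirling corrections; these are not the authors' verbatim steps):

$$\frac{S}{k} \;\approx\; M\log M - \sum_r a_r \log\frac{a_r}{p_r}
\;=\; M\log M - \sum_r a_r\bigl(\log M + \varepsilon_r\log\vartheta - \log f\bigr)
\;=\; M\log f - \log\vartheta\sum_r a_r\varepsilon_r,$$

which, since $\sum_r a_r\varepsilon_r = E = M\,\vartheta\,(d/d\vartheta)\log f$, is exactly (5·1).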



Equations (5·1) and (5·2) remain equally true for a group of free molecules to the same approximation. This formula for S is the direct consequence of Boltzmann's Hypothesis, and S has the necessary additive property for combining the parts of the assembly. Moreover, it agrees completely with the entropy of thermodynamics in all cases where they can be compared: this agreement justifies our use of (4·5) in these calculations. But it is indifferent whether we define the entropy as the total, average, or maximum number of complexions, and (4·5) is always inexact; it is therefore unsatisfactory to make the formal definition of non-fluctuating entropy in any of these ways. Now (5·1) and (5·2) give precisely the thermodynamic expressions in all comparable cases, and this suggests a direct definition in terms of partition functions. We may thus suppose that the combinatory processes are correctly looked after by the partition functions, and may define the entropy by either of the relations (5·1) or (5·2). Pending its formal identification with the entropy of thermodynamics, we shall describe it as the "statistical entropy."
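As a numerical illustration of the agreement just claimed, one may check that the logarithm of the maximum number of complexions matches (5·2) to the stated approximation. This is only a sketch with an invented toy assembly (the levels, weights, and the value of ϑ below are chosen for illustration, not taken from the paper):

```python
import math

# Hypothetical toy assembly: M systems, energy levels eps_r with weights p_r,
# partition function f(theta) = sum_r p_r * theta**eps_r.
M = 10**6
p = [1.0, 1.0, 1.0]
eps = [0, 1, 2]
theta = 0.5

f = sum(pr * theta**er for pr, er in zip(p, eps))

# Most probable occupation numbers a_r = M p_r theta**eps_r / f,
# and the corresponding total energy E = M theta d(log f)/d(theta).
a = [M * pr * theta**er / f for pr, er in zip(p, eps)]
E = sum(ar * er for ar, er in zip(a, eps))

# Direct evaluation of the maximum term:
#   S/k = log M! - sum_r log a_r! + sum_r a_r log p_r,
# with lgamma(n + 1) = log n! extended to the non-integer a_r.
S_direct = (math.lgamma(M + 1)
            - sum(math.lgamma(ar + 1) for ar in a)
            + sum(ar * math.log(pr) for ar, pr in zip(a, p)))

# Closed form (5.2): S/k = M log f + E log(1/theta).
S_formula = M * math.log(f) + E * math.log(1 / theta)

print(S_direct, S_formula)
```

The two values differ only by the dropped Stirling corrections, of order log M, so their relative difference shrinks as M grows — the sense in which (4·5) is "always inexact" yet adequate.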



