We want to calculate the entropy of a gas of N identical particles in a box of volume V.
The entropy is defined as k ln(G) where G is a measure of the
number of accessible states of the system. We specify the state of a particle
classically by giving its position and velocity, or more appropriately its
position and momentum. Let's say the box is at some fixed temperature and
this results in each particle having a certain average momentum and energy.
The momentum will not always be exactly this average value but will vary
according to some probability distribution. The width of its distribution
is part of that uncertainty of which entropy is a measure. Let's quantify
the width of the distribution by the standard deviation of the momentum,
Δp. It is also reasonable to assume that the value of Δp will depend only
on the temperature and not, say, on the volume or geometry of the box;
neither should it depend on the number of particles.
Now consider that in specifying the system exactly we would have to
give three coordinates for position and three coordinates for momentum
for each of the N particles. We thus have 3N + 3N variables, which form
a 6N-dimensional "phase" space of possible system configurations. The 3N
momentum coordinates would each have a standard deviation of Δp, and
there being 3N of them, the hyper-volume in phase space in which we
expect to find the system will be proportional to (Δp)^(3N). In addition
there are the N triples of position coordinates, each triple residing in
the volume V of the box, so the overall region in phase space will have
measure G ~ V^N · (Δp)^(3N).
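To get a feel for the sizes involved, here is a minimal numerical sketch
in Python (the value of Δp below is an arbitrary stand-in, since we have
not derived its temperature dependence). G is far too large or too small
to hold as an ordinary number, so we work with its logarithm; notice also
that the result depends entirely on the choice of units, which is exactly
the problem addressed below.

    import math

    # A sketch of the size of G ~ V^N * (dp)^(3N).  The value of dp is an
    # arbitrary stand-in; its temperature dependence is not derived here.
    N = 6.0e23      # roughly a mole of particles
    V = 1.0e-3      # box volume in cubic meters
    dp = 1.0e-23    # stand-in momentum spread in kg*m/s

    # G itself overflows (or underflows) any floating point type, so we
    # compute ln(G) = N*ln(V) + 3N*ln(dp) directly.
    ln_G = N * math.log(V) + 3 * N * math.log(dp)
    print(ln_G)     # about -1e26; its sign and size depend on the units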
For convenience let's assume a distance unit Δx = V^(1/3) so that we may
write G ~ (Δx·Δp)^(3N).
Part of the reason for using distance units is that there is a fundamental
quantity called "action" which has units of energy·time = distance·momentum.
We then see that G, as it now stands, will be in units of (action)^(3N).
There is a problem here however because we wish to take the natural
logarithm of G to calculate the entropy, S. We should only apply abstract
mathematical functions to unitless quantities. Looking at it another way
we don't want our entropy to depend on our choice of distance and momentum
units. Consider that S = k ln(G) is equal to zero when G = 1 so
we need a minimal unit for G which will define the minimum entropy state.
The simplest thing to do is assume an arbitrary minimal unit of action,
H. Then we can define G to be the unitless quantity G = (Δx·Δp/H)^(3N),
and then we have a formula for entropy:

    S = k ln(G) = 3Nk ln(Δx·Δp/H).
We can also write this in terms of the box volume V, using Δx^3 = V:

    S = Nk ln(V·W),  where  W = (Δp/H)^3

so that all of the temperature dependence sits in W.
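As a minimal sketch of this formula in Python (entropy_naive is a name of
my choosing, and W is treated as an opaque parameter since its temperature
dependence has not been derived here):

    import math

    k = 1.380649e-23  # Boltzmann's constant in J/K

    def entropy_naive(N, V, W):
        # Naive classical entropy S = N k ln(V W), with W = (dp/H)^3
        # carrying the (underived) temperature dependence.
        return N * k * math.log(V * W)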
Aside from calculating how W depends on temperature, this is the standard (naive) calculation of the classical entropy. There is, however, a problem with the manner in which this formula for S depends on the volume.
Suppose you have two boxes with identical volume, V, in which are an equal number of gas particles, N. You then have a total entropy of S = S1 + S2 = 2Nk ln(V·W). Now suppose the boxes are placed side by side and the partition between them is removed. We now have a new box with volume 2V and with 2N particles so that the entropy of this system is

    S' = 2Nk ln(2V·W).
The entropy has changed and in fact has increased:

    ΔS = S' − S = 2Nk ln(2V·W) − 2Nk ln(V·W) = 2Nk ln(2).
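A quick numerical check of this increase (a sketch; the value of W is an
arbitrary stand-in and does not matter, since it cancels in the difference):

    import math

    k = 1.380649e-23  # Boltzmann's constant in J/K

    def entropy_naive(N, V, W):
        return N * k * math.log(V * W)

    N, V, W = 6.0e23, 1.0e-3, 1.0e30   # W is an arbitrary stand-in
    S_separate = 2 * entropy_naive(N, V, W)       # two boxes of (N, V)
    S_combined = entropy_naive(2 * N, 2 * V, W)   # one box of (2N, 2V)
    print(S_combined - S_separate)      # ~11.5 J/K ...
    print(2 * N * k * math.log(2))      # ... equal to 2Nk ln(2)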
If each particle has a label so that they are distinguishable then this change reflects the entropy of mixing. After opening the partition the particles mix, so that upon replacing the partition we no longer have the same particles on the same sides. This however is not realistic. If the two boxes contained identical particles then removing and replacing the partition should be a reversible process. Hence the change in entropy should be zero. This is Gibbs' paradox.
Put another way, entropy should be an extensive quantity. One should be able to partition the volume arbitrarily and the total entropy should be the sum of the entropies over the partition. This would then imply that the volume dependence should occur only in the form of a density: instead of V we should see V/N, at least for large N.
The resolution is to replace G = (V·W)^N with G' = G/N!.
Hence S = Nk ln(VW) - k ln(N!).
There is an approximation formula for N! which is valid for large N, called Stirling's formula:

    N! ≈ √(2πN) · (N/e)^N

so ln(N!) ≈ (N + 1/2)·ln(N) − N + (1/2)·ln(2π).
As N gets very large this will be approximately N·ln(N), as the other terms do not grow as fast with increasing N. (Keeping the −N term would only add Nk to the entropy, which is itself extensive and so does not affect the argument below.)
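One can check how good the approximation is with a few lines of Python
(math.lgamma gives ln(N!) essentially exactly, since ln(N!) = ln Γ(N+1)):

    import math

    def ln_factorial_stirling(N):
        # (N + 1/2) ln(N) - N + (1/2) ln(2 pi)
        return (N + 0.5) * math.log(N) - N + 0.5 * math.log(2 * math.pi)

    # math.lgamma(N + 1) = ln(N!) to machine precision
    for N in (10, 1000, 10**6):
        exact = math.lgamma(N + 1)
        error = exact - ln_factorial_stirling(N)
        print(N, exact, error)   # the error shrinks roughly like 1/(12 N)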
Hence we get S = Nk ln(VW) - Nk ln(N) = Nk ln(VW/N).
Now the entropy depends (in the large particle number limit) on the quantity V/N which is one over the particle density. We now have entropy as an extensive quantity. Double the particle number and the volume at the same time and we get:

    S' = 2Nk ln(2V·W/(2N)) = 2Nk ln(V·W/N) = 2S.
Sure enough the entropy has doubled. There is then no difference between two boxes with volumes V and particle counts N and a single box with volume 2V and particle count 2N. At least in the large scale limit.
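A numerical confirmation of the extensivity (again a sketch, with the
entropy written in its large-N form and W an arbitrary stand-in):

    import math

    k = 1.380649e-23  # Boltzmann's constant in J/K

    def entropy_extensive(N, V, W):
        # large-N form: S = N k ln(V W / N)
        return N * k * math.log(V * W / N)

    N, V, W = 6.0e23, 1.0e-3, 1.0e30   # W is an arbitrary stand-in
    print(entropy_extensive(2 * N, 2 * V, W))  # doubling N and V together...
    print(2 * entropy_extensive(N, V, W))      # ...exactly doubles S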
Remember however that we used an approximation for ln(N!). If we use the more precise form of Stirling's formula we notice that the entropy still increases slightly when we combine the two boxes. Observe:

    S = Nk ln(V·W) − k[(N + 1/2)·ln(N) − N + (1/2)·ln(2π)].
So again combining the two boxes would yield:

    ΔS = 2Nk ln(2) − k[ln((2N)!) − 2·ln(N!)] ≈ (k/2)·ln(πN).
We see that there is still a slight increase in the entropy, but note it is proportional only to the logarithm of N as opposed to N itself, and so it will be very much smaller than the scale of the total entropies when N is a very big number. In fact this process is not quite reversible and should yield a slight increase in entropy. Once we remove the partition and the particles are allowed to mix, the number of particles in each half of the box can no longer be assured of being exactly N. For example at any point in time there may be N+3 particles on one side and N−3 on the other. A more complete treatment must be made to include these possibilities. Should we continue to open and close the partition the entropy will not change beyond this increased value. The formula above for the separated boxes will no longer be quite valid at this point. The deviation from exactly symmetric particle numbers will be accompanied by deviations of the energies and temperatures in each half box.
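The residual increase is easy to verify numerically. In this sketch
ln(N!) is computed via math.lgamma, which is essentially exact, so the
output confirms the (k/2)·ln(πN) result without relying on the Stirling
truncation (N, V and W are stand-in values):

    import math

    k = 1.380649e-23  # Boltzmann's constant in J/K

    def entropy_full(N, V, W):
        # S = N k ln(V W) - k ln(N!), with ln(N!) = lgamma(N + 1)
        return N * k * math.log(V * W) - k * math.lgamma(N + 1)

    N, V, W = 10**6, 1.0e-3, 1.0e30    # W is an arbitrary stand-in
    dS = entropy_full(2 * N, 2 * V, W) - 2 * entropy_full(N, V, W)
    print(dS / k)                          # ~7.48 ...
    print(0.5 * math.log(math.pi * N))     # ... matching (1/2) ln(pi N)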
Return now to the minimal unit of action, H. For the entropy S = k ln(G)
never to be negative we must have G ≥ 1. Hence

    Δx·Δp/H ≥ 1,  or  Δx·Δp ≥ H.
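To see how comfortably a real gas satisfies this bound, here is a rough
order-of-magnitude check (a sketch: the estimate Δp ~ √(mkT) for the
thermal momentum spread and the argon numbers are stand-ins of my
choosing, with H taken at the scale of the reduced Planck constant):

    import math

    k = 1.380649e-23        # Boltzmann's constant in J/K
    hbar = 1.054571817e-34  # reduced Planck constant in J*s

    m = 6.63e-26   # mass of an argon atom in kg
    T = 300.0      # room temperature in K
    V = 1.0e-3     # a one liter box in m^3

    dp = math.sqrt(m * k * T)  # rough thermal momentum spread
    dx = V ** (1.0 / 3.0)      # the distance unit dx = V^(1/3)
    print(dx * dp / hbar)      # ~1.6e10, enormously larger than 1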
By experimentally observing systems with different particle types and
geometries we observe that the choice of zero entropy seems to require
a unit of action, H. In addition this fundamental action unit appears
to be independent of the types of particles and geometry of the system.
We do specifically find, by the considerations of units and this type of
invariance, that it is more appropriate to work with momentum instead of
say velocity when defining the particle state. The fundamental nature
of units of action appears in the Lagrangian formalism of both classical
and quantum mechanics. The amazing thing is the natural appearance
of this minimum action unit as a fundamental constant of nature.
This constant, Planck's constant (conventionally written in reduced form
as ℏ), defines an absolute minimum localization of a particle's state in
(x,p) space. We have thus from thermodynamic observations alone
"discovered" Heisenberg's uncertainty principle:

    Δx·Δp ≥ ℏ/2.
(The choice of the factor of two has to do with the other applications of this fundamental constant and its historical determination.)
The second point is that there are some statistical considerations implied when we are dealing with identical particles. The above derivation of the entropy of a system of gas particles has assumed what are called Maxwell-Boltzmann statistics. There are two alternative possibilities, Bose-Einstein statistics and Fermi-Dirac statistics, which more deeply invoke the quantum nature of the particles.
----More on this later.----
Comments and questions are welcome:
jambaugh@ix.netcom.com