Entropy?
How can this be measured?

It is often stressed that entropy is related to information. However, entropy is an objective quantity, depending only on the system and not on an observer's information about it. How can this quantity be measured? A procedure is presented to get the entropy of, say, a glass of water. As with potential energy, only entropy differences have physical meaning, and a reference state is needed to give absolute entropies. Based on this reference state, absolute entropies of a few substances are given, and a practical example shows how such values can be obtained.


Measuring entropy

The entropy change within a system equals the heat added reversibly to it, divided by the temperature of the system:

dS = dQ(rev)/T

As with potential energy, only changes can be determined, unless a reference state is defined. For potential energy, the reference state is arbitrarily fixed to zero Joules at infinite distance from the force center. The reference state of entropy is a perfect crystal at zero Kelvin. The entropy of this reference state is arbitrarily fixed to zero Joules per Kelvin.

With this definition, an absolute entropy can be attributed to every substance at specified conditions like temperature and pressure.

For instance, Moore (W.J. Moore, Physical Chemistry, Longman, 1972) gives Third Law entropies for a number of substances:

Table 1: Third Law Entropies
Substances in the standard state at 298.15 K; entropies in J/(K mol)

Substance        Entropy    Substance        Entropy
H2               130.59     D2               144.77
HD               143.7      N2               191.5
O2               205.1      Cl2              223.0
HCl              186.6      CO               197.5
CO2              213.7      H2O (gas)        188.72
NH3              192.5      CH4              186.2
Water (liquid)    70.0      Methanol         127
Ethanol          161        Diamond            2.77
Graphite           5.694    Ag                42.72
Cu                33.3      Fe                27.2
NaCl              72.38     AgCl              96.23

Like potential energy changes, entropy changes can always be measured. How can you find out how much the entropy of that glass of water over there changes if it is heated by 13.4 K?

This is what you have to do, following the above definition exactly:

  1. set the entropy equal to zero: S = 0
  2. set the temperature equal to the actual temperature of the water: T = T(start)
  3. set the end temperature to the starting temperature plus 13.4 K: T(end) = T(start) + 13.4 K
  4. add a tiny portion of heat dQ, so tiny that the temperature of the water does not rise appreciably
  5. record the temperature T and divide the heat portion dQ by it
  6. add the result to S
  7. if the temperature has reached T(end), stop; S is the quantity you have been looking for
  8. otherwise proceed with the next tiny portion of heat, as in the fourth step above
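The stepwise recipe above can be sketched directly in code. The mass, heat capacity, and starting temperature below are assumed values chosen only for illustration (a glass of 250 g of water starting at 293.15 K):

```python
# Step-by-step entropy measurement: repeatedly add a tiny portion of heat dQ
# and accumulate dQ/T, exactly as in the recipe above.
# Assumed illustrative values: 250 g of water, c = 4.18 J/(g K).
mass = 250.0              # g (assumed)
c = 4.18                  # J/(g K), specific heat of liquid water
C = mass * c              # total heat capacity of the glass, J/K

T = 293.15                # K, T(start) (assumed)
T_end = T + 13.4          # K, T(end)

S = 0.0                   # step 1: set the entropy to zero
dQ = 1.0                  # J, a tiny portion of heat

while T < T_end:          # steps 4-8
    S += dQ / T           # record T, divide dQ by it, add the result to S
    T += dQ / C           # the small temperature rise caused by dQ

print(f"DeltaS = {S:.2f} J/K")
```

With heat portions of 1 J, the sum agrees with the closed-form result c * ln(T(end)/T(start)) to well within a millijoule per Kelvin.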
What you were actually doing here is taking the integral of dQ/T from T(start) to T(start) + 13.4 K. Over this small temperature range, the heat capacity c of the water can be taken as constant, so that

dQ = c * dT

and

DeltaS = Integral from T(start) to T(end) of (c/T) dT

where the identifier on the left-hand side, DeltaS, denotes a difference between two entropies. The integral can be evaluated to

DeltaS = c * ln (T(end) / T(start))

where
Delta
refers to a difference; in this case DeltaS = S(end) - S(start), the entropy at temperature T(end) minus the entropy at temperature T(start)
T
the respective temperatures, T(start) the lower one and T(end) = T(start) + 13.4 K
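Evaluating the closed-form expression is a one-liner; again the heat capacity and starting temperature of the water are assumed values used only for illustration:

```python
import math

# DeltaS = c * ln(T(end) / T(start)), valid while the heat capacity is constant.
# Assumed illustrative values: a glass of 250 g of water, c = 4.18 J/(g K).
C = 250.0 * 4.18               # total heat capacity of the glass, J/K
T_start = 293.15               # K (assumed)
T_end = T_start + 13.4         # K

delta_S = C * math.log(T_end / T_start)
print(f"DeltaS = {delta_S:.2f} J/K")   # about 46.7 J/K
```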

For larger temperature ranges, the heat capacity c cannot be taken as constant.

Since integrating amounts to computing the area between the lower and upper temperature limits, the horizontal axis and the function line, the above procedure can be depicted as shown in the diagram. We are actually computing the shaded area:

If there are phase changes between T(start) and T(end), discontinuities in the curve are to be expected.

With increasing temperature, the entropy increase becomes smaller and smaller, because the heat capacity does not increase as fast as the temperature does. In fact, for an ideal gas, the heat capacity is not dependent on temperature at all. In this latter case, c/T plotted against T is a hyperbola.
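This behaviour is easy to check for an ideal gas, whose heat capacity is constant; the sketch below assumes one mole of a monatomic ideal gas with Cv = (3/2)R:

```python
import math

# One mole of a monatomic ideal gas: constant heat capacity Cv = (3/2) R.
R = 8.314                     # J/(K mol), gas constant
Cv = 1.5 * R

# Entropy gained by the same 100 K temperature step at ever higher temperatures:
for T_start in (100.0, 300.0, 500.0, 700.0):
    dS = Cv * math.log((T_start + 100.0) / T_start)
    print(f"{T_start:5.0f} K -> {T_start + 100:5.0f} K : DeltaS = {dS:.2f} J/K")
# Each step yields less entropy than the one before, even though the same
# amount of heat (Cv * 100 K) is supplied every time.
```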

This fact is important since it is what enables heat engines. No car engine, no airplane jet propulsion, no nuclear power plant would work if entropy did not change with temperature in this way.

Phase changes

To get the figures in table 1 above, the integration was performed from T = 0 K to T = 298.15 K. Over this temperature range, the heat capacity is not constant; its dependence on temperature has to be determined experimentally.

Liquids and gases undergo phase transitions on cooling; what entropy change is to be expected on, say, melting?

While melting, if the melting system is in equilibrium, the temperature stays constant at the melting point. The heat supplied, DeltaQ, is completely used up for melting, so nothing is left to raise the temperature. From the definition of entropy, the heat supplied (reversibly) need only be divided by the (constant) temperature to get the entropy change. Since both heat and temperature are positive, the entropy change is positive too; thus entropy increases on melting. Accordingly, entropies of liquids are larger than those of solids, as a glance at table 1 above confirms.
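Because the temperature stays fixed during melting, the entropy change is just the heat divided by the melting temperature. As a sketch, using ice as an example with assumed values not taken from the text (heat of fusion about 6.01 kJ/mol at 273.15 K):

```python
# Entropy of melting at constant temperature: DeltaS = Q / T.
# Assumed example values for ice (not from the text):
Q_fus = 6010.0        # J/mol, heat of fusion of ice (approximate)
T_melt = 273.15       # K, melting point of ice

dS_melt = Q_fus / T_melt
print(f"DeltaS(melting) = {dS_melt:.1f} J/(K mol)")   # about 22.0 J/(K mol)
```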

An example

The heat capacity divided by temperature, c/T, of argon (Ar) is given in the graph below (values taken from Försterling and Kuhn, "Moleküle und Molekülanhäufungen", Springer, Berlin 1983).

As in the example graph above, the entropy is the area confined by the curve, the horizontal axis and the perpendicular lines at zero and at 298.15 K. The graph is plotted only up to 120 K. In the gas phase (above 87.29 K), argon behaves very nearly ideally, and for ideal gases the heat capacity is independent of temperature. Note the jumps from solid to liquid and from liquid to gas: phase changes are discontinuities, so physical properties are expected to change discontinuously there.

With these data, and knowing that the heat of melting of Ar is 1.18 kJ/mol and the heat of vaporization 6.52 kJ/mol, the entropy change of argon from zero Kelvin to any temperature in this range can be calculated. Since at zero Kelvin the entropy (of a perfect Ar crystal) is nil by definition, the standard entropy of Ar at normal temperature and pressure comes out to approximately 154 J/K/mol.
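The two phase-change contributions can be computed directly from the heats quoted above. The boiling point of 87.29 K is given in the text; the melting point of roughly 83.8 K is an assumed value:

```python
# Phase-change contributions to argon's entropy: DeltaS = Q / T at constant T.
Q_melt = 1180.0       # J/mol, heat of melting (from the text)
T_melt = 83.8         # K, melting point of argon (assumed value)
Q_vap = 6520.0        # J/mol, heat of vaporization (from the text)
T_boil = 87.29        # K, boiling point (from the text)

dS_melt = Q_melt / T_melt
dS_vap = Q_vap / T_boil
print(f"melting:      DeltaS = {dS_melt:.1f} J/(K mol)")
print(f"vaporization: DeltaS = {dS_vap:.1f} J/(K mol)")
# Together these give roughly 89 J/(K mol); the rest of the ~154 J/(K mol)
# standard entropy comes from integrating c/T over the solid, liquid and
# gas ranges.
```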

_________


© Copyright on text and pictures 1998, Gian Vasta, except for the reading daemon (thank you very much, Marijke!)
You may use this text freely, but only for non-commercial and (logical and!) educational purposes. If you find it useful, please drop me a note.


Last update Feb. 24, 1998 gVa