Copyright © 2021 jsd

22  The Relevance of Entropy

The concept of entropy is important in the following areas, among others:

  1. cryptography and cryptanalysis, i.e. secret codes
  2. communications, data storage, and information theory, including error-correcting codes and data-compression codes
  3. computer science, including machine learning, etc.
  4. pattern recognition, including speech recognition, optical character recognition, face recognition, etc.
  5. librarianship
  6. the design of experiments (reference 13)
  7. physics in general, including the physics of computation, cosmology, and astrophysics
  8. the design of thermal machinery, including refrigerators, heat pumps, piston engines, turbine engines, rocket engines, etc.
  9. nuclear engineering, including reactors and weapons
  10. fluid dynamics
  11. chemistry and chemical engineering

Very roughly speaking, the items higher on the list can be assigned to the “information theory” camp, while the items lower on the list can be assigned to the “thermodynamics” camp. However, there is tremendous overlap between the two camps. The approach of understanding the microscopic quantum states and using them to explain macroscopic observables such as energy, entropy, and temperature is called statistical mechanics; see e.g. reference 28 and reference 49. Examples of this overlap include

a)  The physics of computation is squarely in both camps; see reference 36, reference 37, and reference 50.
b)  Things like Maxwell demons and Szilárd engines are squarely in both camps; see reference 51 and reference 52.
c)  Demagnetization refrigerators (as described in section 11.10) are in both camps, because you can quantify the entropy either by microscopic state-counting or by macroscopic thermal measurements. The molar entropy of the copper nuclear spins is R ln 4, where the 4 is the number of microscopic spin states per nucleus; see the first sketch after this list.
d)  When a three-dimensional gas phase is in equilibrium with a two-dimensional gas phase (i.e. adsorbed particles freely moving along a surface), the equilibrium depends partly on a Boltzmann factor (involving the surface binding energy, which gives the probability per microstate) and partly on state-counting factors: there are more states in the bulk than on the surface, since the per-particle state counts scale like volume/Λ³ and area/Λ². This allows us to calculate the entropy via ∑_i P_i log(1/P_i) … and the result agrees with the entropy associated with macroscopic measurements of energy and temperature. In particular it gives us a very simple and useful way of calculating the surface density as a function of bulk density and temperature, as in the second sketch after this list.
e)  Similar microstate-counting factors appear in the Saha equation, which uses statistical mechanics to make a quantitative prediction of ionization levels as a function of density. See reference 53 for a discussion of how such things scale as a function of the temperature and volume of the system.
f)  Similar microstate-counting factors permit an understanding of the scaling (with respect to density or concentration) of the equilibrium quotient and equilibrium “constant” in chemical reactions. The state-counting and the scaling are definitely nontrivial in the case of reactions such as F₂ ↔ 2F, where the number of reactant particles is different from the number of product particles. This is discussed in some detail in reference 53. Once again, microscopic state-counting is key to understanding macroscopic observations such as densities and reaction rates.

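Here is a minimal numerical sketch of point (c), assuming nothing beyond standard constants and four equally likely spin states per nucleus: summing P_i ln(1/P_i) over the four states gives ln 4 per nucleus, hence R ln 4 per mole, matching the macroscopic calorimetric value. (The Python below is purely illustrative; it is not part of the original text.)

    # Molar spin entropy of a nucleus with 4 equally likely spin states
    # (e.g. spin 3/2, as for copper). Standard constants; illustrative only.
    import math

    k_B = 1.380649e-23        # Boltzmann constant, J/K
    N_A = 6.02214076e23       # Avogadro's number, 1/mol
    R   = k_B * N_A           # gas constant, J/(K mol)

    P = [0.25] * 4            # four equiprobable microstates per nucleus
    s_per_nucleus = sum(p * math.log(1.0 / p) for p in P)   # = ln 4, in units of k_B

    S_molar = R * s_per_nucleus
    print(S_molar, R * math.log(4.0))    # both ≈ 11.5 J/(K mol)
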
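For point (d), the next sketch uses the textbook idealization of a structureless particle with a single adsorbed state of binding energy E_b: equating the chemical potentials of the 2-D and 3-D ideal gases gives n_2D = n_3D Λ exp(E_b/kT), where Λ = h/√(2π m k T) is the thermal de Broglie length, and the single factor of Λ comes from the ratio of the area/Λ² and volume/Λ³ state counts. The mass and binding energy below are made-up illustrative values, not taken from the text.

    # Surface density of an adsorbed 2-D ideal gas in equilibrium with a
    # 3-D ideal gas, assuming one bound state of depth E_b and no interactions.
    import math

    h   = 6.62607015e-34      # Planck constant, J s
    k_B = 1.380649e-23        # Boltzmann constant, J/K

    def thermal_wavelength(m, T):
        """Lambda = h / sqrt(2 pi m k T), in meters."""
        return h / math.sqrt(2.0 * math.pi * m * k_B * T)

    def surface_density(n_bulk, m, T, E_b):
        """Equilibrium 2-D density: n_2D = n_3D * Lambda * exp(E_b / kT)."""
        return n_bulk * thermal_wavelength(m, T) * math.exp(E_b / (k_B * T))

    # Illustrative numbers: argon-like mass, 0.1 eV binding, room temperature,
    # and the bulk density of an ideal gas at 1 atm.
    m      = 6.63e-26                   # kg
    T      = 300.0                      # K
    E_b    = 0.1 * 1.602176634e-19      # J
    n_bulk = 101325.0 / (k_B * T)       # particles per m^3

    print(surface_density(n_bulk, m, T, E_b))   # particles per m^2

The same 1/Λ³ quantum-concentration factor is what shows up in the Saha equation (point e) and in the density scaling of equilibria such as F₂ ↔ 2F (point f).
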
So: we have multiple lines of evidence telling us that entropy is entropy. It is the same entropy, no matter whether you measure it in bits or in joules per kelvin (section 9.5).
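The conversion factor is just Boltzmann’s constant times ln 2: one bit of entropy is k ln 2 ≈ 9.57×10⁻²⁴ J/K. A brief sketch (illustrative, not from the original text):

    # Converting entropy between bits and J/K.
    import math
    k_B = 1.380649e-23                 # Boltzmann constant, J/K
    one_bit = k_B * math.log(2.0)      # ≈ 9.57e-24 J/K per bit
    print(one_bit)
    print(8.314 * math.log(4.0) / one_bit)   # R ln 4 per mole ≈ 1.2e24 bits, i.e. 2 bits per nucleus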

As mentioned in chapter 2, you can’t do thermodynamics without entropy.

Also: entropy is one of the great elegant ideas of all time. C.P. Snow compared not knowing about the second law to never having read a work by Shakespeare.

