Copyright © 2021 jsd

0  Introduction

0.1  Overview

Real thermodynamics is celebrated for its precision, power, generality, and elegance. However, all too often, students are taught some sort of pseudo-thermodynamics that is infamously confusing, lame, restricted, and ugly. This document is an attempt to do better, i.e. to present the main ideas in a clean, simple, modern way.

The first law of thermodynamics is usually stated in a very unwise form.   We will see how to remedy this.

The second law is usually stated in a very unwise form.   We will see how to remedy this, too.

The so-called third law is a complete loser. It is beyond repair.   We will see that we can live without it just fine.

Many of the basic concepts and terminology (including heat, work, adiabatic, etc.) are usually given multiple mutually-inconsistent definitions.   We will see how to avoid the inconsistencies.

Many people remember the conventional “laws” of thermodynamics by reference to the following joke:1

  0)   You have to play the game;
  1)   You can’t win;
  2)   You can’t break even, except on a very cold day; and
  3)   It doesn’t get that cold.

It is not optimal to formulate thermodynamics in terms of a short list of enumerated laws, but if you insist on having such a list, here it is, modernized and clarified as much as possible. Each law is stated below, followed by some comments:

The zeroth law of thermodynamics tries to tell us that certain thermodynamical notions such as “temperature”, “equilibrium”, and “macroscopic state” make sense.   Sometimes these make sense, to a useful approximation … but not always. See chapter 3.

The first law of thermodynamics states that energy obeys a local conservation law.   This is true and important. See section 1.2.

The second law of thermodynamics states that entropy obeys a local law of paraconservation.   This is true and important. See chapter 2.

There is no third law of thermodynamics.   The conventional so-called third law alleges that the entropy of some things goes to zero as temperature goes to zero. This is never true, except perhaps in a few extraordinary, carefully-engineered situations. It is never important. See chapter 4.

To summarize the situation, we have two laws (#1 and #2) that are very powerful, reliable, and important (but often misstated and/or conflated with other notions) plus a grab-bag of many lesser laws that may or may not be important and indeed are not always true (although sometimes you can make them true by suitable engineering). What’s worse, there are many essential ideas that are not even hinted at in the aforementioned list, as discussed in chapter 5.

We will not confine our discussion to some small number of axiomatic “laws”. We will carefully formulate a first law and a second law, but will leave numerous other ideas un-numbered. The rationale for this is discussed in section 7.10.

The relationship of thermodynamics to other fields is indicated in figure 0.1. Mechanics and many other fields use the concept of energy, sometimes without worrying very much about entropy. Meanwhile, information theory and many other fields use the concept of entropy, sometimes without worrying very much about energy; for more on this see chapter 22. The hallmark of thermodynamics is that it uses both energy and entropy.

Figure 0.1: Thermodynamics, Based on Energy and Entropy

0.2  Availability

0.3  Prerequisites, Goals, and Non-Goals

This section is meant to provide an overview. It mentions the main ideas, leaving the explanations and the details for later. If you want to go directly to the actual explanations, feel free to skip this section.

1.
There is an important distinction between fallacy and absurdity. An idea that makes wrong predictions every time is absurd, and is not dangerous, because nobody will pay any attention to it. The most dangerous ideas are the ones that are often correct or nearly correct, but then betray you at some critical moment.

Most of the fallacies you see in thermo books are pernicious precisely because they are not absurd. They work OK some of the time, especially in simple “textbook” situations … but alas they do not work in general.

The main goal here is to formulate the subject in a way that is less restricted and less deceptive. This makes it vastly more reliable in real-world situations, and forms a foundation for further learning.

In some cases, key ideas can be reformulated so that they work just as well – and just as easily – in simple situations, while working vastly better in more-general situations. In the few remaining cases, we must be content with less-than-general results, but we will make them less deceptive by clarifying their limits of validity.

2.
We distinguish cramped thermodynamics from uncramped thermodynamics as shown in figure 0.2.

On the left side of the diagram, the system is constrained to move along the red path, so that there is only one way to get from A to Z.   In contrast, on the right side of the diagram, the system can follow any path in the (S,T) plane, so there are infinitely many ways of getting from A to Z, including the simple path AZ along a contour of constant entropy, as well as more complex paths such as AYZ and AXYZ. See chapter 19 for more on this.

  Indeed, there are infinitely many paths from A back to A, such as AYZA and AXYZA. Paths that loop back on themselves like this are called thermodynamic cycles. Such a path returns the system to its original state, but generally does not return the surroundings to their original state. This allows us to build heat engines, which take energy from a heat bath and convert it to mechanical work.
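The path-dependence described above can be checked numerically. In the sketch below, the coordinates assigned to the states A, Y, and Z are hypothetical, chosen only so that A and Z lie on the same contour of constant entropy. Integrating T dS (the heat absorbed) gives different answers along different paths between the same endpoints, and a nonzero answer around the cycle, which is exactly what a heat engine exploits.

```python
import numpy as np

def line_integral_TdS(path, n=10001):
    """Numerically integrate T dS along a piecewise-linear path,
    given as a list of (S, T) vertices in the (S, T) plane."""
    total = 0.0
    for (s0, t0), (s1, t1) in zip(path, path[1:]):
        u = np.linspace(0.0, 1.0, n)
        S = s0 + u * (s1 - s0)
        T = t0 + u * (t1 - t0)
        total += np.sum(0.5 * (T[1:] + T[:-1]) * np.diff(S))  # trapezoid rule
    return total

# Hypothetical states; A and Z share the same entropy coordinate.
A, Y, Z = (0.0, 1.0), (1.0, 1.0), (0.0, 2.0)

q_AZ   = line_integral_TdS([A, Z])         # constant-S path: exactly zero
q_AYZ  = line_integral_TdS([A, Y, Z])      # same endpoints, different answer
q_loop = line_integral_TdS([A, Y, Z, A])   # a thermodynamic cycle: nonzero

print(q_AZ, q_AYZ, q_loop)
```

The magnitude of the loop integral equals the area enclosed by the cycle in the (S,T) plane, which is why no function Q of the state alone can account for it.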

There are some simple ideas such as specific heat capacity (or molar heat capacity) that can be developed within the limits of cramped thermodynamics, at the high-school level or even the pre-high-school level, and then extended to all of thermodynamics.   Alas there are some other ideas such as “heat content” aka “thermal energy content” that seem attractive in the context of cramped thermodynamics but are extremely deceptive if you try to extend them to uncramped situations.

Figure 0.2: Cramped versus Uncramped Thermodynamics

Even when cramped ideas (such as heat capacity) can be extended, the extension must be done carefully, as you can see from the fact that the energy capacity CV is different from the enthalpy capacity CP, yet both are widely (if not wisely) called the “heat” capacity.
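As a concrete illustration of why the extension must be done carefully, consider one mole of a monatomic ideal gas; this particular gas model is an assumption made purely for illustration. The energy capacity CV and the enthalpy capacity CP come out different, and their difference is the gas constant R, so calling either one "the" heat capacity is asking for trouble.

```python
R = 8.314  # molar gas constant, J/(mol K)

# For one mole of a monatomic ideal gas (illustrative assumption):
#   energy    E = (3/2) R T
#   enthalpy  H = E + P V = (5/2) R T   (using P V = R T)
def C_V(T, dT=1e-3):
    E = lambda T: 1.5 * R * T
    return (E(T + dT) - E(T - dT)) / (2 * dT)   # dE/dT at constant V

def C_P(T, dT=1e-3):
    H = lambda T: 2.5 * R * T
    return (H(T + dT) - H(T - dT)) / (2 * dT)   # dH/dT at constant P

print(C_V(300.0))                 # ≈ 12.47 J/(mol K)
print(C_P(300.0))                 # ≈ 20.79 J/(mol K)
print(C_P(300.0) - C_V(300.0))    # ≈ R
```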

3.
Uncramped thermodynamics has a certain irreducible amount of complexity. If you try to simplify it too much, you trivialize the whole subject, and you arrive at a result that wasn’t worth the trouble. When non-experts try to simplify the subject, they all-too-often throw the baby out with the bathwater.

4.
You can’t do thermodynamics without entropy. Entropy is defined in terms of statistics. As discussed in chapter 2, people who have some grasp of basic probability can understand entropy; those who don’t, can’t. This is part of the price of admission. If you need to brush up on probability, sooner is better than later. A discussion of the basic principles, from a modern viewpoint, can be found in reference 1.
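For readers who already have some grasp of basic probability, here is a minimal sketch of the statistical definition: the entropy of a discrete probability distribution is S = −Σ pᵢ log pᵢ. The particular distributions below are arbitrary examples, chosen only for illustration.

```python
import math

def entropy(probs, base=2):
    """Entropy of a discrete distribution: S = -sum p_i log(p_i).
    With base=2 the answer is measured in bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin carries 1 bit of entropy
print(entropy([1.0]))         # a certain outcome carries 0 bits
print(entropy([0.25] * 4))    # four equiprobable outcomes: 2 bits
```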

We do not define entropy in terms of energy, nor vice versa. We do not define either of them in terms of temperature. Entropy and energy are well defined even in situations where the temperature is unknown, undefinable, irrelevant, or zero.

5.
Uncramped thermodynamics is intrinsically multi-dimensional. Even the highly simplified expression dE = − P dV + T dS involves five variables. To make sense of this requires multi-variable calculus. If you don’t understand how partial derivatives work, you’re not going to get very far.

Furthermore, when using partial derivatives, we must not assume that “variables not mentioned are held constant”. That idea is a dirty trick that may work OK in some simple “textbook” situations, but causes chaos when applied to uncramped thermodynamics, even when applied to something as simple as the ideal gas law, as discussed in reference 2. The fundamental problem is that the various variables are not mutually orthogonal. Indeed, we cannot even define what “orthogonal” should mean, because in thermodynamic parameter-space there is no notion of angle and not much notion of length or distance. In other words, there is topology but no geometry, as discussed in section 8.7. This is another reason why thermodynamics is intrinsically and irreducibly complicated.
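The held-constant trap can be made explicit with the ideal gas law. In the symbolic sketch below (using the sympy package; the heat-capacity ratio γ is introduced here purely for illustration), the nominal derivative dP/dV gives one answer when T is held constant and a different answer, larger by a factor of γ, when S is held constant.

```python
import sympy as sp

V, T, n, R, gamma, C = sp.symbols('V T n R gamma C', positive=True)

# Ideal gas: P = n R T / V.  "dP/dV" is ambiguous until we say
# which variable is held constant along the way.
P_isothermal = n * R * T / V            # T held constant
dPdV_T = sp.diff(P_isothermal, V)       # = -n R T / V**2

# Along an adiabat (S held constant), P V**gamma = C instead:
P_isentropic = C / V**gamma
dPdV_S = sp.diff(P_isentropic, V)       # = -gamma C / V**(gamma + 1)

# Same nominal derivative dP/dV, two different answers; evaluated at
# the same state their ratio is gamma, not 1:
ratio = sp.simplify(dPdV_S / dPdV_T).subs(C, (n * R * T / V) * V**gamma)
print(sp.simplify(ratio))   # gamma
```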

Uncramped thermodynamics is particularly intolerant of sloppiness, partly because it is so multi-dimensional, and partly because there is no notion of orthogonality. Unfortunately, some thermo books are sloppy in the places where sloppiness is least tolerable.

The usual math-textbook treatment of partial derivatives is dreadful. The standard notation for partial derivatives practically invites misinterpretation.

Some fraction of this mess can be cleaned up just by being careful and not taking shortcuts. Also it may help to visualize partial derivatives using the methods presented in reference 3. Even more of the mess can be cleaned up using differential forms, i.e. exterior derivatives and such, as discussed in reference 4. This raises the price of admission somewhat, but not by much, and it’s worth it. Some expressions that seem mysterious in the usual textbook presentation become obviously correct, easy to interpret, and indeed easy to visualize when re-interpreted in terms of gradient vectors. On the other edge of the same sword, some other mysterious expressions are easily seen to be unreliable and highly deceptive.

6.
If you want to do thermodynamics, beyond a few special cases, you will have to know enough physics to understand what phase space is. We have to count states, and the states live in phase space. There are a few exceptions where the states can be counted by other means; these include the spin system discussed in section 11.10, the artificial games discussed in section 2.2 and section 2.3, and some of the more theoretical parts of information theory. Non-exceptions include the more practical parts of information theory; for example, 256-QAM modulation is best understood in terms of phase space. Almost everything dealing with ordinary fluids or chemicals requires counting states in phase space. Sometimes this can be swept under the rug, but it’s still there.
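As a minimal sketch of what counting states means, consider a spin system of the kind mentioned above: for N independent two-state spins with exactly n spins up, the microstate count W is a binomial coefficient, and the entropy (measured here in bits rather than in units of k) is just the logarithm of that count. The particular numbers below are arbitrary.

```python
from math import comb, log

def entropy_bits(N, n):
    """Entropy of N two-state spins with exactly n up:
    W = C(N, n) microstates, S = log2(W) bits."""
    return log(comb(N, n), 2)

print(entropy_bits(100, 50))   # ≈ 96.3 bits, near the 100-bit maximum
print(entropy_bits(100, 0))    # 0 bits: only one all-down microstate
```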

Phase space is well worth learning about. It is relevant to Liouville’s theorem, the fluctuation/dissipation theorem, the optical brightness theorem, the Heisenberg uncertainty principle, and the second law of thermodynamics. It even has application to computer science (symplectic integrators). There are even connections to cryptography (Feistel networks).

7.
You must appreciate the fact that not every vector field is the gradient of some potential. Many things that non-experts wish were gradients are not gradients. You must get your head around this before proceeding. Study Escher’s “Waterfall” as discussed in reference 4 until you understand that the water there has no well-defined height. Even more to the point, study the RHS of figure 8.4 until you understand that there is no well-defined height function, i.e. no well-defined Q as a function of state. See also section 8.2.

The term “inexact differential” is sometimes used in this connection, but that term is a misnomer, or at best a horribly misleading idiom. We prefer the term ungrady one-form. In any case, whenever you encounter a path-dependent integral, you must keep in mind that it is not a potential, i.e. not a function of state. See chapter 19 for more on this.
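Whether a given one-form is grady (i.e. the gradient of some potential) can be checked mechanically: a gradient must have equal mixed partial derivatives. The two fields in the sketch below are generic textbook examples, not anything specific from this document.

```python
import sympy as sp

x, y = sp.symbols('x y')

def is_grady(Fx, Fy):
    """A field (Fx, Fy) can be a gradient only if
    d(Fx)/dy == d(Fy)/dx (equal mixed partials)."""
    return sp.simplify(sp.diff(Fx, y) - sp.diff(Fy, x)) == 0

# Grady: (2x, 2y) is the gradient of the potential x**2 + y**2.
print(is_grady(2*x, 2*y))    # True

# Ungrady: (-y, x) circulates; no potential has it as a gradient,
# so its path integral around a loop is nonzero.
print(is_grady(-y, x))       # False
```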

To say the same thing another way, we will not express the first law as dE = dW + dQ or anything like that, even though it is traditional in some quarters to do so. For starters, although such an equation may be meaningful within the narrow context of cramped thermodynamics, it is provably not meaningful for uncramped thermodynamics, as discussed in section 8.2 and chapter 19. It is provably impossible for there to be any W and/or Q that satisfy such an equation when thermodynamic cycles are involved.

Even in cramped situations where it might be possible to split E (and/or dE) into a thermal part and a non-thermal part, it is often unnecessary to do so. Often it works just as well (or better!) to use the unsplit energy, making a direct appeal to the conservation law, equation 1.1.

8.
Almost every newcomer to the field tries to apply ideas of “thermal energy” or “heat content” to uncramped situations. It always almost works ... but it never really works. See chapter 19 for more on this.

9.
On the basis of history and etymology, you might think thermodynamics is all about heat, but it’s not. Not anymore. By way of analogy, there was a time when what we now call thermodynamics was all about phlogiston, but it’s not anymore. People wised up. They discovered that one old, imprecise idea (phlogiston) could be and should be replaced by two new, precise ideas (oxygen and energy). More recently, it has been discovered that one old, imprecise idea (heat) can be and should be replaced by two new, precise ideas (energy and entropy).

Heat remains central to unsophisticated cramped thermodynamics, but the modern approach to uncramped thermodynamics focuses more on energy and entropy. Energy and entropy are always well defined, even in cases where heat is not.

The idea of entropy is useful in a wide range of situations, some of which do not involve heat or temperature. As shown in figure 0.1, mechanics involves energy, information theory involves entropy, and thermodynamics involves both energy and entropy.

You can’t do thermodynamics without energy and entropy.

There are multiple mutually-inconsistent definitions of “heat” that are widely used – or you might say wildly used – as discussed in section 17.1. (This is markedly different from the situation with, say, entropy, where there is really only one idea, even if this one idea has multiple corollaries and applications.) There is no consensus as to “the” definition of heat, and no prospect of achieving consensus anytime soon. There is no need to achieve consensus about “heat”, because we already have consensus about entropy and energy, and that suffices quite nicely. Asking students to recite “the” definition of heat is worse than useless; it rewards rote regurgitation and punishes actual understanding of the subject.

10.
Our thermodynamics applies to systems of any size, large or small ... not just large systems. This is important, because we don’t want the existence of small systems to create exceptions to the fundamental laws. When we talk about the entropy of a single spin, we are necessarily thinking in terms of an ensemble of systems, identically prepared, with one spin per system. The fact that the ensemble is large does not mean that the system itself is large.

11.
Our thermodynamics is not restricted to the study of ideal gases. Real thermodynamics has a vastly wider range of applicability, as discussed in chapter 22.

12.
Even in special situations where the notion of “thermal energy” is well defined, we do not pretend that all thermal energy is kinetic; we recognize that random potential energy is important also. See section 9.3.3.

1
This is an elaboration of the jocular laws attributed to C.P. Snow. I haven’t been able to find a more precise citation.
