8

Entropy is quite often introduced with the help of a deck of cards. Because there are so many cards, and thus an enormous number of microstates (orderings of the cards), if we assume equipartition of "energy" (equal distribution of the cards among the available spots in the ordering), then the number of microstates belonging to an "ordered" macrostate (e.g. one in which the cards are perfectly ordered within their respective suits) is vanishingly small compared to the total number of orderings, 52!.
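
For concreteness, a quick count (a minimal Python sketch; "ordered" is taken here to mean that each suit appears in its natural internal order, which is one possible reading):

```python
import math

total = math.factorial(52)  # all 52! orderings of the deck

# Orderings in which each suit's 13 cards appear in their natural
# internal order: 52! / (13!)^4, a multinomial coefficient.
ordered = total // math.factorial(13) ** 4

print(f"total orderings:     {total:.3e}")
print(f"suit-ordered ones:   {ordered:.3e}")
print(f"fraction:            {ordered / total:.3e}")  # ~ 6.7e-40
```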

This got me thinking though: is it possible to derive a sort of "temperature" for a deck of cards?

Let's say that the number of cards functions as a sort of "energy" (number of cards is conserved, after all). If we have two different decks

  1. with 2 cards
  2. with 9 cards

then it is possible to establish a "temperature":

If we change the energy by one card in each deck, then the entropy change is:

  1. $\log(3!) - \log(2!) = \log(3!/2!) = \log(3)$
  2. $\log(10!) - \log(9!) = \log(10!/9!) = \log(10)$

Temperature is defined as the change in energy over the change in entropy -- so whatever unit we choose for the energy, the "temperatures" will be proportional to:

  1. $1/\log(3)$
  2. $1/\log(10)$

And thus the larger deck is at a "lower temperature" than the smaller one.
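
A quick numerical check of these numbers (a minimal Python sketch; entropy in nats, "energy" counted in cards):

```python
import math

def deck_temperature(n):
    """The "temperature" of an n-card deck as defined above:
    dE = 1 card, dS = ln((n + 1)!) - ln(n!) = ln(n + 1)."""
    return 1 / math.log(n + 1)

for n in (2, 9):
    print(f"{n}-card deck: dS = ln({n + 1}) = {math.log(n + 1):.3f}, "
          f"'T' ~ {deck_temperature(n):.3f}")
```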


However, this is sort of at odds with my understanding of the connection between energy, entropy, and temperature.

For instance, let's consider two boxes of gas instead.

If we put the same amount of heat energy into the two different boxes of gas, the "lower temperature" box will be the one for which that heat increases the entropy more. I.e. the increase in the number of microstates made available to the gas by the added heat will be larger for the cooler box than for the hotter one, given the same heat -- if a large number of microstates is already available (the gas is "hot"), then the additional energy will not change the number of microstates as much as it would if fewer microstates were available to begin with (the gas was "cold").

The deck of cards seems to behave in the opposite way: the introduction of the same amount of "energy" to each deck increases the entropy more for the larger deck than for the smaller one, even though the larger deck has more microstates available to begin with.


Where is the flaw in this analogy?

Is the "energy" not properly defined?

Is the temperature improperly defined?

Am I wrong in my way of thinking about how to interpret the temperature of a box of gas?

Where does the analogy break down?

Qmechanic
  • 201,751
D. W.
  • 1,175
  • Should I really be thinking about the relative increase in the entropy? I.e. for the first deck the relative change is log(3)/log(2!)=1.58, and for the second log(10)/log(9!)=0.18, and so the temperature (one over the relative difference) of the second deck is actually higher? – D. W. Oct 06 '19 at 07:26
  • Relevant post: https://physics.stackexchange.com/questions/231017 – DanielSank Jun 08 '20 at 01:42

2 Answers

3

The analogy is not well justified. In chemical systems, temperature can be introduced as $$T = \left(\frac{\partial U}{\partial S}\right)_{V,N}.$$ The constant specifiers are important. You can't introduce additional constraints like $U = \text{const}\cdot N$. And if energy were simply proportional to $N$, independent (explicitly) of $S$ and $V$, the above formula would give zero.

A situation analogous to yours in physics would be an ensemble of ordered particles, each of which can only occupy a single level of energy $ε$. The only source of entropy is the combinatorics of the ordering. The only way the system can gain energy is by acquiring new particles. If it is closed, it has no way of exchanging heat with its surroundings, either. There is no notion of temperature in such systems. They are, by definition, not hotter or colder than anything around them.

UPDATE based on comments.

OK, let us abstract away from the "number of cards = number of particles" picture I had implicitly assumed, and consider some imaginary system that can absorb energy in multiples of some quantum $ε$ and has $(U/ε)!$ microstates representing a macrostate of energy $U$.

This looks simple enough to analyze so in order to see how it behaves in thermal contact with another body, let us make an explicit calculation.

Consider a quantum-mechanical harmonic oscillator tuned such that $ħω = ε$, neglecting the zero-point energy. Let the compound system have, say, a total energy budget of 5 quanta. In a microcanonical situation, the following states all have the same probability:

  1. oscillator at $5ħω$, deck of cards empty,
  2. oscillator at $4ħω$, deck of cards has one card,
  3. oscillator at $3ħω$, deck of cards has two, ordered AB or BA (2 distinct states),
  4. oscillator at $2ħω$, deck of cards has three: ABC, ACB, BAC, BCA, CAB, CBA (6 distinct states),
  5. oscillator at $1ħω$, deck of cards has 24 distinct states,
  6. oscillator at $0ħω$, deck of cards has 120 distinct states.

This is a total of 154 microstates of equal energy that, by assumption, the system can transition between freely. Statistically, they all have equal probabilities in the long-term limit. Clearly, in most cases the oscillator is at zero, and nonzero energies have decreasing probabilities. Taking the partial totals: $$p_0 = 78\%,\ p_1 = 16\%,\ p_2 = 4\%,\ p_3 = 1\%,\ p_4 = p_5 < 1\%.$$
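
These counts are easy to reproduce by brute force (a minimal Python sketch; `N_QUANTA` is the total budget of 5 quanta from the example above):

```python
import math

N_QUANTA = 5  # total energy budget shared by oscillator and deck

# n quanta in the oscillator leave N - n cards in the deck,
# contributing (N - n)! equally likely orderings.
counts = [math.factorial(N_QUANTA - n) for n in range(N_QUANTA + 1)]
total = sum(counts)  # 120 + 24 + 6 + 2 + 1 + 1 = 154

for n, c in enumerate(counts):
    print(f"oscillator at {n}ħω: {c:3d} microstates, p_{n} = {c / total:.1%}")
```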

From the perspective of the oscillator, this looks somewhat like a thermal distribution at some temperature. (It would work better if I had many of them, but for this little demonstration this will suffice.) Let's see what happens if we pour some more energy into the system. Generalizing to $N$ total excitations:

The probability of $n$ excitations in the harmonic oscillator and $N-n$ cards in the deck is $$p_n = \frac{(N-n)!}{\sum_{k=0}^{N} k!}.$$

For $N\gg1$, the largest factorial by far dominates all the other terms in the sum combined, so let's simplify to $p_n \approx (N-n)! / N!$. Also, using Stirling's formula:

$$p_n \approx (N-n)! / N! \approx \left(\frac{N-n}{e}\right)^{N-n} \left(\frac{e}{N}\right)^N = \left(1 - \frac{n}{N}\right)^N \left(\frac{e}{N-n}\right)^n $$

In the first factor we recognize an approximant of the exponential ($n \ll N$):

$$p_n \approx e^{-n} \left(\frac{e}{N-n}\right)^n = \frac1{(N-n)^n} \stackrel{n \ll N}{\approx} \frac1{N^n} $$

Recall that the Boltzmann distribution of the harmonic oscillator alone at temperature $T$ gives

$$p_n = (1 - e^{-ε/(kT)}) e^{-nε/(kT)}$$

and for $kT \ll ε$,

$$p_n \approx e^{-nε/(kT)} = (e^{-ε/(kT)})^n = \frac1{(e^{ε/(kT)})^n}.$$

Comparing the two, we see that for large $N$, the harmonic oscillator thermalizes at temperature $T$ for which $e^{ε/(kT)} = N$, which is

$$T = \frac{ε}{k \ln N}.$$

This would indeed confirm that in our model system the temperature is a decreasing function of the total energy $U = Nε$, or the total energy a decreasing function of temperature, as opposed to the usual behaviour. If we add some energy quanta to the system, they would rather find their way into the deck than into the oscillator, and even drag some additional energy out of the latter along the way, cooling it down.
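
This limit is easy to check numerically (a minimal Python sketch, in units where $ε = k = 1$; the effective temperature read off each level ratio $p_{n+1}/p_n = e^{-ε/(kT)}$ should be close to $1/\ln N$):

```python
import math

def exact_p(n, N):
    """Exact microcanonical probability of n quanta in the oscillator
    when the total budget is N quanta (the deck holds N - n cards)."""
    return math.factorial(N - n) / sum(math.factorial(k) for k in range(N + 1))

N = 50  # total number of quanta, assumed large
print(f"predicted: T = 1 / ln N = {1 / math.log(N):.4f}")

# Temperature inferred from each level ratio p_{n+1} / p_n = exp(-1/T):
for n in range(4):
    r = exact_p(n + 1, N) / exact_p(n, N)
    print(f"p{n + 1}/p{n} = 1/{1 / r:.0f} -> T = {-1 / math.log(r):.4f}")
```

The first few level ratios give nearly the same temperature, confirming that for $n \ll N$ the oscillator sees an approximately thermal distribution.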

Because this is an equilibrium, the same temperature would be ascribed to the deck of cards. Of course, the $N$ quanta are split between the two subsystems. But for all practical purposes, $n \approx 0$ and $N-n \approx N$, so we can actually use this formula unchanged.

The explanation of this strange result is the superexponential explosion of the number of microstates of the deck of cards with energy. It is normal for this number (and its logarithm, entropy) to be an increasing function of energy (in systems allowing saturation, entropy can start decreasing at some point; these then allow for negative temperatures), but in all common situations entropy stays concave. What is unusual about our "deck of cards" system is that its entropy is a convex function of energy. This also means a negative heat capacity, which brings in further paradoxes.
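
The convexity is easy to verify directly: the slope of $S(U) = \ln((U/ε)!)$ grows by $\ln(m+1)$ with each added quantum (a minimal Python check):

```python
import math

# S(U) = ln((U/ε)!) at U = m*ε; math.lgamma(m + 1) = ln(m!)
S = [math.lgamma(m + 1) for m in range(12)]
dS = [b - a for a, b in zip(S, S[1:])]  # per-quantum slope: ln(m + 1)
print([f"{x:.2f}" for x in dS])  # strictly increasing -> S is convex in U

# An increasing dS/dU means T = dU/dS decreases as U grows,
# i.e. the heat capacity dU/dT is negative.
```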

So, to answer your questions: there is no flaw in your logic; it is just a rather unusual model with very surprising properties. There may be something preventing it from being physical; I haven't tried to answer that.

The Vee
  • 1,317
  • Brilliant, thank you! – D. W. Oct 07 '19 at 09:39
  • 1
    I am not totally convinced by this answer. You have considered a system with 2 conserved quantities and then said that in moving to this system with only 1 conserved quantity the correct analogy to draw is to a system which only has a particle number, rather than one which only has an energy. What statistical property distinguishes an energy-like quantity from a particle-number-like one? How does this enter into the formalism of statistical mechanics? Put another way, if we replace temperature with chemical potential everywhere in the question, don't we hit the exact same problem? – By Symmetry Oct 07 '19 at 14:26
  • For a true physical system, like the one you mention in your last paragraph, the only difference I can see is that if I place it in contact with another system I can look at what flows in and out, because energy and particle number are not ultimately the same quantity; but I cannot see the distinction for a system that seemingly should obey a similar formalism, yet inherently cannot be coupled in this way. – By Symmetry Oct 07 '19 at 14:30
  • There is one thing in your explanation that throws me off: isn't it possible to define a temperature for a system at constant volume and constant number of particles, even when the system has an energy which depends on the number of particles -- for instance, an ideal gas? An ideal gas has an energy which can depend linearly on the number of particles (at constant temperature), and its temperature can be calculated by way of the entropy (via the classical density of states). Wouldn't your comment imply that an ideal gas doesn't have a temperature? – D. W. Oct 07 '19 at 18:59
  • @BySymmetry I assumed that the number of cards was the number of particles and they brought their energy with them. Surely, one could then ask about the chemical potential and run into similar questions. I let go of that assumption and tried a more concrete approach. – The Vee Oct 08 '19 at 09:11
  • @D.W. Well, the principal difference with ideal gas is that at constant $N$ it can still hold various energies, which then give rise to the Lagrange dual, temperature. I somehow assumed that number of cards was an actual number of particles (driven by trying to map it to some practical physical analogy) and that the energy was only a function of $N$. Please see my extended answer considering what's probably closer to what you had in mind. – The Vee Oct 08 '19 at 09:17
1

Despite the question already having an accepted answer, I think I've come up with an alternative way of associating a "temperature" with a deck of cards. Assume that, instead of suits and the usual labels, the cards are just numbered $1$ through $n$. To each microstate of the deck (i.e. its ordering), we assign as energy the number of inversions in this microstate (i.e. the number of pairs $i$, $j$, $i<j$, such that the label on the $i$'th card is greater than the label on the $j$'th card). A perfectly ordered deck $(1, 2, \ldots, n)$ then has zero energy, and a reverse-ordered deck $(n, \ldots, 1)$ has energy $n(n-1)/2$.

Then we populate the states according to the Gibbs distribution and proceed as usual for a discrete system. In this setup, $T = 0$ corresponds to a fully ordered deck, and $T = \infty$ corresponds to a completely mixed deck (all orderings are equally likely). Essentially, such a system is just a discrete system with weird degeneracies of energy levels, so it should be well defined.
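
For a small deck, all of this can be checked by brute force; here is a minimal Python sketch (it enumerates all $n!$ orderings, so only tiny decks are feasible; $n = 5$ below, with $k = 1$):

```python
import math
from itertools import permutations

def inversions(perm):
    """Energy of an ordering: the number of pairs i < j with perm[i] > perm[j]."""
    n = len(perm)
    return sum(perm[i] > perm[j] for i in range(n) for j in range(i + 1, n))

def deck_stats(n, T):
    """Mean energy and Gibbs entropy of an n-card deck whose orderings
    are populated with Boltzmann weights exp(-E/T)."""
    energies = [inversions(p) for p in permutations(range(n))]
    Z = sum(math.exp(-E / T) for E in energies)
    probs = [math.exp(-E / T) / Z for E in energies]
    mean_E = sum(p * E for p, E in zip(probs, energies))
    entropy = -sum(p * math.log(p) for p in probs)
    return mean_E, entropy

for T in (0.1, 1.0, 10.0, 100.0):
    E, S = deck_stats(5, T)
    print(f"T = {T:5.1f}: <E> = {E:6.3f}, S = {S:.3f}")
print(f"S_max = ln 5! = {math.log(math.factorial(5)):.3f}")
```

As $T \to 0$ the mean energy and entropy vanish (the ordered deck dominates), and as $T \to \infty$ the entropy approaches $\ln n!$ (all orderings equally likely), matching the two limits above.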

I wasn't yet able to determine the properties of the entropy with respect to the number of cards $n$.