
Entropy is incredibly useful as a mathematical tool. But what does it actually mean? I understand that the Boltzmann entropy is defined by:

$$S=k\ln{\Omega}$$

where $\Omega$ is the multiplicity of the system, i.e. the number of microstates consistent with its macrostate. As pointed out in many other Q&A pairs on this site, this definition is mathematically convenient and simplifies calculations.
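To make the counting concrete, here is a toy sketch of my own (the two-state-spin system is just an illustration, not from any particular textbook): for $N$ independent spins with $M$ pointing up, $\Omega$ is the binomial coefficient $\binom{N}{M}$, and the formula above gives the entropy directly.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_spins, num_up):
    """S = k ln(Omega) for a macrostate of two-state spins.

    Omega = C(num_spins, num_up): the number of microstates sharing
    this macrostate. lgamma avoids overflow for large factorials.
    """
    ln_omega = (math.lgamma(num_spins + 1)
                - math.lgamma(num_up + 1)
                - math.lgamma(num_spins - num_up + 1))
    return k_B * ln_omega

# Multiplicity (and hence S) peaks at the even split:
for num_up in (0, 25, 50):
    print(f"N=100, M={num_up}: S = {boltzmann_entropy(100, num_up):.3e} J/K")
```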

But beyond being just a helpful mathematical tool, does entropy have a physical interpretation? I have heard a few explanations but they all seem to fall short. For example,

1) Entropy is a measure of disorder.
2) Entropy is a measure of heat flow.
3) Entropy is a measure of energy.

My problems with each of these I hope are quite reasonable.

1) "Disorder" is a completely ambiguous term. If I don't clean up my house for a year it will certainly be disordered but it has not been evolving exploring every possible macrostate and thus the entropy hasn't been increasing, rather it has stayed at zero.

2) Put a gas in a sealed box with an attached vacuum chamber. Remove the partition and allow the gas to fill both chambers. The entropy of the system has increased, and yet no heat flowed into or out of the system. (A quick calculation of this entropy increase is sketched after this list.)

3) The Boltzmann constant $k$ has units of $\mathrm{J/K}$, so entropy has units of energy per temperature, not of energy; the identification fails on unit analysis alone.
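For point 2, here is a minimal sketch of the free-expansion bookkeeping (my own illustration, assuming one mole of ideal gas doubling its volume): each molecule's accessible volume doubles, so $\Omega$ grows by a factor $2^N$ and $\Delta S = Nk\ln 2 > 0$, even though $Q = 0$ throughout.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

# One mole of ideal gas expands freely into an equal evacuated volume.
# Each molecule's accessible volume doubles, so Omega grows by 2^N,
# and Delta S = k ln(2^N) = N k ln 2 -- even though Q = 0 throughout.
N = N_A
delta_S = N * k_B * math.log(2)
print(f"Delta S = {delta_S:.2f} J/K")  # about 5.76 J/K, i.e. R ln 2
```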

What I'm looking for is one of two things: a concrete example of the physical interpretation of entropy, or a solid argument for why it has no physical interpretation and is a purely statistical phenomenon.

jkeuhlen

3 Answers


Consider an isolated system with a large number of degrees of freedom. An example could be a quantum computer that is able to compute the exact time evolution of a gas of $10^{23}$ molecules undergoing an ideal free expansion.

In such a system, time evolution is always unitary and thus reversible. Clearly the entropy of the quantum computer is equal to zero and stays equal to zero no matter what dissipative thermodynamic process it is simulating.

So there is no escape from the fact that the entropy of an incompletely specified system is just the amount of information needed to specify the exact physical state of such a system. In the case of the quantum computer, the given initial state unambiguously specifies the final state. You can still define the effective thermodynamic entropy of the gas after it has undergone the free expansion by considering the ensemble of all hypothetical systems that would have the same macrostate as the gas under consideration.

Clearly, then, $\Omega$ does not refer to the number of final states that could really have been the outcome of the free expansion of a perfectly isolated system (obviously, most of them would not evolve back into the initial state under time reversal). It is simply that, given only the final state, the macroscopic observables like the total volume, pressure, etc., measured to some specified accuracies, cannot pin down the exact physical state; there are then $\Omega$ microstates that would yield the same outcomes for these macroscopic observables (within the specified ranges).

If $\Omega$ is not the real number of possible physical states of the system, then how is it that in statistical physics we average over all $\Omega$ states and obtain correct answers? The reason is that the real physical states are randomly distributed among the $\Omega$ states, so the ensemble of $\Omega$ states is statistically representative of the far smaller set of real physical states.
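A quick numerical illustration of this last point (my own sketch; the 16-cell system and the choice of observable are arbitrary): averaging an observable over a small random subset of the $\Omega$ microstates gives nearly the same answer as averaging over all of them, precisely because the subset is randomly placed within the ensemble.

```python
import itertools
import random
import statistics

random.seed(0)

# Macrostate: 16 two-state cells with exactly 8 'up'. Enumerate all
# Omega microstates compatible with that single macroscopic datum.
N, M = 16, 8
microstates = [s for s in itertools.product((0, 1), repeat=N) if sum(s) == M]
print(f"Omega = {len(microstates)}")  # C(16, 8) = 12870

# Observable: fraction of 'up' cells in the left half of the system.
def left_fraction(state):
    return sum(state[:N // 2]) / (N // 2)

full_average = statistics.mean(left_fraction(s) for s in microstates)

# Pretend only 50 randomly chosen microstates are the 'real' physical
# states; their average barely differs from the full-ensemble average.
subset_average = statistics.mean(
    left_fraction(s) for s in random.sample(microstates, 50))
print(f"full ensemble: {full_average:.3f}  subsample: {subset_average:.3f}")
```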

Count Iblis

> 2) Put a gas in a sealed box with an attached vacuum chamber. Remove the partition and allow the gas to fill both chambers. The entropy of the system has increased and yet no heat flowed into/out of the system.

But heat has flowed from the box to the vacuum chamber! (Whatever heat is!)


A really helpful discussion of this question is given by Daniel F. Styer, "Insight into entropy," Am. J. Phys. 68, 1090–1096 (2000).

A (macro)state with high entropy is one that corresponds to many microstates, i.e. the system has many ways that it could configure itself on the microstate level to achieve the given macrostate. Styer gives some nice examples showing clearly why "disorder" is a bad metaphor, and suggests a better one might be "freedom," i.e. the freedom of a system to choose its microstate. In your example of the partitioned chamber, when the partition is removed, the number of microstates available to the system corresponding to thermodynamic equilibrium increases.

Note that the entropy of a single configuration (a snapshot of a system, giving its complete microstate) is not defined. You have to know what macrostate is represented by that snapshot, and how many other microstates correspond to the same macrostate. In that sense, entropy is "a pure statistical phenomenon" AND has "a physical interpretation."
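To illustrate (a sketch of my own, not from Styer's paper): the same snapshot acquires a different $\Omega$, and hence a different entropy, depending on which macroscopic variables you declare observable.

```python
import math
from itertools import product

# One specific snapshot (a complete microstate) of 12 two-state cells:
snapshot = (1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0)
N = len(snapshot)
all_states = list(product((0, 1), repeat=N))

# Macrostate A: only the total number of 1s is observed (coarse).
omega_A = sum(1 for s in all_states if sum(s) == sum(snapshot))

# Macrostate B: the count of 1s in each half is observed (finer).
def halves(s):
    return (sum(s[:N // 2]), sum(s[N // 2:]))

omega_B = sum(1 for s in all_states if halves(s) == halves(snapshot))

# Same snapshot, different Omega -- entropy depends on the macrostate.
print(f"coarse macrostate: Omega = {omega_A}, ln Omega = {math.log(omega_A):.2f}")
print(f"finer macrostate:  Omega = {omega_B}, ln Omega = {math.log(omega_B):.2f}")
```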

As for the units, it turns out that for reversible thermodynamic processes your definition of entropy is equivalent to $dS = \dfrac{\delta Q_{\text{rev}}}{T}$, i.e. heat flow into a system divided by absolute temperature (think of heat flowing into ice and melting it at a fixed temperature, where the entropy increases). So the dimensions of [energy / temperature] are correct. In fact, this relation can be used to define a system's temperature in cases where it would otherwise be unclear. (That also explains why equating entropy to heat flow is not accurate; the two are proportional only in the limited context of reversible, isothermal processes.)
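Putting numbers on the ice example (a sketch of my own, using the standard latent heat of fusion for water):

```python
# Entropy change for melting 100 g of ice at 0 C, via dS = dQ_rev / T.
m = 0.100      # kg of ice
L_f = 3.34e5   # latent heat of fusion of water, J/kg (standard value)
T = 273.15     # melting point, K -- fixed throughout the process

Q = m * L_f          # heat absorbed, all of it at the same temperature
delta_S = Q / T      # reversible and isothermal, so Delta S = Q / T
print(f"Q = {Q:.0f} J, Delta S = {delta_S:.1f} J/K")  # ~122 J/K
```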

pwf