
I was trying to understand what entropy means in the context of classical mechanics, but unfortunately I'm now more confused than when I started. Reading, for example, the Wikipedia article on the Second Law of Thermodynamics, or the wonderful answer to this question (Is a world with constant/decreasing entropy theoretically impossible?), I can't decide whether said law is

1) an empirical law ("we've never seen heat move from a cold body to a warm one.")

2) a soft statistical law ("in a Hamiltonian system, if you sample the initial conditions from a distribution satisfying properties $X$, $Y$, and $Z$, and measure entropy as a function of time using formula $W$, then entropy will increase over time with high probability in the technical sense; in particular, it is not guaranteed to increase.") (one candidate for formula $W$ is written out below this list)

3) a hard law ("there is a function $S(\mathbf{q},\mathbf{p})$ on phase space which is guaranteed to increase monotonically for any Hamiltonian system and any initial conditions.")
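For concreteness (and as far as I understand, this is just one common choice, not the only one), a candidate for "formula $W$" in (2) would be the Gibbs entropy of the ensemble density $\rho(\mathbf{q},\mathbf{p},t)$,

$$ S[\rho] \;=\; -k_B \int \rho \ln \rho \,\;\mathrm{d}^{N}\!q\,\mathrm{d}^{N}\!p, $$

or a coarse-grained version of it in which $\rho$ is first averaged over small phase-space cells.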

My impression is that physicists consider the correct interpretation to be (3): that in any system some quantity called "entropy" is guaranteed to increase, not merely with high probability. But I cannot reconcile this interpretation with what I know about classical mechanics, in particular that a) Hamiltonian systems are time-reversible, and b) their flow preserves the symplectic form.

In another question that I found when searching this site (Is there any proof for the 2nd law of thermodynamics?), several resolutions of (a) were proposed in different answers, including that the Second Law only holds for unbounded systems, or that the Second Law relies on the incorrect assumption that past states of a Hamiltonian system are uncorrelated (??).

But (b) seems even more concerning to me, since preservation of the symplectic form implies, in a precise sense, what I would think is the opposite of the Second Law: if you sample initial conditions in phase space and evolve them in time, they will not "spread out" in volume, thanks to the symplecticity of the flow (and hence volume preservation, by Liouville's theorem).
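To make (b) concrete, here is a minimal numerical sketch (my own toy example: a pendulum with all constants set to 1, and function names of my own choosing; nothing here comes from the linked questions). The determinant of the Jacobian of the time-$T$ flow map stays equal to 1, i.e. phase-space area is preserved no matter how long you evolve:

```python
# Numerical check of phase-space area preservation (Liouville) for a pendulum,
# H(q, p) = p**2 / 2 - cos(q)   (toy example, all constants set to 1).
import numpy as np

def leapfrog(q, p, dt=1e-3, steps=5000):
    """Symplectic (kick-drift-kick leapfrog) integration of Hamilton's equations."""
    p = p - 0.5 * dt * np.sin(q)       # dp/dt = -dH/dq = -sin(q)
    for _ in range(steps - 1):
        q = q + dt * p                 # dq/dt =  dH/dp = p
        p = p - dt * np.sin(q)
    q = q + dt * p
    p = p - 0.5 * dt * np.sin(q)
    return q, p

def flow_jacobian_det(q0, p0, eps=1e-6):
    """Estimate det of the Jacobian of the time-T flow map by finite differences."""
    base = np.array(leapfrog(q0, p0))
    dq = (np.array(leapfrog(q0 + eps, p0)) - base) / eps   # derivative w.r.t. q0
    dp = (np.array(leapfrog(q0, p0 + eps)) - base) / eps   # derivative w.r.t. p0
    J = np.column_stack([dq, dp])
    return np.linalg.det(J)

for q0, p0 in [(0.1, 0.0), (1.0, 0.5), (2.5, 0.2)]:
    # All of these come out very close to 1: the flow preserves phase-space area.
    print(q0, p0, flow_jacobian_det(q0, p0))
```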

What gives? It certainly seems intuitive that some form of the Second Law holds for classical systems (if I cluster particles together in a box and assign them random velocities, they will spread out and, with high probability, will not cluster together again), but how does one define entropy and the Second Law rigorously in this setting?
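For what it's worth, here is a minimal sketch of that "particles in a box" intuition in the sense of interpretation (2) (non-interacting particles, a fixed coarse-graining grid; all names and parameters are my own arbitrary choices). The dynamics is perfectly time-reversible, yet the coarse-grained entropy of the histogrammed positions rises and then plateaus:

```python
# Coarse-grained entropy of non-interacting particles in a box (toy illustration).
import numpy as np

rng = np.random.default_rng(0)
N, L, nbins, dt = 20000, 1.0, 20, 0.01

# Start with all particles clustered in a small corner, with random velocities.
x = rng.uniform(0.0, 0.1, size=(N, 2))
v = rng.normal(0.0, 1.0, size=(N, 2))

def coarse_entropy(x):
    """Shannon entropy of the coarse-grained (histogrammed) position distribution."""
    counts, _, _ = np.histogram2d(x[:, 0], x[:, 1], bins=nbins, range=[[0, L], [0, L]])
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for step in range(301):
    if step % 50 == 0:
        print(f"t = {step*dt:.1f}  S_coarse = {coarse_entropy(x):.3f}")
    x += v * dt
    # Elastic, specular reflection off the walls of the box [0, L]^2.
    for d in range(2):
        out_lo = x[:, d] < 0
        out_hi = x[:, d] > L
        x[out_lo, d] *= -1.0
        v[out_lo, d] *= -1.0
        x[out_hi, d] = 2 * L - x[out_hi, d]
        v[out_hi, d] *= -1.0
```

The coarse-grained entropy climbs from roughly $\log 4$ (a few occupied cells) toward roughly $\log 400$ (all cells occupied), even though each trajectory is reversible, which is exactly the tension I am asking about.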

Qmechanic
user2617
  • You could read Jaynes's papers, after some of Shannon's information theory. –  Nov 09 '17 at 14:44
  • Closely related: https://physics.stackexchange.com/q/81465/226902 and https://physics.stackexchange.com/q/10690/226902 – Quillo Sep 30 '23 at 11:37

1 Answer


The second law can be scrutinised in various frameworks. One such framework is classical thermodynamics, and in this framework the second law is probably an empirical law. (Right? I'm not an authority on Clausius-era thermodynamics.)

But we have modern, more detailed models of physical systems (statistical mechanics, classical mechanics). In such frameworks we can model a system as an initial probability distribution over phase space that is time-evolved by a Hamiltonian. Here we start running into some issues. One can show, using Liouville's theorem, that the (Gibbs) entropy of such a distribution remains constant under Hamiltonian evolution. This does not exactly violate the second law (which says that entropy does not decrease), but it isn't really satisfying either.

Note, however, what kind of system is described by Hamiltonian evolution. First, the system must be completely isolated from its environment (if not, the Hamiltonian will include terms with some uncertainty, which will increase the entropy). Second, we have to know the exact form of the Hamiltonian, and since most Hamiltonians depend on the state of the system itself, we have to know that state to infinite precision. This is clearly impossible (except perhaps for the simplest quantum systems). Thus, for realistic Hamiltonians, where the phase-space trajectory of any point carries some uncertainty, the entropy of a system does strictly increase (until its maximum is reached).
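As a rough illustration (my own toy construction, not a rigorous statement): evolve a Gaussian cloud in phase space under a harmonic oscillator. If the frequency is known exactly, a Gaussian entropy estimate of the cloud stays constant, consistent with Liouville's theorem; if the frequency is only known to about 1%, the ensemble spreads out in phase and the estimated entropy grows.

```python
# Toy construction: a Gaussian cloud under a harmonic oscillator,
# with an exactly known vs. slightly uncertain Hamiltonian.
import numpy as np

rng = np.random.default_rng(1)
N = 50000
q0 = rng.normal(1.0, 0.05, N)
p0 = rng.normal(0.0, 0.05, N)
omegas = rng.normal(1.0, 0.01, N)   # 1% uncertainty in the oscillator frequency

def gaussian_entropy(q, p):
    """Differential entropy of a Gaussian fit to the cloud: 0.5*log((2*pi*e)^2 det(Cov))."""
    cov = np.cov(np.vstack([q, p]))
    return 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))

def evolve(q, p, omega, t):
    """Exact harmonic-oscillator flow with frequency omega for time t."""
    c, s = np.cos(omega * t), np.sin(omega * t)
    return c * q + (s / omega) * p, -omega * s * q + c * p

for t in [0.0, 5.0, 20.0, 80.0]:
    q_a, p_a = evolve(q0, p0, 1.0, t)      # exactly known Hamiltonian
    q_b, p_b = evolve(q0, p0, omegas, t)   # uncertain Hamiltonian: one omega per member
    print(f"t={t:5.1f}  S_exact={gaussian_entropy(q_a, p_a):.3f}  "
          f"S_uncertain={gaussian_entropy(q_b, p_b):.3f}")
```

(The Gaussian formula is only a crude proxy once the uncertain-frequency cloud becomes non-Gaussian; the point is that for any single, exactly known frequency the entropy stays put, while marginalising over an imperfectly known Hamiltonian makes it grow.)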

Then you might say that this doesn't make sense, because it only has to do with the observer's inability to know the system, and that you want to know what the intrinsic entropy of the system does. To which I would respond that there is no such thing as intrinsic entropy, and that the entropy of a system is always expressed in terms of correlations between physical systems (such as a system and an observer). But note that there is no consensus in the field about such ideas.

I hope this added some clarity to your question.