I was trying to understand what entropy means in the context of classical mechanics, but unfortunately I'm now more confused than when I started. After reading, for example, the Wikipedia article on the Second Law of Thermodynamics, and the wonderful answer to this question (Is a world with constant/decreasing entropy theoretically impossible?), I can't decide whether said law is
1) an empirical law ("we've never seen heat flow spontaneously from a cold body to a warm one.")
2) a soft statistical law ("in a Hamiltonian system, if you sample the initial conditions from a distribution satisfying properties $X$, $Y$, and $Z$, and measure entropy as a function of time using some formula $W$, then entropy will increase over time with high probability, in the technical sense; in particular, it is not guaranteed to increase"); one common candidate for $W$ is sketched just after this list.
3) a hard law ("there is a function $S(\mathbf{q},\mathbf{p})$ on phase space which is guaranteed to increase monotonically for any Hamiltonian system and any initial conditions.")
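For concreteness, the candidate I had in mind for the formula $W$ in (2) is the Gibbs entropy of the phase-space density $\rho$ from which the initial conditions are sampled (this is just one common choice, not the only one):

$$S[\rho(t)] = -k_B \int \rho(\mathbf{q},\mathbf{p},t)\,\ln\rho(\mathbf{q},\mathbf{p},t)\,\mathrm{d}^n q\,\mathrm{d}^n p,$$

where $\rho$ is evolved under the Hamiltonian flow.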
My impression is that physicists consider the correct interpretation to be (3): that in any system some quantity called "entropy" is guaranteed to increase, not merely with high probability. But I cannot reconcile this interpretation with what I know about classical mechanics, in particular that (a) Hamiltonian systems are time-reversible, and (b) their flow preserves the symplectic form.
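To spell out why (a) seems to rule out (3), if I understand the usual Loschmidt-type argument correctly: for a time-reversal-invariant Hamiltonian, if $(\mathbf{q}(t),\mathbf{p}(t))$ is a solution, then so is the reversed trajectory

$$\big(\tilde{\mathbf{q}}(t),\tilde{\mathbf{p}}(t)\big) = \big(\mathbf{q}(-t),-\mathbf{p}(-t)\big),$$

so if $S$ increased strictly along the first trajectory, it would (assuming $S$ is even in the momenta) have to decrease along the second, and no function as in (3) could exist.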
In another question that I found when searching this site (Is there any proof for the 2nd law of thermodynamics?), several explanations for (a) were proposed in different answers, including that the Second Law only holds for unbounded systems, or that it relies on the (surely incorrect?) assumption that past states of a Hamiltonian system are uncorrelated.
But (b) seems even more concerning to me, since conservation of the symplectic form implies, in a precise sense, what I would have thought is the opposite of the Second Law: if you sample initial conditions in phase space and evolve them over time, the region they occupy will not "spread out" in volume, because a symplectic flow is in particular volume-preserving (Liouville's theorem).
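To state (b) precisely: Liouville's theorem says the density is constant along trajectories,

$$\frac{\partial\rho}{\partial t} = \{H,\rho\} \quad\Longrightarrow\quad \frac{\mathrm{d}}{\mathrm{d}t}\,S[\rho(t)] = 0,$$

so (unless I am miscalculating) the Gibbs entropy defined above is exactly conserved by any Hamiltonian flow, leaving it no room to increase at all.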
What gives? It certainly seems intuitive that some form of the Second Law holds for classical systems (if I cluster particles together in a box and assign them random velocities, they will spread out and, with high probability, will not cluster together again), but how does one rigorously define entropy and the Second Law in this setting?
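For reference, here is a minimal numerical sketch of the intuition in the previous paragraph: non-interacting particles in a one-dimensional periodic box (my simplification of "a box"), starting clustered with random velocities, with entropy measured by binning positions into cells. All parameters are my own arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary parameters (my choices, for illustration only).
n_particles = 10_000
box_size = 1.0
n_bins = 20
dt, n_steps = 0.01, 201

# Start clustered near one end of the box, with random velocities.
q = rng.uniform(0.0, 0.1 * box_size, size=n_particles)
p = rng.normal(0.0, 1.0, size=n_particles)

def coarse_grained_entropy(q):
    """Shannon entropy of the binned (coarse-grained) position histogram."""
    counts, _ = np.histogram(q, bins=n_bins, range=(0.0, box_size))
    probs = counts[counts > 0] / counts.sum()  # drop empty bins to avoid log(0)
    return -np.sum(probs * np.log(probs))

for step in range(n_steps):
    if step % 50 == 0:
        print(f"t = {step * dt:.2f}  coarse-grained S = {coarse_grained_entropy(q):.3f}")
    q = (q + p * dt) % box_size  # free flight with periodic wrap-around
```

In this sketch the coarse-grained entropy climbs toward its maximum $\ln(20) \approx 3.0$, even though the underlying flow is Hamiltonian and volume-preserving; the increase evidently comes from the binning rather than from the flow itself, which is exactly the kind of thing I would like to see made rigorous.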