
The question might contain some misconceptions or sloppy intuition; sorry if that's the case (I'm not a physicist).

I seem to have the intuition that, given a system of $N$ charged particles in 3D space colliding elastically with each other (under the effect of gravitational and electrostatic forces), the evolution of this system is symmetric with respect to time reversal. In the sense that if I record a video of the evolution of this mechanical system and then play it backwards, the resulting video will look like something that can happen in our universe. If this intuition is correct, then it should be easy to prove mathematically from the uniqueness theorem for ordinary differential equations.

I also seem to have the idea that statistical mechanics is nothing but the situation described above with $N$ being very large (particles in a gas are moving under the effect of gravitational and van der Waals forces and nothing else, no?). Thus, I would expect the evolution of a thermodynamic system to be symmetric with respect to time reversal. However, this seems to contradict the second law of thermodynamics. Where did I go wrong?
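For concreteness, here is a small numerical sketch of the first intuition (the softened two-body gravity, the velocity-Verlet integrator, and all parameters are my own illustrative choices, not part of the question): run the system forward, flip all velocities, run it forward again, and it retraces its path.

```python
import numpy as np

def accelerations(pos, masses, G=1.0, eps=1e-3):
    """Pairwise softened gravitational accelerations."""
    n = len(masses)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = pos[j] - pos[i]
                d = np.sqrt(r @ r + eps**2)   # softening avoids the singularity
                acc[i] += G * masses[j] * r / d**3
    return acc

def verlet(pos, vel, masses, dt, steps):
    """Velocity-Verlet integration (itself time-reversal symmetric)."""
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel = vel + 0.5 * dt * acc
        pos = pos + dt * vel
        acc = accelerations(pos, masses)
        vel = vel + 0.5 * dt * acc
    return pos, vel

masses = np.array([1.0, 1.0])
pos0 = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vel0 = np.array([[0.0, 0.3, 0.0], [0.0, -0.3, 0.0]])

pos1, vel1 = verlet(pos0, vel0, masses, dt=1e-3, steps=2000)
# "Play the movie backwards": reverse the velocities and integrate forward again.
pos2, vel2 = verlet(pos1, -vel1, masses, dt=1e-3, steps=2000)

print(np.max(np.abs(pos2 - pos0)))  # tiny: the system retraces its path
```

Up to floating-point roundoff, `pos2` equals `pos0` and `vel2` equals `-vel0`, which is exactly the time-reversal symmetry the question describes.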


After seeing some of the responses to my question I wish to add the following:

I am NOT trying to refute the second law mathematically (lol :D). As you can see above, I don't provide any mathematical proofs. I specifically said "If my intuition is correct, then it should be easy to prove mathematically". That means I am skeptical about my own intuition because: 1) I don't back it up with a proof; 2) it is in contradiction with a well-established law such as the second law.

Glorfindel
Amr
  • The microscopic dynamics is time-reversible, but it's infeasible to study a macroscopic system using microscopic dynamics. Our description of a macroscopic system always involves some coarse-graining, and it is this coarse-graining that leads to irreversibility. – d_b Oct 03 '21 at 01:58
  • @d_b Do you mean that the time-irreversibility phenomenon arises due to the transition from small values of $N$ to large values of $N$? It sounds counterintuitive to me. If the system satisfies time-reversal symmetry locally everywhere (microscopically, like you say), then I would expect it to satisfy time-reversal symmetry globally (macroscopically), no? – Amr Oct 03 '21 at 02:03
  • No, that's not what I mean. If we could keep track of the full dynamics of all of the particles, then we would have reversibility even for arbitrarily large values of $N$. But we don't keep track of the full dynamics. At best we keep track of a coarse-grained or averaged version of the dynamics. It's not that the physical laws become time-irreversible for large $N$, it's that our description of the system — which we are forced into by practical necessity — becomes time-irreversible once we coarse-grain. – d_b Oct 03 '21 at 02:34
  • @d_b Aha. I think I understand the distinction you are pointing to now. I still find it counterintuitive/unexpected that this process of coarse-graining introduces an asymmetry that didn't exist before – Amr Oct 03 '21 at 02:37
  • You might enjoy Sean Carroll’s book on this subject, From Eternity To Here. – rob Oct 03 '21 at 13:34
  • Possibly a duplicate of this: https://physics.stackexchange.com/q/648449/247642 – Roger V. Oct 04 '21 at 07:00
  • Imagine you put your N particles in a small cube inside your 3D space, and then remove the walls and let the particles scatter. If you film the process and then play the video in reverse, would you still argue that a bunch of particles spontaneously assembling into a cube is something that might happen? – Dmitry Grigoryev Oct 04 '21 at 12:27
  • Is it worth pointing out that there is a sense in which the 2nd law is symmetric in time? If you pick a random configuration according to your favourite smooth measure on configuration space and evolve it forwards in time, then as time increases the entropy should not decrease (2nd law). Conversely, if you take the same state and run the dynamical equations in reverse, evolving the state backwards in time, then as time decreases the entropy also should not decrease. – ComptonScattering Oct 04 '21 at 22:19

13 Answers


The arrow of time in thermodynamics is statistical.

Suppose you have a deterministic system that maps from states that can have character $X$ or character $Y$, to other states that can have character $X$ or character $Y$. The system is such that, for a randomly selected state $X_n$ or $Y_n$, the probability that the system will map it uniquely and deterministically to a state with character $Y$ is $10^9$ times larger than the probability that the system will map it uniquely to a state with character $X$.

Then, given any state $X_n$ or $Y_n$ and the number of times $N$ we have iterated the system, we can run time backward by reversing the iteration of the system and get the corresponding past state, because each state is mapped uniquely and deterministically.

However, if we can only measure the character of the system, we might note that the system originated in a state with character $X$, and, after an unknown number of iterations of the system, it was in character $Y$.

We would correctly note that states with character $X$ always evolve into states with character $Y$ if you wait a while. We could call this the "X-to-Y law" and express it mathematically. If we start with a certain number $x_0$ of states with character $X$ and a number $y_0$ of states with character $Y$, then after $N$ iterations,

$$x = 10^{-9N}x_0, \qquad y = y_0 + x_0 - x.$$

However, there is no corresponding "Y-from-X law". If we don't know $N$ and $Y_n$ exactly, we can only speak statistically. And statistically, the chances are overwhelming that, given some state with character $Y$, the state at some previous iteration also had character $Y$. This means we can't reverse the direction of time in our mathematical expression of the "X-to-Y law".
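This asymmetry of inference can be sketched numerically. The toy below uses a much milder ratio than $10^9$ (a survival probability of $0.1$ per iteration) so that a finite sample shows the effect; the probability, ensemble size, and step count are all illustrative choices of mine.

```python
import random

random.seed(0)

P_STAY_X = 0.1       # probability an X-state maps to another X-state
ENSEMBLE = 100_000   # number of independent histories
STEPS = 20

# Forward: generate histories starting from character X.
histories = []
for _ in range(ENSEMBLE):
    chars = ['X']
    for _ in range(STEPS):
        if chars[-1] == 'X' and random.random() < P_STAY_X:
            chars.append('X')
        else:
            chars.append('Y')   # once Y, the character stays Y
    histories.append(chars)

# The "X-to-Y law": fraction still in character X decays geometrically.
x_frac = [sum(h[n] == 'X' for h in histories) / ENSEMBLE for n in range(STEPS + 1)]
print(x_frac[:4])   # ≈ [1.0, 0.1, 0.01, 0.001]

# Backward inference: among states that are Y at the final step, how often
# was the previous step also Y?  Almost always - the law cannot be inverted.
prev_was_Y = [h[STEPS - 1] == 'Y' for h in histories if h[STEPS] == 'Y']
print(sum(prev_was_Y) / len(prev_was_Y))   # very close to 1
```

Forward in time the coarse-grained law is sharp; backward in time, knowing only the character tells you essentially nothing except "it was probably already Y".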


A more plain language explanation:

Suppose you have an oxygen tank and a nitrogen tank in a room, with the same mass ratio as that of air. The room is assumed to always equalize with ambient pressure and temperature.

The 2nd law of thermodynamics says that, if you open the tanks and wait half an hour, the oxygen and nitrogen will all be out and the air will be exactly the same as it was before.

The time-reversed 2nd law of thermodynamics says that, any time you're in a room with normal air in it, somebody must have opened an oxygen and nitrogen tank half an hour ago.

g s
  • Thanks for your answer. I will need some time to digest it. By the way, how would you respond to this question:

    If a closed mechanical system evolves from state 1 to state 2, then it is also feasible that it starts from state 2 and ends at state 1.

    However, the above statement does not remain true if one replaces the words "mechanical system" by "thermodynamic system". How could that happen if a "thermodynamic system" is nothing but a mechanical system consisting of a really large number of particles?

    – Amr Oct 03 '21 at 02:31
  • If the only possible states are 1 and 2 and the system is deterministic, then knowing you are in state 2 always maps back to state 1. If you have limitless numbers of possible states, knowing you are in a state with 2-like character tells you nothing about the past. – g s Oct 03 '21 at 02:36
  • We can identify systems that have a thermodynamic flavor that have that kind of character, although they're not actually thermodynamic. For instance, if your system is a running internal combustion engine, and the piston is in the state "up" and you know the spark rate of the cylinder, you know that one spark timing ago, it was in state "down" and two spark timings ago it was in state "up" and so on. The up-ness and down-ness of the cylinder is time reversible. – g s Oct 03 '21 at 02:41
  • However, the exploded-ness of the fuel is not time reversible, because for every one possible way an explosion can un-explode, there are a near infinity of possible ways the explosive can explode. – g s Oct 03 '21 at 02:42
  • Hmmmm okay, but isn't a thermodynamic system (when precisely modelled) deterministic? I mean in the sense that coin tossing is deterministic if precisely modelled – Amr Oct 03 '21 at 02:42
  • I explicitly and repeatedly described the hypothetical system in my answer as deterministic. – g s Oct 03 '21 at 02:56
  • I think I may have understood your answer. You are saying that in a thermodynamic system one can go from state 1 to state 2 and vice versa provided states 1,2 contain all the info about the position and velocities of every particle in our system. In that sense of the word state, a thermodynamic system will exhibit a time reversal symmetry just like a mechanical system. However, in real life we don't have access to all this data but instead have access to macroscopic measurements like pressure, temperature, density,.... – Amr Oct 03 '21 at 03:16
    Obviously, these data are not a complete description of the thermodynamic system in the previous sense of the word state (the one about the position and velocity of every particle). Thus, given the macroscopic data of state 1 and state 2 of a thermodynamic system, it is extremely unlikely for an arbitrary thermodynamic system with the macroscopic data of state 2 to evolve to the macroscopic data of state 1. Did I get it right? – Amr Oct 03 '21 at 03:19
  • Yes, that's right. – g s Oct 03 '21 at 04:02
  • Good answer, but perhaps your plain language explanation can emphasise the statistical argument a bit more by saying that, when the tanks are open, it is in principle possible that all N molecules move for a while into the N tank just by chance, but there are much fewer states with this character than states in which the N molecules are spread over both tanks. So when you open the tank, you start with a character that's very unlikely and you end up with a character that is much more likely. – Stephan Matthiesen Oct 03 '21 at 09:41
  • @g.s. Actually, because of time reversal, there is one possible way it can unexplode for every possible way it can explode. – user253751 Oct 05 '21 at 13:32
  • "you're in a room" – I think this unintentionally implies the presence of an observer, and that the observer's presence is significant, which, reading it again, isn't really what you were saying at all – Rodney Oct 05 '21 at 15:47

A long comment.

Thermodynamics can be shown mathematically to be an emergent theory from statistical mechanics. Its laws are observational laws, deduced from measured variables, and they yield the equations that map and predict the behavior of temperature, pressure, etc.

Classical mechanics is deterministic: given the equations of motion of the individual particles, time can be reversed, i.e. the ensemble can be put into a time-reversed configuration. It is the great number of variables needed to describe the particles that makes the probability of such a time-reversed system occurring infinitesimally small, given the number of particles in a mole ($\sim6\times10^{23}$), and this leads to the second law.
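A back-of-envelope illustration of just how small that probability is (assuming, purely for illustration, independent molecules each equally likely to be found in either half of a box):

```python
import math

# Probability that all N independent molecules of a gas happen to sit in
# the left half of a box at once: (1/2)^N, i.e. 10^(-N log10 2).
log10_probs = {N: -N * math.log10(2) for N in (10, 100, 6.022e23)}
for N, lp in log10_probs.items():
    print(f"N = {N:g}:  P = 10^{lp:.3g}")
```

For ten molecules the fluctuation is merely rare ($\sim10^{-3}$); for a mole it is suppressed by a factor of roughly $10^{-1.8\times10^{23}}$, which is the practical content of "infinitesimally small".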

The successful mathematical emergence of the probabilistic thermodynamic theory from an underlying deterministic system is what keeps a number of theorists searching for a deterministic level from which the probabilistic quantum mechanics theory could emerge (not successfully up to now, but that is another story).

Buzz
anna v
  • "what keeps going a number of theorists searching for a deterministic level from which the probabilistic quantum mechanics theory can emerge", isn't that goal impossible by Bell's theorem? – Amr Oct 03 '21 at 18:24
  • @Amr For local theories, I believe. – anna v Oct 04 '21 at 05:43
  • Yes but... as I understood Zeh's book "The Physical Basis of The Direction of Time", Boltzmann's H-theorem cannot be derived without postulating additional time-asymmetric equations. – Harald Rieder Oct 05 '21 at 19:10

Yes, there absolutely is a well-known apparent (classical) contradiction here.

Classical mechanics is symmetric under $t \to - t$, i.e. for any given motion the reverse motion is also possible.

As you suggested, we can try to justify this with ODE uniqueness theorems, which amount to saying that the ODEs/PDEs we start from possess some geometric curve as their unique solution; but the point is that the particles travel along geometric curves that exist regardless of our use of differential equations to predict what those paths are.

Classical statistical mechanics is based on the existence of classical mechanics and is simply a tool to avoid the inherent impracticality of actually solving gigantic systems of coupled equations and imposing initial conditions.

On a fundamental level it is thus absolutely possible to reverse the microscopic behavior of the constituent particles of a closed system to which the law of entropy applies and so find the system moving in reverse.

But the law of increase of entropy just says [1] that, out of all the possible states of the closed macroscopic system, most of the states the closed system can evolve into will have their entropy increased; it doesn't say that ones with less entropy don't also exist - the 'reverse' closed system can still exist.

The big issue is that, if the laws of statistical mechanics are symmetric under $t \to - t$, then it would mean ([1], Sec. 7) not only that the most probable state a closed system can evolve into has a greater entropy, it also means that the state had to have arrived from a state with greater entropy too, which would mean entropy can decrease, contradicting the claim that entropy never decreases (apart from fluctuations) in closed systems.

There is simply no classical resolution to this so far.

That classical statistical mechanics has some issues with the notion of entropy is not surprising. We can only justify defining it as $S = \ln \Delta p \Delta q$ on a classical level, leading to well-known issues since it depends on the choice of units and changes by an additive constant on changing units, so all we can physically talk about is differences of entropy classically.

In quantum mechanics, things fundamentally change.

First of all, using quantum (statistical) mechanics, we can define entropy intrinsically without any nonsense about choices of units.

Furthermore, the very first argument we made about particles going along geometric curves is just completely gone - we simply cannot even argue physically about reversing the system and knowing what it will do as time goes backwards.

Although the equations of e.g. non-relativistic quantum mechanics can be interpreted as consistent if time goes backwards, what matters now is the measurement process.

The measurement process in (canonical) quantum mechanics is absolutely not symmetric with respect to time, as one can see for example in my summary of the quantum mechanical measurement process here.

Thus, the measurement process in quantum mechanics implies a physical in-equivalence of the two directions of time [1], a well-known fundamental asymmetry with respect to time, indeed the kind of asymmetry implied by the law of increase of entropy.

So the notion of entropy is not only more natural from a purely quantum mechanical perspective; even the law of increase of entropy and its asymmetry with respect to time seems to make more sense too.

Although it has not been proven [2], a suggestion [1] has been made that the law of increase of entropy is a macroscopic expression of the microscopic in-equivalence of directions of time in the quantum measurement process.

References:

  1. Landau and Lifshitz, "Statistical Physics", 3rd Ed.
  2. Sadovskii, "Statistical Physics", 1st Ed.
bolbteppa
  • Entropy is a subjective quantity in statistical mechanics. Because it is the expectation value of information you may get out of a part of the world, it should always be infinite. Every continuous variable like x or p encodes an infinite number of bits. To get a finite value you must introduce some coarse-graining. This coarse-graining comes out of quantum mechanics because [x,p] ~ Planck's constant. – Harald Rieder Oct 05 '21 at 19:26
  • @HaraldRieder There's obviously nothing subjective about entropy in statistical mechanics. Your comment about dividing an infinite quantity by something to 'coarse grain' it to make it finite simply makes no sense (classically it's completely unjustifiable to make phase space dimensionless which caused a lot of controversy historically) and is now leading to misunderstandings - given that your comment is being taken seriously, please point out where in section 7 of [1] anything like this is done to set up entropy. – bolbteppa Oct 06 '21 at 01:18
  • The classical notion that all physical laws are symmetric in time depends on the mathematical framework used. Specifically the use of real numbers (with infinite precision). Alternative mathematical number systems (that recognise that infinite precision is not possible in a finite universe) do not give time symmetry in physical laws. So the apparent contradiction may be wrong. – matt_black Oct 06 '21 at 10:12
  • @bolbteppa In quantum mechanics entanglement (von Neumann) entropy is also a subjective quantity. It depends on the subjective division of the total Hilbert space into 2 subspaces (and of course on the state vector, too). There is always an infinite number of possible splits, some giving you the max. ("number of qubits"), others the min. 0, and others any number in between. And in a continuous space the trace of the density operator will mostly be infinite. To get a finite value, invent some subjective coarse-graining... – Harald Rieder Oct 06 '21 at 15:52

The laws of physics are differential equations, and to solve a differential equation you need two things: the equation itself, which tells you how the value of some physical variable at each point of a region of time and space is related to the values at its neighbouring points, and the boundary conditions, which specify the value of the variable on the boundary of the region.

The differential equations specifying the laws of physics are time-reversal symmetric. The boundary conditions are not. The second law is time-asymmetric because the boundary conditions of the universe are time-asymmetric: the universe began in an extremely low-entropy state.

Given that the past boundary of your region of interest (i.e. the starting conditions of your experiment) is a low-entropy state, then the statistical argument can explain why it is statistically virtually certain that it will evolve into a high-entropy state if it can, or at least, no lower. But the statistical argument can be applied in reverse too. If told that the final state of the system is a low-entropy state, and asked what is the most probable sequence of events leading up to it, it turns out that the answer is a high-entropy state evolving towards a low-entropy one. There are many more sequences starting high and going low than there are sequences that start low and stay there. The statistical argument is time-reversal symmetric too.

The reason the universe began in such an extremely low-entropy state is, so far as I know, still not understood. It may be something to do with the process that creates universes. It may be something to do with the density of the initial states - that the matter is so 'crammed together' that there is no space for alternative arrangements. It may be something to do with events shortly after the beginning - like inflation. Whatever it might be, it is not explained by the second law, merely asserted.

In thermodynamics, the second law is always implied in the set-up to the question. You start with two bodies at different temperatures, a hot reservoir and a cold reservoir. Or you start with all the gas in one half of the chamber and not the other. The past boundary is declared to have low entropy - we don't ask how it got that way. Within our arena, the rules are time-reversal symmetric. We only get an asymmetric solution because of the asymmetry of the boundary conditions. The source of the asymmetry is always outside our view, somewhere out in the wide universe beyond the boundary. If we try to apply thermodynamics to the entire eternal/infinite universe, with no boundary anywhere (and hence, no big bang), and ask what history is most likely, the answer is always a cold, uniform, boring universe that starts, continues, and ends in some indistinguishable maximum-entropy state. It's just a box of gas, sitting there, forever. Only a tiny, tiny fraction of possible whole universe histories have this super-low-entropy start.

It's a very good question, that has exercised the minds of some of our greatest physicists. Part of the problem is psychological I'm sure - we tend to forget the importance of boundary conditions when discussing the laws of physics. But it's also a pretty big mystery. Well done for noticing!

  • This is the right answer! The appearance of the lack of reversibility in the Second Law is completely due to the universe starting in a low-entropy state. – WaterMolecule Oct 04 '21 at 13:48
  • This is false. Given a low entropy state, the past is either a lower entropy state or a same-entropy state, and there's no time-reversal operator for thermodynamic processes (except those carried out at constant entropy) regardless of boundary conditions. – g s Oct 04 '21 at 15:19
  • @gs Thermodynamic processes are not time-reversible only because the description of a system as a thermodynamic one loses information. – user253751 Oct 05 '21 at 13:35
  • I wonder how you want to calculate the entropy of the universe, which is a subjective quantity in mechanics; see my comment above. – Harald Rieder Oct 05 '21 at 19:31

Short answer: mechanics is time-reversible on the microscopic scale; entropy is never reversible on the macroscopic scale (except in the ideal case where its value does not change).

Longer answer:

The time-irreversibility of entropy is demonstrated by Clausius' Theorem:

$$\oint \frac{dQ}{T} \leq 0 \tag{1}$$

$(1)$ can be derived by analyzing the results of the Carnot cycle using an ideal gas. I won't go into it here, but I will mention that Enrico Fermi's book on thermodynamics (1937) provides an excellent explanation of this equation without skimping over any of the mathematics.

This naturally provides us a definition of entropy, in terms of heat exchanged along a reversible path, that is consistent with both the microscopic and macroscopic frameworks of classical physics (sometimes this definition is provided with an inequality; feel free to argue about that with me in the comments):

$$dS \equiv \frac{dQ_{\mathrm{rev}}}{T} \tag{2}$$

Consider a closed cycle in which a system goes from state $A\rightarrow B$ along some (possibly irreversible) path and then returns from $B \rightarrow A$ along a reversible path, so that $\int_B^A \frac{dQ}{T} = S(A)-S(B)$ by $(2)$. Combining this with $(1)$,

$$\oint \frac{dQ}{T} = \int_A^B \frac{dQ}{T} +\int_B^A \frac{dQ}{T} = \int_A^B \frac{dQ}{T} +\big(S(A)-S(B)\big) \leq 0$$

which implies

$$ \int_{A}^{B} \frac{dQ}{T} \leq S(B) - S(A)$$

In particular, for a thermally isolated system ($dQ = 0$) the left-hand side vanishes, so

$$S(B) \geq S(A)$$

i.e. the entropy must have increased (or at a minimum stayed the same) in the time-forward part of the cycle.
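As a concrete instance of $S(B) \geq S(A)$, consider the Joule free expansion of an ideal gas doubling its volume (a standard textbook example; the helper function below is my own sketch):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_free_expansion(n_mol, V1, V2):
    """Entropy change of an ideal gas expanding freely (no heat, no work,
    temperature unchanged) from volume V1 to V2: dS = n R ln(V2/V1)."""
    return n_mol * R * math.log(V2 / V1)

dS = delta_S_free_expansion(1.0, 1.0, 2.0)
print(dS)    # positive: the expansion is allowed
print(-dS)   # the time-reversed process would need dS < 0: forbidden
```

One mole doubling its volume gains $R\ln 2 \approx 5.76\ \mathrm{J/K}$ of entropy; the reversed movie, the gas spontaneously retreating into half the container, would require the sign to flip, which is exactly what $S(B) \geq S(A)$ rules out for an isolated system.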

The thermodynamic connection to $(2)$ is provided through two equations: first, the conservation of energy, and second, Gibbs' entropy (which reduces to Boltzmann's entropy in the microcanonical case):

$$dU=-PdV+dQ=-PdV+TdS \tag{3}$$ $$S=-k_B \langle \ln \rho_i \rangle \tag{4}$$

I won't get into the statistical mechanics here, but the probability of a particular microstate, $\rho_i$, provides us a connection to the microscopic physics, which, for classically modelled particles, can be represented by Hamilton's equations:

$$ \frac{dp_j}{dt} = -\frac{\partial H}{\partial x_j}, \frac{dx_j}{dt} = \frac{\partial H}{\partial p_j} \tag{5} $$

where I've used the index $j$ for a particular particle in the system. Here's the important part: all of this mathematics is self-consistent, and the equations in $(5)$ are completely time-reversible.

Aside: In my opinion, the statistical mechanics of classically modelled particles is an extremely well understood area of theoretical physics. So I wouldn't arbitrarily doubt the mathematics. The difficult part (as is always the case in theoretical physics) is explaining what the mathematics means without referring to experiment.

michael b
  • Thanks a lot for your answer. It looks neat, and I'll definitely return to it when I learn thermodynamics. I think I might have understood where the confusion is coming from, as in my comments to the answer of g.s. Do you agree, or am I still missing something? – Amr Oct 03 '21 at 03:23
  • It seems like you are grasping the main concept of macroscopic state vs. microscopic state. In g.s.'s comments he suggested that the piston could return to its "up" state or its "down" state, but referred to other processes which are obviously irreversible. There are a plethora of these examples, mixing milk in coffee, scrambling eggs, etc. The physics is clear: microscopic classical systems are time-reversible, macroscopic systems where $N \rightarrow \infty$ can demonstrate irreversible characteristics. This, as I have shown in my answer, is supported by the theory. – michael b Oct 03 '21 at 04:35

The example you described is not really an illustration of the 2nd Law. Picture rather a helium balloon in a room, with equal pressure inside the balloon and outside. Then the balloon pops. Although at first you have a balloon-shaped cloud of helium in the center of a room full of air (State 1), after some time passes you will have a uniform mixture of helium and air throughout the room (State 2). This situation would never happen in reverse, even though every molecular collision that occurred during the transition from State 1 to State 2 was itself reversible.

RC_23
  • Thanks for your answer. I agree with you that the helium example you described can evolve from state 1 to state 2 but not the other way round. My question is: why is this infeasibility present for thermodynamic systems but not for mechanical systems? What distinguishes mechanical systems from thermodynamic systems? Aren't thermodynamic systems modelled as really large mechanical systems in statistical mechanics? Feel free to correct me, I have never studied statistical mechanics yet – Amr Oct 03 '21 at 02:49
  • The way I see it, every irreversible process is a form of mixing – whether concentration mixing like we mentioned; or high energy particles mixing with low energy particles (which we call heat transfer); or even free expansion of a gas, which is occupied position states mixing with unoccupied ones. The second law basically says mixing cannot be undone. So your answer is, any system that involves mixing will obey the 2nd Law, including many mechanical systems. A simple mechanical gear train will not. – RC_23 Oct 03 '21 at 02:56
  • @Amr: "Quantity has a quality all its own" – Daniel R. Collins Oct 03 '21 at 23:06
  • @Daniel R. Collins Elaborate please. – Amr Oct 03 '21 at 23:07

The way I think about it is that by introducing the tools of statistical physics you purposefully throw away a lot of information, and by doing so you are looking at different dynamics; the time-reversibility of the microscopic laws may not carry over.

Take for example a system of $N$ bits $b_1,\dots, b_N$ that can each be either in the state $1$ or $0$. A timestep of this system consists of randomly picking a bit and flipping it. If instead of randomly we picked this bit pseudo-randomly, then the dynamics of this system would be perfectly reversible. Now, to look at this from a statistical mechanics point of view, let's consider the average of these bits $B=\tfrac 1 N(b_1+\dots+b_N)$. If we start the system in the state of all zeros then we know that, almost certainly, our parameter $B$ will slowly rise and then hover around $B=1/2$. We could even calculate the entropy: $$S=k_B\log\omega(B)$$ where $\omega(B)$ is the number of microstates that correspond to a particular $B$-value. Here $\omega(B)$ is a link between micro and macro, and it shows exactly where the information is lost. Before, we considered $2^N$ states; now we consider $N+1$ states, because $B$ takes on $N+1$ different values. For each macrostate $B$ the function $\omega(B)$ counts how many of these microstates are treated as 'the same'.

Together with the low-entropy initial state, this gives rise to dynamics that are not time-reversible. Had we started in a state with $B\approx 1/2$, there would have been no time asymmetry, either for the microscopic or the macroscopic dynamics.
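A minimal sketch of this bit model (the seed, $N$, and the number of flips are arbitrary choices of mine): the coarse-grained $B$ relaxes toward $1/2$, while the microdynamics stays exactly invertible because each flip is its own inverse.

```python
import random

N = 1000
random.seed(42)

bits = [0] * N                                        # low-entropy start: all zeros
flips = [random.randrange(N) for _ in range(20 * N)]  # pseudo-random flip sequence

B_values = []
for k in flips:
    bits[k] ^= 1                  # flip one bit (a flip is its own inverse)
    B_values.append(sum(bits) / N)

print(B_values[-1])               # hovers near 0.5

# Microscopic reversibility: undo the flips in reverse order.
for k in reversed(flips):
    bits[k] ^= 1
print(sum(bits))                  # 0 - exactly the initial microstate
```

Replaying the same pseudo-random flip sequence backwards recovers the all-zeros microstate exactly, yet nothing about the macroscopic value $B\approx 1/2$ alone would ever let you reconstruct that history.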

Disclaimer: I already mentioned this but this is my point of view and I don't know if this is in conflict with literature.


In my opinion, you are conceptually "putting the cart before the horse". Physics is the observation of physical phenomena and the development of a mathematical model that describes the observations. Since practically no mathematical model is unique, it is easily possible to develop several mathematical models that describe the observations to varying degrees of accuracy. Predictions from those various models, along with the concept of Occam's razor, generally lead to one mathematical model that is accepted as the "best" model by the physics community. This all means that physics is not math, and it is inappropriate to use a mathematical argument to refute observations regarding the 2nd Law of Thermodynamics.

David White
  • Thanks for your answer. I am not trying to refute the second law of thermodynamics but rather want to understand what is going wrong with my intuition/argument – Amr Oct 03 '21 at 02:20
  • Here is a different formulation of my question with almost no reference to any math; however, I believe correct, rigorous mathematics will not be the source of the error.

    If a closed mechanical system evolves from state 1 to state 2, then it is also feasible that it starts from state 2 and ends at state 1.

    However, the above statement does not remain true if one replaces the words "mechanical system" by "thermodynamic system". How could that happen if a "thermodynamic system" is nothing but a mechanical system consisting of a really large number of particles?

    – Amr Oct 03 '21 at 02:22
  • Are you not using math to argue that the 2nd law of thermo should be different than the observations indicate? When I read your post, that is how I understood it. – David White Oct 03 '21 at 02:23
  • No I am not. My post said "If my intuition is correct, then it should be easy to prove mathematically using the uniqueness theorem of ODEs". This means I am skeptical about my intuition, and I actually don't provide any proofs, which confirms my skepticism. If I had a proof, I wouldn't be skeptical. – Amr Oct 03 '21 at 02:26
  • @RC_23, you are correct. Thanks. I'll edit my post. – David White Oct 03 '21 at 02:57
  • https://en.m.wikipedia.org/wiki/Loschmidt%27s_paradox You might enjoy reading this; it shows my question was asked earlier by another notable physicist/chemist. So your objection about mathematics was really irrelevant... – Amr Oct 03 '21 at 15:52
  • This is the correct answer to "Why is the second law of thermodynamics not symmetric with respect to time reversal?". The laws of thermodynamics were devised to capture the observations and experiments of thermal physics. They are highly successful at doing this. This has nothing to do with mathematical proof. Mathematics is about the properties of objects that only exist in the human imagination, physics is about what really happens. – John Doty Oct 04 '21 at 17:07
  • @Johndoty Then I should change the title of my question to "Why does thermodynamics seem at odds with classical mechanics/electrodynamics/gravitation even though physicists say thermodynamics is emergent from mechanics/electrodynamics/gravitation" , but that's a very long title and it's really the body of my question. On stack exchange titles are supposed to be just a summary/approximation to the actual question – Amr Oct 05 '21 at 04:36
  • Why do you think thermodynamics is "emergent"? Thermodynamics is based on experiment and observation, the true foundations of physics. The math is a story we tell to explain the experiments. Some physicists, unfortunately, forget this. You should ask "Why do we use time-symmetric math to describe a reality that is manifestly not time-symmetric?" There are good reasons, but all mathematical models have their limits. – John Doty Oct 05 '21 at 10:23
  • @JohnDoty Since I am not a physicist, I actually don't have any opinion, nor was I making any claims about the truth of the statement "Thermodynamics is emergent". However, if you pay close attention to my previous comment to you, you will see that my modified title of the question was "...even though physicists think thermodynamics is emergent". Thus, my question is about physicists. Put differently, my question is "Why do physicists think that a non-time-symmetric theory of thermodynamics can emerge (via statistical mechanics) from time-symmetric theories like mechanics and electrodynamics?" – Amr Oct 05 '21 at 14:39
  • @JohnDoty And there are only two sensible responses to my modified question: 1) physicists don't actually think that thermodynamics emerges from mechanics/electrodynamics/gravity via statistical mechanics; 2) physicists do think thermodynamics is emergent from mechanics/electrodynamics, and the apparent non-symmetry of thermodynamics can actually emerge from the time-symmetric phenomena of mechanics/electrodynamics. Most answers on this thread seem to have taken option 2; if you want to take option 1, I'll be happy to see it as an answer. – Amr Oct 05 '21 at 14:50
  • @JohnDoty As for your question "Why do we use time-symmetric math to describe a reality that is manifestly not time-symmetric?", my answer is that I thought we use time-symmetric math to model time-symmetric phenomena and non-time-symmetric math to model non-time-symmetric phenomena. It seemed to me that this is not the case with thermo/statistical mechanics, hence my question posted above on Stack Exchange – Amr Oct 05 '21 at 14:58
  • @Amr You might want to consider the famous "H theorem" (https://en.wikipedia.org/wiki/H-theorem). Not a true mathematical theorem, but it works. Go figure... – John Doty Oct 05 '21 at 18:47
  • @JohnDoty Thanks, I saw the link. Unfortunately my physics background is not yet sufficient to access this material, but I hope to fix that soon :) – Amr Oct 05 '21 at 21:08
2

The remorseless increase in entropy is simply a matter of probability.

Consider an idealised pool table set up for a game - one with no friction or air resistance and perfectly elastic balls. You take a cue shot. The moving white ball is now the only energetic object on the table, so the entropy is low. When the white hits the triangle of balls at the end of the table, it loses energy to some of the balls and entropy increases. Those balls bounce off the cushions and interact with other balls, spreading the energy further and increasing entropy. Over time you will be left with a state in which all the balls are moving at random with low speeds compared with the speed of the initial cue ball.

Given that the laws governing the interactions of individual pairs of balls are time reversible, it would be possible in principle for the random motion of the balls at some point to bring all of them to exactly the same positions they held at some earlier point but with exactly reversed velocities, in which case the subsequent evolution of the system would see all the balls eventually return to the original state. However, the odds of 16 randomly moving balls ending up in positions which form an exact replica of a former state but with exactly reversed velocities are vanishingly small. You could watch the randomly moving pool balls for billions of years without seeing them return to their original configuration. And that example is a simple, idealised one. In real life the energy of the pool balls is diffused through collisions and friction and lost by exchange with the particles that comprise the table and the air - it is even less likely that the molecules of the table and the air would ever randomly interact with the pool balls to reverse their paths exactly to wind back to the original set-up.
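
The reversibility of the underlying dynamics is easy to see in a toy simulation. Below is a minimal sketch (my own illustration, not part of the answer): non-interacting balls bouncing elastically between two walls in 1D, using integer arithmetic so that reversing all the velocities retraces the history exactly. Real pool balls also collide with one another, which is equally reversible but messier to simulate exactly.

```python
# Non-interacting balls bouncing elastically between walls at 0 and L.
# Integer positions/velocities make the dynamics exactly reversible
# (no floating-point rounding). Odd positions plus even velocities
# guarantee a ball never lands exactly on a wall, avoiding edge cases.
L = 1000

def step(x, v):
    """Advance one ball by one time unit, reflecting off the walls."""
    x += v
    if x < 0:
        x, v = -x, -v
    elif x > L:
        x, v = 2 * L - x, -v
    return x, v

def evolve(state, n_steps):
    for _ in range(n_steps):
        state = [step(x, v) for x, v in state]
    return state

initial = [(101, 8), (399, -14), (851, 22)]
forward = evolve(initial, 4321)                             # "record the video"
played_back = evolve([(x, -v) for x, v in forward], 4321)   # reverse velocities, play on
recovered = [(x, -v) for x, v in played_back]
print(recovered == initial)  # True: the system retraces its history exactly
```

The reversed run passes through the same configurations in the opposite order, which is precisely the "video played backwards" of the question.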

All systems, mechanical or otherwise, evolve in a way that increases entropy because large numbers of interactions happen at random, and the chances of them happening in a reversed fashion are vanishingly small.

Consider Newton's cradle with just two balls. The interactions between the two balls are largely symmetric, so the system behaves in a periodic way, seeming to return repeatedly to a former state. However, there is a continual random interaction between the balls and the air molecules, which gradually sees the energy of the balls transferred through billions of random collisions to the air until they eventually stop swinging. To return the balls to their earlier swinging state would require billions of collisions between air molecules and the balls that were aligned in direction and synchronised to the extent that the energy was returned to the balls - again possible in principle, but an utterly, utterly improbable eventuality in the random motion of billions of air molecules.

Marco Ocram
  • 26,161
1

Let me first point out that entropy may mean different things. As Jaynes points out in his article The minimum entropy production principle:

By far the most abused word in science is "entropy". Confusion over the different meanings of this word, already serious 35 years ago, reached disaster proportions with the 1948 advent of Shannon's information theory, which not only appropriated the same word for a new set of meanings; but even worse, proved to be highly relevant to statistical mechanics.

He then goes on to claim that there are at least 6 different definitions of entropy.

As a minimum one needs to distinguish the thermodynamic (Gibbs) entropy and Boltzmann entropy.

Thermodynamics
Thermodynamics is a phenomenological discipline. In particular, Gibbs explicitly defined entropy as the quantity that always increases in irreversible processes - that is, it was defined explicitly to account for the observed lack of symmetry with respect to time reversal in macroscopic objects.

Statistical physics
Boltzmann defined entropy as a logarithm of the number of available microstates, $$S=k\log(\Omega).$$ The lack of symmetry with respect to time reversal then reflects the vanishingly small probability of finding the system in a particular microstate, corresponding to the initial values of the parameters. The tendency of entropy to evolve towards higher values is known as the Boltzmann H-theorem, although some of the approximations used in Boltzmann's original proof are questionable.
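
As a toy illustration of this counting (my own sketch, not part of the answer): for $N$ distinguishable particles, each of which may sit in the left or right half of a box, the macrostate "$n$ particles on the left" comprises $\Omega = \binom{N}{n}$ microstates, so $S = k\log\Omega$ is overwhelmingly peaked at the evenly spread state $n = N/2$.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """S = k log(Omega) for the macrostate 'n of N particles on the left'."""
    return k_B * log(comb(N, n))

N = 100
S_spread = boltzmann_entropy(N, N // 2)  # particles spread evenly
S_one_side = boltzmann_entropy(N, N)     # all on one side: Omega = 1
print(S_one_side)             # 0.0 -- a single microstate, minimal entropy
print(S_spread > S_one_side)  # True
print(comb(N, N // 2))        # ~1e29 spread-out microstates vs a single one
```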

Loschmidt and Zermelo paradoxes
The argument that, by reversing all the velocities, one should be able to make the system evolve back to its initial state is known as Loschmidt's paradox, which was one of the first objections to the Boltzmann H-theorem. This is indeed the case, as shown, e.g., in spin-echo experiments. However, we usually do not have control over all the degrees of freedom of a physical system (and the universe as a whole), which is why this never happens. In short, the irreversibility is a result of our inability to observe and control the world on the microscopic level - a measurement error with far-reaching consequences.

A good discussion of Loschmidt's and some other paradoxes is given in this pedagogical article; see also this thread.

Roger V.
  • 58,522
  • Thanks for the comprehensive answer. Do Gibbs entropy and Boltzmann entropy turn out to be the same? Or at least does knowing one of these two entropies allow one to compute the other? – Amr Oct 04 '21 at 17:53
1

Suppose you have a single small ball floating in a closed box, bouncing around. If you divide the box in half with an imaginary plane, you can say that one half of the box is "full" if the particle happens to be in that half. The other half is then "empty". As this ball bounces about, the full/empty states will reverse, many times.

Now add another such ball. The two balls bounce off the walls, and they collide with each other. In this situation, sometimes one side is empty, but sometimes both sides contain a particle. So, if you start the system in the "left side full" state and let it evolve, the particles will move about, spread out throughout the box, but at some point you'll notice that they'll bunch up on the left side, and the initial state of "left side full" will happen again, although the exact positions and velocities of the two balls might be different compared to their starting positions.

So, there's a distinction here between two kinds of states: "left side full" is a macrostate, while the exact configuration of the particles is a microstate.

Now, suppose there are 10 balls bouncing around in there. Start the system in the "left side full" macrostate, let it evolve. The particles will spread out by diffusion, and their behavior will be very chaotic. The likelihood that they'll all bunch up on the left side again is significantly reduced, but still, if you wait long enough, it'll probably happen.

Notice here that there are many, many more microstates (exact configurations) where the particles are spread out, than there are those where the particles are all on the left side. This makes the "left side full" macrostate comparatively less likely - even though every collision is governed by time-symmetric laws. If you reverse time and you are zoomed in, you can't see anything funny going on - you have to zoom out, and observe the behavior of a large collection of particles.
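
A rough back-of-the-envelope sketch of how fast those odds fall off (assuming, as a simplification, that each particle is independently equally likely to be in either half):

```python
# Fraction of microstates in which ALL N particles sit in the left half.
for N in (2, 10, 100):
    print(N, 2.0 ** -N)

# If the box visited a fresh configuration every nanosecond (an invented
# rate, purely for illustration), the expected wait for the N = 100
# recurrence would be on the order of 2**100 nanoseconds:
wait_years = 2 ** 100 * 1e-9 / (3600 * 24 * 365)
print(f"{wait_years:.1e} years")  # ~4e13 years, far beyond the age of the universe
```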

Now place a thousand, a million, a billion particles in the box. Start the system in the "left side full" state, let it evolve. Have fun waiting for it to reverse! It'll never* happen!

It's even worse than that, though. A glass of water contains many, many, many, many more molecules than that. If you took out a billion molecules, the glass wouldn't even feel it. If you took out a billion molecules from a single drop, it wouldn't even matter.

A drop of water contains more than 1,500,000,000,000,000,000,000 molecules. So you can see why mixing two liquids is statistically irreversible.
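
That figure is easy to sanity-check (a sketch assuming a drop of about 0.05 mL, i.e. roughly 0.05 g of water):

```python
AVOGADRO = 6.02214076e23   # molecules per mole
MOLAR_MASS_WATER = 18.015  # g/mol
drop_mass_g = 0.05         # assumed drop: 0.05 mL of water weighs about 0.05 g

molecules = drop_mass_g / MOLAR_MASS_WATER * AVOGADRO
print(f"{molecules:.2e}")  # ~1.67e21, consistent with the number quoted above
```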


* never = "extremely, unbelievably unlikely"

1

"This is false. Given a low entropy state, the past is either a lower entropy state or a same-entropy state, and there's no time-reversal operator for thermodynamic processes (except those carried out at constant entropy) regardless of boundary conditions."

That's a common misunderstanding, evident in a number of the answers and comments above, that I think is worth addressing at greater length than is possible in a comment. (My apologies if this is against the rules here.)

There are two aspects to the laws of physics within a region of spacetime that need to be considered: the kinematic laws, and the statistical laws.

The kinematic laws are those mentioned by the OP - that for any given forwards-in-time history, you can reverse all the final velocities of the particles, call them initial velocities, and trace the same history backwards. That's true, and agreed by all sides - at least in the classical physics version.

The entire trajectory of every particle throughout the period is fully determined by their joint position and velocity at any given instant, which includes both the start of the period and its end, and reversing the velocities of all particles at any instant reverses the trajectories. The trajectories in the bulk of the spacetime region are fully determined by the trajectories at the boundary.

The statistical laws are about the number of possible trajectories fulfilling particular macroscopic conditions. The idea is that the number of trajectories exhibiting 'normal' behaviour so vastly exceeds the number where the second law is violated that it becomes a virtual certainty that things will proceed as expected. While violations are possible, they are exceedingly improbable. Thus, people try to derive the second law as a statistical effect. It isn't.

Let's consider a classic example - a large number of gas molecules starting in one half of the chamber. We specify the positions of the particles on the past boundary, but we haven't said anything about their velocities. So we suppose they are selected uniformly from all the possibilities.

Now for each starting combination of positions and velocities, the entire subsequent history is determined. But over the range of all possible choices of velocities, there are a vast, vast number of possible trajectories.

In some, all the particles finish in the same half of the box. In some, they all end up in the same hundredth of the box, crammed into one tiny corner. But the number of initial states where they're spread out between the two halves vastly exceeds the number where they're in the same half, which even more vastly exceeds the number where they're in the same one hundredth. So given a uniform choice over our range of possible starting states - specified positions, arbitrary velocities - it is virtually certain that we've got one of the spread-out ones rather than a huddle-together one. This is the statistical argument's explanation for the second law.
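
This statistical argument is easy to sketch numerically (a toy model of my own, not from the answer: non-interacting particles with elastically reflecting walls, positions constrained to the left half on the past boundary, velocities drawn uniformly at random):

```python
import random

random.seed(0)
N, L, T = 1000, 1.0, 37.3  # particles, box length, evolution time

# Past-boundary condition: positions in the left half, velocities uniform.
xs = [random.uniform(0.0, L / 2) for _ in range(N)]
vs = [random.uniform(-1.0, 1.0) for _ in range(N)]

def fold(x):
    """Free flight with elastic reflections at 0 and L (triangle-wave fold)."""
    x %= 2 * L
    return x if x <= L else 2 * L - x

final = [fold(x + v * T) for x, v in zip(xs, vs)]
frac_left = sum(x < L / 2 for x in final) / N
print(frac_left)  # close to 0.5: virtually every choice of velocities spreads the gas out
```

By symmetry, constraining the final positions instead and counting trajectories backwards in time yields the same spreading - the counting itself has no arrow of time.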

However, this argument only works if you apply it to the initial states - yet the full trajectories are equally determined by their values at any time. Thus, we can equally easily assert that at the end of the experiment all the gas molecules are in the same half of the box, and ask how they got there. The number of time-reversed trajectories satisfying the time-reversed constraints is exactly the same number. So there are vastly more choices of final velocities that are preceded by molecules spread out fairly equally between the two halves than there are choices where the molecules started in the same half, or an even smaller region.

So if we take the statistical argument seriously, then we ought to expect that a low-entropy condition on the final state would be preceded by an entropy decrease! This is what the statistical argument tells us. So the statistical argument contradicts the second law.

Note, I am not saying that the second law is wrong. I am saying that the statistical argument does not imply or explain it.

The statistical argument simply counts trajectories - but the number of time-reversed trajectories is identical to the number of forwards-time trajectories, so the statistical argument is as time-reversal symmetric as the kinematic argument is. We have to look elsewhere for an explanation.

The second law says that in the bulk of each spacetime region the entropy does not decrease. This implies that the entropy at the end time is always equal to or greater than the entropy at the start time. Any statement about the bulk is also a statement about the boundary, and wording it this way directs us towards an understanding.

High entropy requires no explanation. Statistically, virtually all trajectories are high-entropy. The big question we really need to answer is where does the low entropy come from? On statistical grounds, the starting conditions of our thermodynamic experiments are fantastically unlikely. Statistical arguments cannot explain them. But they're clearly observed, so we need an explanation.

Not only are they observed to happen, we also observe that they always happen on the past boundary, never the future one. If we constrain the past boundary to a low-entropy state, leave the future boundary free, statistical arguments predict exactly the sort of events that we commonly see. But if we constrain the future boundary to a low-entropy state and leave the past free, statistical arguments make the wrong prediction. Instead of predicting an even lower entropy initial state, they predict entropy decrease.

The second law states that the lowest entropy is always on the past boundary of any experiment. This cannot be explained by anything going on inside our region. It isn't explained by either the kinematic or statistical rules that apply inside the region, and they fully determine everything that happens inside the bulk. So it has to be something outside the region. Something happened in the deep past to start the universe off in an extremely low entropy state, and every instance of low entropy we ever observe experimentally originates there.

If we take the position that low entropy always originates in the past, then seeing low entropy on the future boundary, we can legitimately conclude that the only place it could have come from to get there is the past boundary, through the bulk of our experiment, and thus predict the initial conditions to be of even lower entropy. Statistically, that's fantastically unlikely. But the start of the universe is fantastically unlikely, so that's not a problem.

0

Irreversible change in state occurs over time. Yet time, per se, doesn't determine the state of matter - the state variables do, e.g. pressure, temperature, external field, etc.

A volume of gas may have a pressure $p$ at some temperature $T$. But just from that fact alone we know nothing about actual states of particular gas molecules. A number of microstates for the gas molecules can result in the same macroscopic pressure.

Some macroscopically observable, state-changing variable can, in a particular irreversible process, have some functional relationship with time. But this does not mean that all micro-transitions making up that macro-transition are equally unique or determinable from the macroscopic state-variable-versus-time function.

The kinetic theory of gases posits molecules as uniform bodies continually exchanging kinetic energy via elastic collisions with neighboring molecules, and the interaction forces between molecules are assumed to be identical. In reality such collisions are not perfectly elastic, and some of the incoming kinetic energy is not always converted into the same increase in the slower molecule's speed, owing to internal transitions within that molecule - transitions that are indeterminable due to the uncertainty of electron energies/positions at impact. The same external molecule-to-molecule impact can give rise to a different partitioning of kinetic energy among the molecule's various degrees of freedom.

So "reversing time" is not going to be like watching a video replay of a shot in a game of perfectly smooth billiards.

Trunk
  • 275