22

In statistical explanations of entropy, we can often read about a (thought) experiment of the following sort.

We have a bunch of particles in a box, packed densely into one of the corners. We assume some temperature, and with it some random initial velocities for the particles. We don't know the exact positions and velocities, so these can be modeled as random variables in the mathematical sense. The random variables expressing the initial conditions have a joint probability distribution under which configurations matching "particles in a bunch in the corner" have high probability. Now we simulate physics (apply deterministic and reversible equations of motion) to this arrangement, and we can prove mathematically that the random variables corresponding to the new positions and velocities of the particles have a joint distribution under which draws are very likely to fit the description "particles all over the place in a nothing-special arrangement".

This is very informal, but I know that all of this can be formalized by introducing the concept of a macrostate; there is then a mathematically provable theorem that the information-theoretic conditional entropy of the full state given the macrostate increases as time passes. This is basically the second law.
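As a sanity check of the forward-in-time statement, here is a minimal toy sketch (my own construction, not part of the formal theory: a 1-D "box" with reflecting walls, ballistic particles, and the Shannon entropy of a 10-bin occupation macrostate are all assumptions of the example):

```python
import math
import random

random.seed(0)
N, BINS = 2000, 10

# Particles packed into the leftmost tenth of a unit box,
# with random velocities drawn from a symmetric distribution.
x0 = [random.uniform(0.0, 0.1) for _ in range(N)]
v = [random.uniform(-1.0, 1.0) for _ in range(N)]

def position(x, vel, t):
    """Ballistic motion with elastic reflection at x = 0 and x = 1.
    Deterministic and reversible: position(x, vel, t) == position(x, -vel, -t)."""
    y = (x + vel * t) % 2.0
    return y if y <= 1.0 else 2.0 - y

def coarse_entropy(xs):
    """Shannon entropy of the coarse-grained macrostate (bin occupation fractions)."""
    counts = [0] * BINS
    for x in xs:
        counts[min(int(x * BINS), BINS - 1)] += 1
    return -sum(c / len(xs) * math.log(c / len(xs)) for c in counts if c)

s0 = coarse_entropy([position(x, u, 0.0) for x, u in zip(x0, v)])
s1 = coarse_entropy([position(x, u, 5.0) for x, u in zip(x0, v)])
print(s0, s1)  # entropy climbs from ~0 toward the maximum ln 10 ≈ 2.30
```

The microscopic dynamics here is exactly reversible, yet the coarse-grained entropy rises anyway, which is precisely the tension the question is about.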

Now I don't see anything preventing me from applying the same logic backwards in time. Based on these mathematical results, I'd assume the following holds:

When I see a (moderately) clustered configuration of particles in the box, if someone asks me what I believe the particles looked like 10 seconds ago, my answer should be that 'they were probably more all over the place than now, with no particular arrangement or clustering'.

Or formulated otherwise, looking backwards in time, we should expect to see an increased thermodynamic entropy. The paradoxical thing to me is that we seem to assume that in the past entropy was even smaller than today!

Practical example: you arrive late to chemistry class and the teacher is demonstrating how some purple material diffuses in water. Common sense tells me to assume that the purple material was more concentrated in the water 10 seconds ago than it is now. But the above argument should make me believe that I am looking at the lowest-entropy moment right now, and that the material was/will be more diffused in either direction of time. There is nothing time-asymmetric in the above statistical reasoning.

How can this paradox be resolved?

isarandi
  • 905
  • It's called the second law of thermodynamics, and there are people who will claim that it pretty much explains why we "experience" time in the forward direction. – Floris Nov 05 '14 at 23:48
  • @Floris I didn't know that the second law also talks about inferring things about the past. I thought it states that if you start with a low entropy configuration (like warm and cold water side by side in a tank) then in the future it will have higher entropy (equal temperature in both halves of the tank). But I don't think there is a law that would say if you see a tank with equal temperatures (high entropy) you should assume that in the past it had a lower entropy. I think you should just assume it was the same. – isarandi Nov 06 '14 at 00:06
  • If you look at the actual derivations of the second law, you will see that the arguments used are not reversible, thus there is no paradox. Nevertheless, your paradox is called Loschmidt's paradox, since one can doubt the assumptions made in the usual derivations of the second law, see also the part about the fluctuation theorem. – ACuriousMind Nov 06 '14 at 01:05
  • Related: https://physics.stackexchange.com/q/19970/2451 and links therein. – Qmechanic May 27 '17 at 09:09

2 Answers

12

The reasoning in the question is correct. If you have a box with gas particles placed in one half of the box, but otherwise uniformly at random and with random velocities, then it is overwhelmingly likely that its entropy will increase with time. But if you reverse the velocities, you still have randomly distributed velocities, and the same argument applies. By time symmetry, reversing the velocities and going forward in time is equivalent to going backward in time. So a system prepared as described above would almost certainly be at a local entropy minimum with respect to time.
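The "local minimum in time" claim can be illustrated with a toy sketch (my construction, not the answer's: ballistic particles in a 1-D box with reflecting walls and a 10-bin coarse-graining). Because the dynamics is reversible and the velocity distribution is symmetric, evaluating the motion at negative times is the same as reversing the velocities:

```python
import math
import random

random.seed(1)
N, BINS = 2000, 10

# Prepared state: particles clustered mid-box, random symmetric velocities.
x0 = [random.uniform(0.45, 0.55) for _ in range(N)]
v = [random.uniform(-1.0, 1.0) for _ in range(N)]

def position(x, vel, t):
    # Reflecting walls at 0 and 1; valid for negative t too (reversibility).
    y = (x + vel * t) % 2.0
    return y if y <= 1.0 else 2.0 - y

def coarse_entropy(xs):
    counts = [0] * BINS
    for x in xs:
        counts[min(int(x * BINS), BINS - 1)] += 1
    return -sum(c / len(xs) * math.log(c / len(xs)) for c in counts if c)

s_past = coarse_entropy([position(x, u, -5.0) for x, u in zip(x0, v)])
s_now = coarse_entropy([position(x, u, 0.0) for x, u in zip(x0, v)])
s_future = coarse_entropy([position(x, u, 5.0) for x, u in zip(x0, v)])
print(s_past, s_now, s_future)  # high, low, high: t = 0 is a local minimum
```

The same coarse-grained entropy is high on both sides of the prepared moment, exactly as the question's backward-in-time reasoning predicts for a state with no special history.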

If the whole universe consisted only of some water with unevenly distributed dye in it, and we knew nothing about its origin, then inferring that the dye was more evenly distributed in the past would be rational. The water and dye being in a beaker near a teacher in a far-from-equilibrium universe makes other explanations much more likely, though. However, your line of reasoning has some bite at the cosmological level. This is the Boltzmann Brain Problem. It is still not satisfactorily resolved, as you can see on arXiv.

The second law of thermodynamics works (and is a law) because the universe is far from equilibrium (i.e. low entropy) and is believed to have started much farther from equilibrium than it is now. Of course, a big part of the reason for believing that is the second law. ;)

Here is a more detailed explanation from my answer to Where does deleted information go?:


The apparent conflict between macroscopic irreversibility and microscopic reversibilty is known as Loschmidt's paradox, though it is not actually a paradox.

In my understanding, sensitivity to initial conditions, the butterfly effect, reconciles macroscopic irreversibility with microscopic reversibility. Suppose time reverses while you are scrambling an egg. The egg should then just unscramble, like in a film running backwards. However, the slightest perturbation, say hitting a single molecule with a photon, will start a chain reaction, as that molecule will collide with different molecules than it otherwise would have. Those will in turn have different interactions than they otherwise would have, and so on. The trajectory of the perturbed system will diverge exponentially from the original time-reversed trajectory. At the macroscopic level the unscrambling will initially continue, but a region of re-scrambling will start to grow from where the photon struck and swallow the whole system, leaving a completely scrambled egg.
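The egg is hard to simulate, but the same instability shows up in any reversible chaotic toy system. Here is a sketch using Arnold's cat map on the unit torus (my choice of stand-in for the scrambled egg, not anything from the original answer):

```python
# Arnold's cat map: deterministic, invertible, and chaotic on the unit torus.
def forward(x, y):
    return (2.0 * x + y) % 1.0, (x + y) % 1.0

def backward(x, y):
    # Exact inverse of forward (the matrix [[2,1],[1,1]] has determinant 1).
    return (x - y) % 1.0, (-x + 2.0 * y) % 1.0

def torus_dist(a, b):
    d = abs(a - b) % 1.0
    return min(d, 1.0 - d)

x, y = 0.3, 0.2
STEPS = 10

fx, fy = x, y
for _ in range(STEPS):
    fx, fy = forward(fx, fy)

# Exact time reversal: the trajectory retraces itself...
rx, ry = fx, fy
for _ in range(STEPS):
    rx, ry = backward(rx, ry)
print(torus_dist(rx, x), torus_dist(ry, y))  # ~0: back to the start

# ...but nudge one coordinate by 1e-6 (the "photon") before reversing:
px, py = fx + 1e-6, fy
for _ in range(STEPS):
    px, py = backward(px, py)
print(torus_dist(px, x), torus_dist(py, y))  # ≈ 0.0042, 0.0068: grown thousands-fold
```

Each step stretches the perturbation by roughly the factor $\varphi^2 \approx 2.618$ (the map's expanding eigenvalue), so only a few more steps would make the "echo" fail completely, just as the re-scrambling region swallows the egg.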

This shows that time-reversed states of non-equilibrium systems are statistically very special: their trajectories are extremely unstable and impossible to prepare in practice. The slightest perturbation of a time-reversed non-equilibrium system causes the second law of thermodynamics to kick back in.

The above thought experiment also illustrates the Boltzmann brain paradox, in that it makes it seem that a partially scrambled egg is more likely to arise from the spontaneous unscrambling of a completely scrambled egg than from breaking an intact one: if trajectories leading to an intact egg in the future are extremely unstable, then by reversibility, so must be trajectories originating from one in the past. Therefore the vast majority of possible past histories leading to a partially scrambled state must do so via spontaneous unscrambling. This problem is not yet satisfactorily resolved, particularly its cosmological implications, as can be seen by searching arXiv and Google Scholar.

Nothing in this depends on any non-classical effects.

  • What you say about perturbations is true, but it's worth pointing out that if you randomly pick a microstate out of all possible ones, the chance of picking a state with higher entropy S that evolves to a state with lower entropy S' is tiny, but the chance of picking a state with lower entropy S' to begin with is equally tiny. So this type of argument can't really resolve Loschmidt's paradox--all the physicists who I've seen commenting on the issue say it's ultimately a cosmological question about why the universe started off in a very low-entropy state at the time of the Big Bang. – Hypnosifl Nov 06 '14 at 03:43
  • And speaking of physicists talking about this issue, if isarandi is interested in seeing what they have to say, this exact paradox (that the statistical reasoning used to derive the 2nd law could equally well be applied in reverse, but it would lead to incorrect conclusions in that case) is discussed in Brian Greene's book The Fabric of the Cosmos (see p. 160 on google books here), and Sean Carroll's book From Eternity to Here. – Hypnosifl Nov 06 '14 at 03:49
  • @Hypnosifl "the chance of picking a state with higher entropy S that evolves to a state with lower entropy S' is tiny, but the chance of picking a state with lower entropy S' to begin with is equally tiny" - correct. So if you see a low-entropy macrostate, then both its past and future states are likely to have been higher-entropy states. Also, I think it is the Boltzmann Brain Problem that is cosmological; Loschmidt's paradox applies on all scales. – Daniel Mahler Mar 28 '16 at 19:35
1

Or formulated otherwise, looking backwards in time, we should expect to see an increased thermodynamic entropy. The paradoxical thing to me is that we seem to assume that in the past entropy was even smaller than today!

The following assumes that the description of microscopic motion of the particles of the system is Hamiltonian (your system qualifies for this).

I will use the word thermodynamics in its restricted sense, i.e. the subject treating the effects of heat and work exchange between bodies on their states of thermodynamic equilibrium. The 2nd law of thermodynamics talks about changes between equilibrium states only.

The impression of a paradox, and disagreement about its importance, its resolution, and whether a resolution has been found, have persisted for more than a century now. No doubt this is partially due to the fact that many misconceptions are taught at universities, some of which students later publish in their papers.

Here is one solution, known at least since the 1960s, when Jaynes published it (see below). In contrast to resolutions based on various wild and misguided assumptions about the alleged entropy of the Universe and its value in the past, it is quite prosaic.

The short version of this prose is this: there is no paradox or contradiction between probabilistic reasoning and thermodynamics, because the theorems concluding the same entropy trend for both the actual and the velocity-reversed specially prepared microstate talk about a different kind of entropy than thermodynamics and the 2nd law do. People got confused by two different concepts of entropy here.

The derivations actually concern the evolution of a coarse-grained information entropy $I_{CG}$ (or, similarly, minus the Boltzmann H-function). This is typically defined for all microstates of the mechanical system, however different they may be from the microstates compatible with the equilibrium thermodynamic state of the thermodynamic system being modeled.
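For concreteness, a common definition of such a quantity (my notation; the answer does not write one out) partitions phase space into cells $\Omega_i$ and takes the Shannon entropy of the cell occupation probabilities:

$$I_{CG} = -\sum_i p_i \ln p_i, \qquad p_i = \int_{\Omega_i} \rho(q,p)\,\mathrm{d}q\,\mathrm{d}p,$$

where $\rho$ is the phase-space probability density. Because the cells are fixed while $\rho$ is stretched and folded by the dynamics, $I_{CG}$ can grow even though the fine-grained (Gibbs) entropy $-\int \rho \ln \rho$ is exactly conserved under Hamiltonian flow.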

This is a very different concept of entropy from thermodynamic entropy $S$ (Clausius' entropy), which makes sense only for microstates compatible with a state of thermodynamic equilibrium. For general states of a thermodynamic system (for example, its possible non-equilibrium states), the concept of thermodynamic entropy does not generally apply.

Also, any implication of the 2nd law for thermodynamic entropy is restricted to states of equilibrium. Trying to apply it to non-equilibrium states is a suspicious operation that may be useful in some cases, but has no general validity whatsoever.

This means the 2nd law actually says nothing about the special microstate imagined or its reverse. Both correspond to highly non-equilibrium thermodynamic states and do not have a thermodynamic entropy. The coarse-grained entropy increases, but there is no connection to thermodynamic entropy and thus no contradiction with the 2nd law.

The 2nd law says only that when a container holding a system in equilibrium, with thermodynamic entropy $S_1$, is suddenly enlarged so that the system is no longer in an equilibrium state, the final equilibrium state of the system will have thermodynamic entropy $S_2 \geq S_1$. There is no problem with thermodynamic entropy increasing as the time coordinate is decreased below the time of enlargement, because the entropy retains the value $S_1$: the system was in an equilibrium state in the original volume.
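A textbook instance of this statement (my example, not the answer's): the sudden (free) expansion of $n$ moles of ideal gas from volume $V_1$ to $V_2$. Both endpoint states are equilibrium states, so their entropy difference is well defined and can be computed along any reversible path connecting them, e.g. a reversible isothermal expansion:

$$S_2 - S_1 = \int_1^2 \frac{\delta Q_{\mathrm{rev}}}{T} = nR \ln\frac{V_2}{V_1} \;\geq\; 0 \quad \text{for } V_2 \geq V_1.$$

Note that the 2nd law here compares only the two equilibrium endpoints; it assigns no entropy to the turbulent intermediate states.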

This is one of the reasons why it makes no sense in thermodynamics to talk about the thermodynamic entropy of systems such as a living cell, a fly, the Earth or the Universe. These are not systems in thermodynamic equilibrium and are not eligible for a thermodynamic description (in the above restricted sense).

Finally, this means that the above-mentioned derivations do not actually derive the 2nd law of thermodynamics at all, but only a theorem about the evolution of a certain theoretical quantity - the information entropy of a coarse-grained description, $I_{CG}$ - that is similar in wording to the 2nd law of thermodynamics but has a completely different meaning.

The quantity $I_{CG}$ expresses ignorance about the actual microstate of the system when all we know is which cell of phase space it occupies. It is too general as far as allowed microstates go, and too specific as far as the cell specification goes, to be identified with thermodynamic entropy in all cases.

Thermodynamic entropy of an equilibrium state does correspond to an information entropy, but in a very different way: its value equals the maximum possible value of the information entropy, given the mathematical constraints on the probability distribution implied by the thermodynamic state maintained by physical constraints (e.g. the volume of the container). This is very different from coarse-graining.
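In symbols (a standard sketch of Jaynes' maximum-entropy prescription, with notation of my choosing): the equilibrium distribution maximizes the information entropy subject to the constraints fixed by the thermodynamic state,

$$\max_{p}\; \Bigl(-\sum_i p_i \ln p_i\Bigr) \quad \text{subject to} \quad \sum_i p_i = 1, \qquad \sum_i p_i E_i = U,$$

and the Lagrange-multiplier solution is the Gibbs distribution $p_i = e^{-\beta E_i}/Z$ with $Z = \sum_i e^{-\beta E_i}$. The thermodynamic entropy is the maximized value, $S = k_B(\ln Z + \beta U)$. Coarse-graining, by contrast, fixes phase-space cells rather than constraint expectations.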

If you are interested, you can read the original and more exhaustive explanations in Jaynes' contributions to physics, mainly the papers

http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf

http://bayes.wustl.edu/etj/articles/brandeis.pdf

http://bayes.wustl.edu/etj/articles/mobil.pdf - from page 141

http://bayes.wustl.edu/etj/articles/ccarnot.pdf - sec. 6 & Appendix C

  • I don't know enough about thermodynamics to understand how/whether this answers my question. – isarandi Jan 30 '15 at 19:12
  • If you'd like to study thermodynamics more, I recommend you get some good old books that treat only classical thermodynamics, and ignore statistical physics while studying them. When you understand that, then study statistical physics; I think the best advice is Jaynes' work, even for students - he says some very interesting things and does so exceptionally clearly. – Ján Lalinský Jan 30 '15 at 19:51
  • My post answers your question in a sense close to 無 (mu - see e.g. https://en.wikipedia.org/wiki/Mu_(negative) ) - there is no paradox. Shortly and again: people think there is one because they confuse thermodynamic entropy with other concepts of entropy. Entropy formulae used in statistical physics reproduce thermodynamic entropy only insofar as the probability distribution appropriate for thermodynamic equilibrium is used. – Ján Lalinský Jan 30 '15 at 19:59
  • If the initial distribution is highly non-equilibrium both for instants after and before the state is specified, as in your example, then the statistical concepts of entropy used (both by the derivations insisting on the paradox and by those claiming to resolve it) do not necessarily correspond to thermodynamic entropy, and the 2nd law has no import on their behaviour in time. – Ján Lalinský Jan 30 '15 at 20:03