Take a box of gas particles. At $t = 0$, the distribution of particles is homogeneous. There is a small probability that at $t = 1$, all particles go to the left side of the box. In this case, entropy is decreasing. However, a general principle says that entropy always increases. So where is the problem, please?
-
duplicate of http://physics.stackexchange.com/q/542/ – Jun 03 '13 at 18:38
-
The second law of thermodynamics says entropy is always non-decreasing. But this is only a law of thermodynamics, so it is only true when thermodynamics is applicable and only in the sense of thermodynamic probabilities. It is not a law of mechanics and does not follow from the laws of mechanics. Thermodynamics makes additional assumptions which are not always true. That is "the problem". If the particles in the box are in some exceptional initial conditions so that ten seconds later they are all on the left, then the assumptions of Thermodynamics were not true for this particular case. – joseph f. johnson Jun 04 '13 at 04:34
5 Answers
Right, there is a small probability that the entropy will decrease. But for the decrease by $-|\Delta S|$, the probability is of the order $\exp(-|\Delta S| / k)$, exponentially small, where $k$ is (in the SI units) the tiny Boltzmann constant. So whenever $|\Delta S|$ is macroscopically large, something like one joule per kelvin, the probability of the decrease is de facto zero.
If you have $10^{20}$ molecules of gas (which is still just a small fraction of a gram), the probability that all of them will be in the same half of a box is something like $2^{-10^{20}}$. That's so small that even if you try to repeat the experiment everywhere in the Universe for its whole lifetime, you have no chance to succeed.
Statistical physics talks about probabilities and quantities with noise, as the previous paragraph exemplifies. But there is a limit of statistical physics that was known earlier, thermodynamics. Effectively, we can say that thermodynamics is the $k\to 0$ limit of statistical physics. We just neglect that $k$ is nonzero – it is tiny, anyway. In this limit, the noise of different quantities disappears and the exponential $\exp(-|\Delta S| / k)$ is strictly zero and the decreasing-entropy processes (by any finite amount, in everyday SI-like units) become strictly prohibited.
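For a feel for these numbers, here is a quick back-of-the-envelope sketch in Python (the helper names are just illustrative); it works in $\log_{10}$ because the probabilities themselves underflow any floating-point type:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def log10_prob_entropy_drop(delta_S):
    # log10 of exp(-|dS|/k): the rough suppression factor for an entropy decrease of |dS|
    return -abs(delta_S) / (k_B * math.log(10))

def log10_prob_all_in_one_half(n_particles):
    # log10 of 2**(-N): the chance that N independent molecules all sit in the same half
    return -n_particles * math.log10(2)

print(log10_prob_entropy_drop(1.0))        # ~ -3.1e22  (an entropy drop of 1 J/K)
print(log10_prob_all_in_one_half(1e20))    # ~ -3.0e19  (10^20 molecules in one half)
```

Both exponents dwarf anything that repeating the experiment for the whole lifetime of the Universe could compensate, which is the sense in which the decrease is "de facto zero".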

- 179,018
-
Thank you! How could we justify that the probability is of the order $\exp(-|\Delta S|/k)$? – Arnaud May 08 '13 at 07:30
-
Hi Arnaud, it was a pleasure. See e.g. http://motls.blogspot.com/2013/02/ludwig-boltzmann-birthday.html?m=1 especially around the text "averaging and summing". The processes "From A to B" and "From B* to A*" are related by the time-reversal (or CPT) symmetry, so their probabilities are related. If A, B represent macrostates, i.e. ensembles of $N_A$, $N_B$ microstates respectively, the two macro-probabilities differ because we're summing over final microstates but averaging over the initial ones. The numbers of microstates are $\exp(S_A/k)$ and $\exp(S_B/k)$, respectively, and the difference follows. – Luboš Motl May 08 '13 at 08:21
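In symbols, the counting argument goes roughly like this (my notation: $a$, $b$ run over the microstates of the macrostates $A$, $B$, and microscopic reversibility gives $P(a\to b)=P(b^*\to a^*)$):
$$P(A\to B)=\frac{1}{N_A}\sum_{a\in A}\sum_{b\in B}P(a\to b),\qquad P(B^*\to A^*)=\frac{1}{N_B}\sum_{b\in B}\sum_{a\in A}P(b^*\to a^*),$$
$$\frac{P(B^*\to A^*)}{P(A\to B)}=\frac{N_A}{N_B}=\frac{e^{S_A/k}}{e^{S_B/k}}=e^{-\Delta S/k},\qquad \Delta S=S_B-S_A,$$
so the entropy-decreasing direction carries exactly the $\exp(-|\Delta S|/k)$ suppression quoted in the answer.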
-
I should add that if the process with the increasing entropy is said to have a probability of order one, the reverse process with the decreasing entropy has the exponential suppression explained above. In reality, both processes may have a smaller probability, suppressed by a non-entropic, common factor, but the entropic suppression of the "decreasing entropy" process is dominant because this factor is really expo-exponentially small. – Luboš Motl May 08 '13 at 08:23
-
I don't like this answer. The commenter says "small probability" which is true. Lubos says the "small probability" is strictly zero. This is not true. – Pricklebush Tickletush May 08 '13 at 17:41
-
@zhermes Let us be precise. He says "Effectively, in the thermodynamic limit, the exponential $e^{-|\Delta S|/k}$ is strictly zero." But real life is not in the thermodynamic limit -- this is an approximation. In the thermodynamic limit the particles will never all go to only the left side of the box -- yet this is still possible. The question of why it is still possible is left unanswered, and it has to do with a fundamental misunderstanding of what $\Delta S > 0$ means (by the OP). – Pricklebush Tickletush May 08 '13 at 18:50
-
In my opinion, the thermodynamic limit system does not exist in Nature, it is a mathematical abstraction. But some physicists believe it exists in Nature too. Be that as it may, it is an excellent approximation to some aspects of the problem it models, within the limits of the validity of its assumptions. – joseph f. johnson Jun 04 '13 at 04:37
Even though the answer you chose is very good, I will add my POV.
Take a box of gas particles. At $t=0$, the distribution of particles is homogeneous. There is a small probability that at $t=1$, all particles go to the left side of the box. In this case, entropy is decreasing.
Take the statistical mechanics definition of entropy:
$$S = -k_B \sum_i P_i \ln P_i,$$
where $k_B$ is the Boltzmann constant. The summation is over all the possible microstates of the system, and $P_i$ is the probability that the system is in the $i$th microstate.
The problem is that this one system you are postulating in your question is one microstate in the sum that defines the entropy of the system. A microstate does not have an entropy by itself, in a similar way that you cannot measure the kinetic energy of one molecule and extrapolate it to a temperature for the ensemble of molecules.
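To make the role of the summation concrete, here is a minimal Python sketch of the Gibbs formula (toy ensemble size and names of my own choosing): a probability distribution spread over many microstates has a large entropy, while a distribution concentrated on a single definite microstate has zero entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probs):
    # S = -k_B * sum_i P_i ln(P_i), summed over all microstates with P_i > 0
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

n_microstates = 10**6                                    # toy ensemble size
uniform = [1.0 / n_microstates] * n_microstates          # no idea which microstate the system is in
one_definite_microstate = [1.0] + [0.0] * (n_microstates - 1)

print(gibbs_entropy(uniform))                  # k_B * ln(10^6): the maximum for this ensemble
print(gibbs_entropy(one_definite_microstate))  # 0.0: a single microstate carries no entropy by itself
```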
An observation on systems with decreased entropy: entropy increases in a closed system. If an appropriate liquid is turned into a crystal, the ensemble of molecules will have lower entropy, but energy will have been released in the form of radiation; the system is closed only when the radiation is taken into account in the entropy budget.

- 6,986

- 233,453
-
@anna v Only macrostates have entropy defined for them. True. But a micro-state can be considered as a macro-state: in particular, the micro-state of all particles on the left-side of the box is macroscopically distinguishable, so it happens to be a macro-state, too, (OK, technically, one ought to throw in a few other micro-states indistinguishable from it, but all of them described by "all particles on the left") and has entropy (just about) zero. see http://physics.stackexchange.com/q/66651/6432 – joseph f. johnson Jun 03 '13 at 18:06
-
@josephf.johnson have you noticed the big Sigma in the definition of entropy in statistical mechanics? It means the sum total over all microstates. All else is philosophy and hand waving not dependent on math. – anna v Jun 03 '13 at 18:27
-
Your criticism applies to Boltzmann, not me. I don't use entropy in my research at all. The concept is too philosophical, and of limited utility for negative temperature states. But the OP was about entropy, so you have to use the usual formulas in the subject. One of them is Boltzmann's, and it has no summation sign in it. – joseph f. johnson Jun 03 '13 at 18:41
-
@josephf.johnson The two treatments/definitions of entropy, thermodynamic and statistical mechanics, agree with each other and are equivalent alternatives. Physics is not optional. – anna v Jun 03 '13 at 18:57
-
¡Then allow me to use the formula without the summation sign! But in fact they are not completely equivalent, the thermodynamic definition of entropy as integrating factor is only valid at equilibrium, but Boltzmann's formula works even for macro-states which are not in equilibrium. And Gibbs's formula, the one with the summation sign, does too. (And is more general than Boltzmann's since it allows for non-uniform distributions, which is useful for quantum theory and information theory.) Under certain hypotheses, all three are equivalent. – joseph f. johnson Jun 03 '13 at 19:06
Statistical physics doesn't tell you that entropy will increase all the time. Just that it will increase on average.
The maximum-entropy state is the one with the largest number of microstates. This doesn't prevent you from observing an odd state every once in a while -- even one with very low probability -- in fact fluctuations of the state do happen and are measurable. These fluctuations are centered about a state called equilibrium -- but if entropy could only strictly increase, these fluctuations would not happen in the first place. Every isolated system would be stuck in equilibrium forever.
To elaborate on the point about averages -- the state which maximizes the entropy is the most probable one, and a probability is itself a statement about averages.
So for any moment in time our world is exceedingly more likely to increase its entropy than it is to decrease it, but there is no law, besides the law of large numbers, that prevents it from going the other way.
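As a rough illustration of these fluctuations, here is a minimal Monte Carlo sketch in Python (sample sizes and names are just illustrative): the fraction of particles found in the left half fluctuates around 1/2, and the fluctuations shrink quickly as the particle number grows.

```python
import random

def left_fraction(n_particles):
    # place each particle independently in the left or right half; return the left-half fraction
    return sum(random.random() < 0.5 for _ in range(n_particles)) / n_particles

random.seed(0)
for n in (10, 100, 1000):
    samples = [left_fraction(n) for _ in range(1000)]
    mean = sum(samples) / len(samples)
    spread = max(samples) - min(samples)
    print(f"N = {n:5d}: mean left fraction {mean:.3f}, spread {spread:.3f}")
```

For $N$ of the order of Avogadro's number the spread becomes immeasurably small, which is why the "all on the left" observation is possible in principle but never seen in practice.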

- 3,668
-
I don't understand this point. $S = -k \sum p \ln p$, so when we calculate $S$, we already take into account the probabilities of each state. What is the "average" of $S$? – Arnaud May 08 '13 at 08:00
-
@Arnaud probabilities are an average. If I tell you that the odds of rolling a 2 on a six-sided die are 1/6, that means on average a sixth of your rolls will be a 2. – Pricklebush Tickletush May 08 '13 at 17:39
-
Is the following true: we cannot know that all particles will go to the left side, so $S$ is increasing due to our lack of knowledge: we continue to assign positive probabilities to other states! – Arnaud May 08 '13 at 17:44
-
@Arnaud no. It has nothing to do with a lack of knowledge. There are more ways for the gas molecules to arrange themselves around the whole box than there are ways to arrange themselves around part of the box. We add up all the probabilities. If you consider a large number of identical systems and check which microstate they are in, they are most likely to be found in the state with the highest entropy. So on average this is the state you observe. Like you correctly said, there is nothing to prevent you from seeing the gas on the left side of the box. – Pricklebush Tickletush May 08 '13 at 17:48
-
@Arnaud I edited my answer. Please tell me if it's more clear now. – Pricklebush Tickletush May 08 '13 at 17:55
-
It helps me. But with my example of the gas box, there is something that I don't understand. Is it really possible to know that all the particles are in the left side? If the answer is yes, then I don't understand why we use probabilities in statistical physics: we only have to observe the system! If the answer is no, then why has $S$ decreased? The probabilities haven't changed! – Arnaud May 08 '13 at 18:01
-
@Arnaud classically, of course it is possible, at least in principle. Here is an experiment to do so. At every instant in time put a partition in the middle of the box and then count the number of molecules on each side. If you count zero in one side of the box then your state must have had all its particles on one side. There is nothing to prevent this from happening. – Pricklebush Tickletush May 08 '13 at 18:04
-
@Arnaud the problem is that it's too difficult to do experiments where you know absolutely everything, so instead we work with "what is most likely to happen." The great thing about this approach is that when you consider a large number of particles (on the order of Avogadro's number $10^{23}$) the probability that you will measure the system in any other state is so small as to be totally irrelevant. Sure, it may be that $10^{500}$ universe lifetimes from now an alien planet measures that same box of gas to have all its molecules on the left side, but every other time the theory works fine. – Pricklebush Tickletush May 08 '13 at 18:06
-
@Arnaud in the alien example, if they were to perform the experiment a moment later they would find (with overwhelming probability) that the gas had returned to equilibrium. – Pricklebush Tickletush May 08 '13 at 18:07
-
@Arnaud the point is that entropy does not always increase. At each point in time it is most likely to increase -- but it's not necessary that this happens. I can't answer your later question -- about lack of information and such -- because I can't really understand it. – Pricklebush Tickletush May 08 '13 at 18:46
Think of entropy as a steady-state quantity related to system dynamics: Wait for a 'long time', smear out the phase space trajectories and measure the resulting volume.
This means that even if all gas particles ended up on the left side of the box (unlikely, but not impossible and realized by perfectly valid microstates), entropy would only have decreased if they stayed there and never expanded to fill the whole volume again (which is even less likely and assumed to be impossible under the fundamental assumption of thermodynamics).
Note that even though the situation (all gas particles moving to the left side of the box) looks vastly different from the idealized equilibrium picture (think about what happens to pressure!), this is perfectly fine as thermodynamic variables are subject to random fluctuations, which can be large (but probably won't be).
This is essentially the same point (or rather one of the points) the accepted answer makes if you replace ensemble averages by time averages - I just prefer the dynamic picture I painted here:
Physics isn't really about abstract ensembles, information and missing knowledge - it's about energy, activity and variability: If you prayed to god and 'e felt generous and shared the knowledge about a particular thermodynamic system with you, that knowledge would not have any effect on your measurements. Dynamics place limits on relevant knowledge, but knowledge doesn't cause dynamics.
Also, as far as I'm aware, Luboš's comments are misleading: non-equilibrium thermodynamics is still largely an open problem, and the thermodynamic definition of entropy explicitly depends on equilibrium, regardless of whether you use the traditional definition based on Clausius' 19th-century analysis or prefer a more axiomatic treatment.
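As a toy illustration of the time-average picture above, here is a small Python sketch (an Ehrenfest-urn-style dynamics, chosen purely for simplicity, with made-up parameters): even if the system is started with every particle on the left, it relaxes, and the fraction of time it afterwards spends in that low-entropy configuration is only of order $2^{-N}$.

```python
import random

def time_fraction_all_left(n_particles=10, n_steps=200_000):
    # toy dynamics: at each step one randomly chosen particle hops to the other half of the box;
    # start from the low-entropy configuration with every particle on the left and record how
    # often the "all on the left" configuration recurs along the trajectory
    n_left = n_particles
    all_left_steps = 0
    for _ in range(n_steps):
        if random.random() < n_left / n_particles:
            n_left -= 1   # the chosen particle was on the left; it hops right
        else:
            n_left += 1   # the chosen particle was on the right; it hops left
        all_left_steps += (n_left == n_particles)
    return all_left_steps / n_steps

random.seed(1)
print(time_fraction_all_left())   # of order 2**-10 ~ 0.001: the fluctuation recurs, but rarely
```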

- 13,545
-
@Arnaud: it is indeed hard to define entropy if you don't at least assume local equilibrium; take the dual to your experiment: confine the gas to one side of the box, remove the wall and let it expand; because the rate of expansion is fixed, at each point in time you could re-introduce the wall (freeze the instantaneous system parameter volume) and define the entropy of the expanding gas as the entropy of that equilibrium system – Christoph May 08 '13 at 09:18
-
I don't think it's right, Christoph. What is true is that one needs some at least local equilibrium to define the temperature - because the temperature labels rather well-defined mixed states e.g. $\exp(-H/kT)$ in QM - but not the entropy. Entropy is well-defined for time-dependent processes. Indeed, it has to be well-defined because the second law of thermodynamics says how it changes during such processes. If we could only define the entropy at equilibrium, the second law about the "strict increase" would never hold because the increase would be incompatible with the equilibrium. – Luboš Motl May 08 '13 at 09:22
-
There are uncertainties in the definition of the entropy: in practice, $\pm C\times k$, where $C$ is of order one and $k$ is the Boltzmann constant, is the minimum error introduced by making conventions about ensembles etc. And in statistical physics, the entropy decreases, usually by a tiny amount, a small percentage of the time. The larger the decreases we consider, the less likely they become. But if we only look at long enough intervals $\Delta t$, over which the expected entropy change is macroscopic, the percentage of the "large steps" where entropy went down becomes zero. – Luboš Motl May 08 '13 at 09:24
-
@LubošMotl: "Entropy is well-defined for time-dependent processes. Indeed, it has to be well-defined because the second law of thermodynamics says how it changes during such processes" - I don't think that necessarily follows: in particular, there are formulations of the 2nd law that explicitly state "There exists for every system in equilibrium a property called *entropy*", and for irreversible processes the 2nd law only makes a statement about initial and final equilibrium states – Christoph May 08 '13 at 11:00
-
I'd also like to throw in some nice quotes from this paper: "As the thermodynamic entropy is not measurable except when the process is reversible, the second law remains useless as a computational tool." and "It is (has?) not been possible to show that the statistical entropy is identical to the thermodynamic entropy in general." – Christoph May 08 '13 at 11:38
-
I am saying that this restriction is in no way necessary for anything and it conflicts with the discussion about what the entropy is doing in between which is a totally legitimate discussion. You may prevent yourself from talking about these matters by unjustified extra restrictions but you shouldn't and even if you do, it should prevent you from trying to answer similar questions as well. – Luboš Motl May 08 '13 at 11:38
-
Both major parts of the quoted claim above are completely wrong. There is no sharp open question about the identification of the statistical and thermodynamic entropy in the thermodynamic limit; and it is not true that entropy is only measurable for reversible processes - almost no processes in the real world are reversible, so this would mean that entropy is almost never measurable, which is just false. At any rate, if you followed your philosophy about the restriction, you should have honestly answered "my understanding of the entropy doesn't allow me to discuss these matters", not what you did – Luboš Motl May 08 '13 at 11:40
-
My understanding, ahem, these questions are tricky, is that the thermodynamic definition of entropy does indeed depend on equilibrium. But the "information-theoretic" definition of Gibbs and Boltzmann does not: it is defined for any macro-state by the usual formula $k\log\Omega$ or by the formula in anna v's answer. I am not sure how many people besides me call Boltzmann's definition, or Gibbs's, "the information-theoretic one", so don't quote me on that. – joseph f. johnson Jun 03 '13 at 17:50
-
@josephf.johnson: basically, there are 2 points of view - the Bayesian/information-theoretic one, which is important in statistical mechanics to explain where thermodynamics comes from and why it works at all; however, once we've arrived at equilibrium thermodynamics, a frequentist/dynamical point of view becomes important so we can relate abstract statistical quantities to physical ones (temperature as average energy in case of equipartition, pressure as number and strength of collisions if the particle approximation holds, the characterization of entropy given above for ergodic systems) – Christoph Jun 03 '13 at 18:43
-
(continued) some people claim a single point of view works for everything, and that is - with respect - bullshit; in particular, the physics of equilibrium systems make a lot more sense if you take the point of view I described in my answer – Christoph Jun 03 '13 at 18:49
There is a certain amount of confusion about what is the difference between a macro-state and a micro-state.
Formally, a micro-state is the complete specification of all physical properties of the system: the complete location and momentum of each individual molecule. A macro-state, formally, can be any set of micro-states. Even more generally, and this is the setting of the formula, due to Gibbs, which anna v has written, a macro-state can be any probability distribution over micro-states, i.e., any collection of micro-states $i$ together with a coefficient $p_i$ for each micro-state; the coefficients are "probabilities", whatever that means, so they have to be positive real numbers that add to one. Boltzmann, and we too from now on, will simplify life and assume that all micro-states have equal likelihood, and we will even simply count the micro-states, giving each a weight of one. (He has been criticised for this but it works.)
Strictly speaking, only macro-states have entropy. The entire ensemble is a macro-state, it is the set of all possible micro-states. But it is not the only macro-state. For $A$ any set of micro-states, let $\Omega$ be the number of micro-states in it. Boltzmann's definition of its entropy is $$S = k \log \Omega.$$
Nothing goes wrong with StatMech if you stick to this formalism, but there is supposed to be a physical intuition behind this distinction. Different people make different points here, but what we can use is this intuition:
a macro-state should be a collection of all the micro-states which are macroscopically indistinguishable from each other in some respect.
Now the OP asked about when all the molecules are in the left side. This is almost a macro-state. But not quite. But let us say A means "99.99% of the molecules are in the left side". This can be detected macroscopically, and if there are about $10^{23}$ molecules, and if each one has only about $7^{103}$ different ways to be "on the left side", then there are something like ... well, this is left as an exercise... there are a lot of ways that .01% of them can be on the right side, but this is still way, way less than the number of different ways for "50% plus or minus 0.01% to be on the left, and the rest on the other side". The entropy of these two macro-states will be vastly different. The macro-state which anna v has in mind is the macro-state of all micro-states, but that macro-state is not the only one that can be studied. The OP would not be far off thinking of the micro-state "all particles on the left" as if it were a macro-state and calculating its entropy to be zero. In comparison to the entropy of the other macro-states, it is zero.
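A rough numerical version of that exercise, counting only which half each molecule is in (a Python sketch with a toy particle number and helper names of my own; the log-gamma function keeps the huge binomial coefficients manageable):

```python
from math import lgamma

k_B = 1.380649e-23  # Boltzmann constant in J/K

def ln_binom(n, m):
    # ln of the binomial coefficient C(n, m), computed via log-gamma so that huge n is no problem
    return lgamma(n + 1) - lgamma(m + 1) - lgamma(n - m + 1)

def boltzmann_entropy(n, m):
    # S = k ln(Omega) for the macrostate "exactly m of the n molecules are in the left half"
    return k_B * ln_binom(n, m)

N = 1e23                                  # toy particle number, about a sixth of a mole
print(boltzmann_entropy(N, N))            # all on the left: Omega = 1, so S = 0
print(boltzmann_entropy(N, 0.9999 * N))   # 99.99% on the left: ~ 1.4e-3 J/K
print(boltzmann_entropy(N, 0.5 * N))      # 50/50 split: ~ 0.96 J/K, essentially k N ln 2
```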
The OP started by talking about a homogeneous distribution of positions (and presumably momenta obeying the Maxwell distribution) of the particles at time t = 0. This is also a macro-state, call it B. There are a truly vast number of different micro-states all of which obey this description. Overwhelmingly more than the number in the macro-state A. So it has its own entropy, too. It is, however, more or less certain that at least one of the micro-states in B really would evolve into a micro-state in A. But the probability that our system happens to occupy such a micro-state is as small as Prof. Motl says it is.
See Sir James Jeans's discussion of a fire making a kettle of water freeze, which I posted as part of an answer to a similar question, Does entropy really always increase (or stay the same)?

- 6,986
-
I think our difference lies in how one gets all molecules on the left. If it is a statistical fluctuation, it is a statistical fluctuation within the total sample, i.e. momenta, energies, directions all happen to end up on the left, which is Lubos' answer. You are saying that a subset of microstates can be treated as a macrostate. It is not a closed system though. imo one can play the game of calculating increase/decrease of entropy for a closed system. The system of microstates all on the left is not a closed system since it communicates with the empty right. – anna v Jun 04 '13 at 04:07
-
The way I read the OP's post is the relatively naive way. E.g., the word "system" is not used at all. Nor is "state", nor "micro-state", etc. So I worked hard to explain "micro-state" and "macro-state." Now you are using the word "system" and there seems to me to be a mistake in the way you are using it. In this context, the system is the dynamical closed system of all the particles together. The system can be in a micro-state, we can have incomplete information about it and not know which micro-state it is in but assume it is in a macro-state...but it makes no sense to ask whether a – joseph f. johnson Jun 04 '13 at 04:28
-
state is a closed system or not. So it makes no sense to wonder whether the set A (not system) of microstates which are 99.99% on the left is "closed" or not: it is a macro-state of the closed system of particles bouncing around in a box. – joseph f. johnson Jun 04 '13 at 04:29
-
well, this is your own definition. For me macrostates are the complete set of variables etc. in the box or whatever, as seen in thermodynamic equations. A microstate is when I take a microscope and look at its particulate physics and express the thermodynamic quantities of the total in statistical terms. – anna v Jun 04 '13 at 05:52