6

Wikipedia - Second law of thermodynamics:

...the entropy of any closed system not in thermal equilibrium almost always increases.

I understand that the second law of thermodynamics is based on the statistical unlikelihood that fast-moving molecules will aggregate.

However, if it is only "almost always" then why is the phenomenon stated as a law?

Is it because we have not yet observed the unlikely aggregation?

Pacerier
  • 893
dotancohen
  • 4,545

5 Answers

9

You say:

Of course I understand that the second law of thermodynamics is based on the statistical unlikelihood that fast-moving molecules will aggregate

and the word "almost" just means "statistical unlikelihood".

I suppose calling the second law a "law" is a matter for debate. Statistical thermodynamics gives you (in principle) a way of precisely calculating the probability that the entropy will increase, so even though the statement "will almost always increase" sounds vague it can be made as precise as you want. It seems to me that this justifies calling the second law a "law".
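To illustrate how "almost always" can be made precise, here is a minimal sketch (my own toy example, not from the answer): for an ideal gas of N independent molecules, the probability that a fluctuation puts every molecule in the left half of its box is exactly 2^(-N), which collapses astronomically fast as N grows.

```python
# Toy calculation: probability that all N molecules of an ideal gas are
# found, by pure chance, in the left half of their box. Each molecule is
# independently in either half with probability 1/2, so
#   P(all left) = 2**(-N)
for n in (10, 100, 1000):
    p = 2.0 ** (-n)
    print(f"N = {n:5d}  ->  P(all in left half) ~ {p:.3e}")
```

Already at N = 100 the probability is of order 10^-31; for a macroscopic N ~ 10^23 it is unimaginably small, which is what licenses calling the statement a "law".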

John Rennie
  • 355,118
5

"the entropy of any closed system not in thermal equilibrium almost always increases"

This doesn't (or at least shouldn't) refer to probability, which is a concept completely foreign to thermodynamics. Thermodynamics assumes (from a statistical mechanics perspective) the thermodynamic limit of infinitely many particles, in which case statistical irregularities are completely absent.

If a system is too small for the thermodynamic limit to be a good approximation, thermodynamics no longer applies, so the question of the validity of the laws for such a system is no longer sensible. The whole thermodynamic formalism and terminology breaks down for such a system, not only the second law. (Just as the fact that the laws of Germany are not applicable in Austria doesn't mean that they are only almost always valid.)

It also cannot refer to the fact that entropy is constant in equilibrium, since the statement explicitly assumes a nonequilibrium state.

Therefore the statement can sensibly refer only to the fact that there are many systems in nature that are in a metastable state only (see http://en.wikipedia.org/wiki/Metastable_state). Thus they are not in equilibrium but retain their state (and hence don't change their entropy) unless seeded from outside (loss of closure), in which case they suddenly undergo a phase transition.

  • Thanks, Arnold. I had not considered the possibility of a metastable system, and if that is in fact the reason for the word "almost" then I think that it should be clarified or even omitted. – dotancohen Mar 03 '12 at 13:19
4

The way the second "law" actually works is like this: any given system has a multitude of states it can occupy. It is assumed all of these states are equally likely. In that case, states which exist in a multitude of permutations (but are physically equivalent) are more likely to be observed, i.e., you are much more likely to see BAAA=ABAA=AABA=AAAB than to see AAAA, assuming all these are possible.
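The counting in this paragraph can be checked directly. The following sketch (my own illustration of the answer's 4-site A/B toy model) enumerates all 16 microstates and groups them into macrostates by the number of B's:

```python
from itertools import product
from collections import Counter

# Toy model from the answer: 4 sites, each either A or B. All 2**4 = 16
# microstates are assumed equally likely; the "macrostate" is the number
# of B's. Microstates like BAAA, ABAA, AABA, AAAB are physically
# equivalent and all belong to the one-B macrostate.
microstates = [''.join(s) for s in product('AB', repeat=4)]
multiplicity = Counter(s.count('B') for s in microstates)

for n_b in sorted(multiplicity):
    w = multiplicity[n_b]
    print(f"{n_b} B's: multiplicity {w}, probability {w}/16")
```

The one-B macrostate has multiplicity 4, so it is four times as likely to be observed as AAAA (multiplicity 1), even though every individual microstate is equally probable.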

There is not actually any kind of mechanistic rule attached. Heat "flows" from a hot to a cool system simply in that there are more units of energy in the one, so in the random transfer processes, it is more likely to observe a unit of energy transferring from the hot system. In practice, the numbers which represent the number of possible states near equilibrium are so large that they are hard even to compute with (like 10^100 factorial) so these vastly dominate the probability of events.

But apart from commenting on where the dice are most likely to land, there is no rule in quantum thermodynamics which says heat has to transfer from the hot system to the cold system. It can just as well do the reverse, and this will become more apparent as you examine smaller scales.
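The "no mechanistic rule" picture above can be sketched as a small Monte Carlo simulation (my own construction, not from the answer): two subsystems share a fixed number of energy units, and at each step one unit, chosen uniformly at random, hops to the other side. A transfer out of the hot (energy-rich) side is simply more probable; nothing forbids the reverse.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Two subsystems sharing 100 indistinguishable energy units. Each step
# picks one unit uniformly at random and moves it to the other side, so
# the chance a unit leaves the hot side is e_hot / (e_hot + e_cold).
e_hot, e_cold = 90, 10
for step in range(10000):
    if random.random() < e_hot / (e_hot + e_cold):
        e_hot, e_cold = e_hot - 1, e_cold + 1   # unit leaves the hot side
    else:
        e_hot, e_cold = e_hot + 1, e_cold - 1   # the "reverse" transfer
print(e_hot, e_cold)  # fluctuates around 50/50 near equilibrium
```

With only 100 units the energies visibly fluctuate around 50/50, and brief "wrong-way" flows happen constantly; with ~10^23 units the same dynamics would pin the split to equality for all practical purposes.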

2

In order to understand the meaning of "almost" you should read about the Poincaré recurrence paradox. A gas of particles initially confined to one part of a box naturally expands to fill the whole box, increasing its entropy. But there is a nonzero probability that a fluctuation occurs in which the whole gas is again confined to one part of the box: the entropy decreases again. This is a purely mechanical consideration, of course. Poincaré showed that the time needed to see such a fluctuation is far longer than the estimated age of the universe, so it is physically meaningless.

I won't reproduce Poincaré's proof here, but before reading it you should also look at Liouville's theorem.

-1

I think the author has used the word "almost" to incorporate the possibility that $ \Delta S $ can also be equal to zero. The exact statement of the second law of thermodynamics goes this way:

An isolated system evolves in such a way that $ \Delta S \geq 0 $.

This means that an isolated system should evolve in such a way that the multiplicity remains the same or increases.

For example, consider a simple system with two macrostates $A$ and $B$, with $4$ and $6$ microstates respectively. If we find the system in macrostate $A$, it can evolve and remain in the same state or can move to macrostate $B$. But, if we find the system initially in macrostate $B$, it will remain in macrostate $B$ and never transit to macrostate $A$.
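The two-macrostate example can be made quantitative (my own sketch of the answer's hypothetical system): if every microstate is equally likely, the system is simply found in each macrostate in proportion to its multiplicity.

```python
# Hypothetical two-macrostate system from the answer: macrostate A has 4
# microstates, macrostate B has 6. With all microstates equally likely,
# P(macrostate) = multiplicity / total microstates.
omega_a, omega_b = 4, 6
total = omega_a + omega_b

p_a = omega_a / total
p_b = omega_b / total
print(f"P(A) = {p_a:.1f}, P(B) = {p_b:.1f}")  # P(A) = 0.4, P(B) = 0.6
```

Note that for such a tiny system the chance of finding it in $A$ is still 0.4, so "never transits to $A$" only becomes effectively true when the multiplicities are astronomically lopsided, as they are for macroscopic systems.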

To my knowledge, no one knows why nature follows this rule!

orionphy
  • 1,266
  • 4
Of course we know. Statistical thermodynamics tells us that the rule is followed because the probability of disorder increasing is practically 1. It is within the classical thermodynamic system, a mathematically consistent system, that it is an absolute law. – anna v Feb 15 '12 at 15:08
@annav: but traditional statistical mechanics makes some assumptions on the behaviour of physical systems whose only real justification is that they lead to results compatible with thermodynamics. However we can't really reduce them to physical laws independent of thermodynamics. (Maybe we can actually, but I don't think there is a definitive solution yet. FWIW I've just started working on that very matter.) – leftaroundabout Feb 15 '12 at 16:09
  • 6
    Comments to the answer(v1): The word almost is not just included to allow $S$ to stay constant, but also to take into account the small statistical probability that $S$ actually decreases. – Qmechanic Feb 19 '12 at 12:58
  • 1
That Nature follows the laws of thermodynamics is a well-known consequence of statistical mechanics and the idealization that the particle number can be taken as infinite. The last assumption eliminates arguments about Poincaré recurrence, which are valid only in finite systems. As the universe may well contain infinitely many particles, this assumption is perhaps even true in a literal sense, rather than an approximation only. – Arnold Neumaier Mar 02 '12 at 16:05
  • 3
    Well, not every application of statistical mechanics considers the whole universe. In principle, one may monitor the entropy of a very small finite system, and in such system, there could be a non-zero chance that the entropy decreases once in a while, so OP's question is not just academic. – Qmechanic Mar 02 '12 at 18:26
  • @Qmechanic: But thermodynamics doesn't apply to a very small finite system - trying to do so is like trying to apply classical mechanics to a molecule. The assumptions for its validity are not given, so other methods must be used. In other words, as long as thermodynamics applies, the second law is true (almost by definition), and entropy cannot decrease. – Arnold Neumaier Mar 03 '12 at 15:54
  • 3
    Well, we can still do statistical mechanics on a finite system, and extract thermodynamical quantities from it (possibly with error-bars that grows if we make the system smaller). But this is not my main point. My main point is that for certain physical systems, it is possible to make statistical precise that the entropy may decrease once in a while. – Qmechanic Mar 03 '12 at 16:41