16

I am a high school student who has been trying to wrap my head around the second law of thermodynamics for the past few days, to no avail. Having only a cursory knowledge of calculus, and of chemistry and physics in general, doesn't help either.

  1. The second law of thermodynamics says that the entropy of the universe always increases. Under constant-pressure, constant-temperature conditions, the Gibbs free energy equation is used to calculate whether a reaction is spontaneous, meaning whether it will occur or not.

  2. The more I try to read about it, the more evidence I find against the above paragraph. After having read about the Poincaré recurrence theorem, Maxwell's demon, and this excellent Quora answer, I would say that the whole second law of thermodynamics is a farce: a plot by Gibbs, Helmholtz, Boltzmann, and Maxwell to dupe students while they laugh from the heavens. Please excuse my rambling; it's the product of tearing out half my hair trying to understand this.

  3. From what I have read, it seems that the second law is not really a law, but a statement about the most probable arrangement of a given system. Of course, I don't claim to understand anything from the sources I have mentioned, nor do I think I will understand before at least an undergraduate course in partial differential equations, calculus and all the other prerequisites required to even start.

  4. So my goal in asking this question is to find out whether anyone is capable of and willing to write a concise, simple explanation for a high school student that would also sort out all the fallacies I have mentioned above, or can direct me to someone who can. I understand that this might be a Feynman-esque feat not suitable for this site, and I apologise for that.

EDIT: I have gained a somewhat good understanding of the second law (for a high school student), so my question is not as open-ended as it was. What I really want to ask now is: what would it mean for entropy to decrease, if there were an isolated system small enough that the chances of non-spontaneous events happening were not 1 in TREE[1000]?

Would all laws of thermodynamics go out of the window? It seems to me that this weakness (I don’t know how to phrase this) of the second law is largely ignored because the chances of this happening are approximately 0.

Of course, all this rests on the assumption that entropy can decrease, which is what I have gathered; not everyone agrees, but many do. If it can decrease, doesn't that mean that as the system gets smaller, the laws of thermodynamics get weaker?

Where do you draw the line after which the laws of thermodynamics are not reliable?

Also, when I use the Gibbs equation to find the boiling point of water at NTP, would that boiling point change as I reduced the number of particles?

Is my boiling point wrong? Boiling point is a bulk property, but you could easily substitute a chemical reaction for it.
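For concreteness, here is the kind of Gibbs calculation the question refers to: at the boiling point, liquid and vapor are in equilibrium, so $\Delta G = \Delta H - T\Delta S = 0$ and $T_b = \Delta H_{vap}/\Delta S_{vap}$. This is only a sketch using approximate textbook values for water at 1 atm:

```python
# Sketch: estimating the boiling point of water from the Gibbs equation.
# At the phase boundary Delta G = Delta H - T * Delta S = 0, so
# T_b = Delta H_vap / Delta S_vap. Values below are approximate.

delta_H_vap = 40.7e3   # J/mol, enthalpy of vaporization of water (approximate)
delta_S_vap = 109.0    # J/(mol K), entropy of vaporization (approximate)

T_boil = delta_H_vap / delta_S_vap
print(f"Estimated boiling point: {T_boil:.0f} K")  # about 373 K
```

Note that $\Delta H$ and $\Delta S$ here are bulk (per-mole) quantities; the question of what happens to them as the particle count shrinks is exactly the fluctuation issue being asked about.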

7 Answers

26

I'm going to specifically address the two concepts you brought up in your second point:

The Poincaré recurrence theorem

In layman's terms, this theorem reads: "For any system in a large class of systems that contains systems in thermodynamic equilibrium: if you take a picture of the arrangement of the system at a particular instant, then if you wait long enough, there will eventually be another instant in which the system's arrangement is very close to the one in the picture." This doesn't actually contradict anything in thermodynamics, because thermodynamics is built such that it doesn't really care what specific arrangement the system is in at a particular instant. That's the reason it was developed, after all: it's impossible to measure the precise positions and velocities of $10^{23}$ particles at once, so we have to have a way to deal with our lack of knowledge of the initial state of a system. This is where thermodynamics comes in: it turns out that if you make some fairly simple assumptions about the microscopic behavior of a system, then you can make accurate predictions about how the system behaves in equilibrium.

At any instant, a system in thermodynamic equilibrium is in a particular specific arrangement, which we will call a microstate. If you watch a system in thermodynamic equilibrium, it will adopt many, many different microstates. Thermodynamics makes the assumption that every accessible microstate is equally probable. If you take the set of all microstates that a given system can adopt in equilibrium, that set is called the macrostate of the system. Thermodynamic quantities are defined only on the macrostates. For example, there is no such thing as the entropy of a microstate. The entropy is a property of a system in equilibrium, not a particular arrangement of atoms.

So, if a system in equilibrium is in a macrostate that contains a highly-ordered microstate, the fact that the system can sometimes be in that microstate has absolutely no bearing on the entropy of that system. The existence of that microstate was already accounted for when calculating the entropy. So the Poincaré recurrence theorem doesn't really have much at all to do with the second law of thermodynamics, which talks only about how entropy behaves when a system moves between different macrostates.
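The microstate/macrostate counting described above can be made concrete with a toy model. This is only an illustrative sketch (a hypothetical system of $N$ two-state particles, with $k_B = 1$), not a real thermodynamic calculation:

```python
from math import comb, log

# Toy model: N two-state "particles". A microstate is the full sequence of
# individual states; a macrostate records only n, the number of "up" states.
# The (dimensionless) entropy of a macrostate is S = ln(Omega), where Omega
# is the number of microstates it contains.

N = 10
for n in range(N + 1):
    omega = comb(N, n)   # microstates in the macrostate "n particles up"
    S = log(omega)       # Boltzmann entropy with k_B = 1
    print(f"n={n:2d}  Omega={omega:4d}  S={S:.2f}")
```

The fully ordered arrangement (n = 0 or n = N) is a single microstate belonging to a macrostate with $\Omega = 1$ and $S = 0$; its existence is already included in the count, which is the point made in the paragraph above.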

Maxwell's Demon

Maxwell's Demon does not violate the second law of thermodynamics, because the decrease of entropy inside the chamber is more than counterbalanced by the increase of entropy of the demon itself (or the environment). In order to do its job, Maxwell's demon must measure the velocity of a particle. To act on that measurement, the measurement value must be stored somewhere. Even if the measurement is done in a completely reversible fashion, without expending energy, the stored information from the measurements must either accumulate over time, or be erased. The key point is that erasing information increases entropy. Any physical Maxwell's demon must have a finite information storage capacity, and so must eventually start erasing as much information as it records. So, in equilibrium, the increase in entropy due to the continual erasure of information in the demon is greater than or equal to the decrease in entropy inside the chamber.
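The erasure cost mentioned above has a concrete lower bound, known as Landauer's principle: erasing one bit of stored information increases the entropy of the surroundings by at least $k_B \ln 2$. A quick back-of-the-envelope sketch:

```python
from math import log

k_B = 1.380649e-23  # J/K, Boltzmann constant
T = 300.0           # K, roughly room temperature

# Landauer's principle: erasing one bit of information increases the entropy
# of the surroundings by at least k_B * ln(2), i.e. dissipates at least
# k_B * T * ln(2) of heat. This is what ultimately defeats Maxwell's demon.
delta_S_per_bit = k_B * log(2)        # J/K
min_heat_per_bit = k_B * T * log(2)   # J, about 2.87e-21 J at 300 K

print(f"Entropy cost per bit erased: {delta_S_per_bit:.3e} J/K")
print(f"Minimum heat dissipated at {T:.0f} K: {min_heat_per_bit:.3e} J")
```

The number is tiny per bit, but the demon must erase one measurement record per sorted particle, so over time the accumulated erasure entropy at least cancels whatever entropy decrease it achieves in the chamber.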

14

Suppose you flip a fair coin $N=10$ times. You would expect the number of heads $n_H$ not to differ too much from the number of tails $n_T = N - n_H$, but you wouldn't be surprised if you got, for example, $n_H = 8$ heads and $n_T = 2$ tails. Indeed, we can plot the probability distribution of outcomes, and see that it is peaked around $n_H = 5$.

[Plot: the probability distribution of $n_H$ for $N = 10$ flips, peaked around $n_H = 5$]

One way to think about why this is is that if we look at all the possible sequences of heads and tails resulting from our flips, there are more sequences with similar numbers of heads and tails than there are sequences with different numbers of heads and tails. For $n_H = 5$, we could have HTHTHTHTHT, HTTHHTTHHT, etc., but for $n_H = 10$, there is only one possible sequence of outcomes, namely HHHHHHHHHH.

As we increase the number $N$ of coin flips, the distribution becomes more sharply peaked about $n_H = N / 2$, meaning we're increasingly likely to observe similar numbers of heads and tails. Here are the same plots for $N=10^3$ and $N=10^5$:

[Plots: the same distributions for $N = 10^3$ and $N = 10^5$, increasingly sharply peaked at $n_H = N/2$]

I can't get my computer to make a similar plot for $N=10^{23}$, but you can imagine if I did it would be just a tiny needle of a peak situated at $n_H = N / 2$. What is happening is that when $N$ is large, there are so many more sequences with similar numbers of heads and tails that it becomes increasingly improbable that we'll find large differences in these numbers (relative to the number of coin flips.)
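The sharpening of the peak can also be seen without plotting anything: for $N$ fair coin flips, $n_H$ is binomially distributed with mean $N/2$ and standard deviation $\sqrt{N}/2$, so the relative width of the peak shrinks like $1/\sqrt{N}$. A short sketch:

```python
from math import sqrt

# For N fair coin flips, n_H has mean N/2 and standard deviation sqrt(N)/2,
# so the peak's width *relative to N* shrinks like 1/sqrt(N).
for N in (10, 10**3, 10**5):
    sigma = sqrt(N) / 2
    print(f"N={N:>6}: sigma/N = {sigma / N:.5f}")
```

Carrying the same arithmetic to $N = 10^{23}$ gives a relative width of about $10^{-12}$, the "tiny needle" of a peak described above.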

This is only an analogy, but the essence of the 2nd law is here. The analogy is that the strings of outcomes are like the microstates of our system with $N$ constituent subsystems, and the number of heads is like a thermodynamic variable (a statistic) specifying the macrostate of our system. The entropy counts the number of microstates corresponding to a given macrostate (like the number of sequences of heads and tails containing a given number of heads). The 2nd law says that in thermodynamic equilibrium, the most probable macrostate is the one with the largest number of microstates, assuming the microstates are equally probable. That is, the entropy of a macroscopic system in thermodynamic equilibrium is maximized.

Is it possible in theory for a macroscopic system to be in a state that doesn't maximize the entropy? Sure, but the probability of this happening is so fantastically small, like flipping $10^{23}$ coins and all of them coming up heads, that in practice we will never observe it. This is the reason that physicists are confident that the 2nd law cannot be violated.
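To put a number on "fantastically unlikely", one can work with logarithms, since the probability itself underflows any floating-point type:

```python
from math import log10

# Probability that all of N fair coins come up heads: P = 2^(-N).
# For N = 10^23, P is far below floating-point range, so compute log10(P).
N = 10**23
log10_P = -N * log10(2)
print(f"P(all heads) = 10^({log10_P:.3e})")  # the exponent is about -3e22
```

For comparison, estimates of the number of atoms in the observable universe are around $10^{80}$; a probability of roughly $10^{-3 \times 10^{22}}$ is unimaginably smaller than one over that.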

Once one has this heuristic understanding of entropy and has had a chance to apply it to real thermodynamic systems, there are plenty of subtle things to think about, like ergodicity, Poincaré recurrence, etc. But I don't think such subtleties should get in the way of the fact that we have a pretty conceptually simple and satisfying picture of why the 2nd law must hold.

d_b
  • But what about prior probabilities? In order to make statistical predictions you need prior probabilities, where do they come from? – Marco Disce Mar 06 '21 at 21:19
  • In statistical mechanics we assume a uniform prior, in which all accessible microstates are equally probable. (An aside: is "In order to make statistical predictions you need prior probabilities" necessarily true? Frequentist inference doesn't require a choice of prior.) – d_b Mar 06 '21 at 23:15
11

Here is one way to look at it which might help (I am no Feynman!).

My way of thinking about the second law is that if left alone, a system is unlikely to evolve into a state of decreased entropy- and the more constituent particles the system contains, the less likely that outcome is.

By the time you are dealing with particle counts of order $10^{23}$, the "law" becomes law, the relationships are cast in concrete, and you will never see them violated, even if you watched that isolated system for longer than the lifetime of the universe.

You can reduce the entropy of a system by performing work on it to increase its orderliness, but in so doing that system is now no longer isolated and you will inevitably increase the entropy of the system's surroundings, which now have become a part of your system.

niels nielsen
4

I'll attempt to give a layman's conceptual view of what the laws of thermodynamics are telling you.

The first law of thermodynamics states that energy is conserved, meaning that energy can't be created or destroyed; it can only change form. This statement by itself leaves open the possibility of a device that takes heat from the environment to do work, and such a device would be a perpetual motion machine that produced "free" work.

The second law of thermodynamics states that all energy sources spontaneously go from a "more concentrated" state to a "less concentrated" state (e.g., hot objects always spontaneously cool down to ambient conditions, but cold objects never spontaneously heat up above ambient conditions). This law was necessary because the above mentioned perpetual motion machine has never been observed. Thus, the second law of thermodynamics states that energy always "runs downhill", which means that perpetual motion machines are impossible to construct.
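The "runs downhill" statement can be checked against the Clausius definition of entropy change, $\Delta S = Q/T$: moving heat $Q$ from a hot reservoir at $T_h$ to a cold one at $T_c$ changes the total entropy by $Q/T_c - Q/T_h$, which is positive whenever $T_h > T_c$. A sketch with illustrative made-up numbers:

```python
# Sketch: heat flowing "downhill" raises total entropy (Clausius form).
# Transfer Q joules from a hot reservoir at T_h to a cold reservoir at T_c.
Q = 100.0    # J, illustrative amount of heat
T_h = 400.0  # K, hot reservoir
T_c = 300.0  # K, cold reservoir

delta_S = Q / T_c - Q / T_h  # total entropy change in J/K
print(f"Total entropy change: {delta_S:.4f} J/K")  # positive, as required
```

Running the same arithmetic with the heat flowing the other way (cold to hot) gives a negative total $\Delta S$, which is exactly what the second law forbids.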

David White
  • That was exactly what I thought, but after seeing Maxwell's demon thought experiment and reading other stuff, I am not so sure. It seems that the second law doesn't prohibit perpetual motion machines, but simply states that the probability of such a machine working on a macroscopic scale for an extended period of time is infinitesimally low. It's like getting to Mars through quantum tunnelling low. – Manit Agarwal Jul 28 '20 at 16:05
  • 8
    Maxwell's demon is a figment of Maxwell's imagination. That demon doesn't exist any more than perpetual motion machines exist. And note - "infinitesimally low", from a practical standpoint, means impossible. – David White Jul 28 '20 at 16:07
  • The second law does not say heat goes spontaneously from higher concentration to lower concentration, but from higher temperature to lower temperature. Imagine a cold lump of metal warming up to room temperature - the thermal energy it absorbs from the air becomes a lot more concentrated as it moves into the lump. – pwf Jul 28 '20 at 19:40
  • 2
    This answer needs polishing to be useful even to the layman. All heat engines "take heat from the environment to do work."; I think you mean that they cannot turn thermal energy entirely into work. "[H]ot objects always spontaneously cool down, but cold objects never spontaneously heat up" is obviously false; ever take something out of the fridge? – Chemomechanics Jul 28 '20 at 20:40
  • @Chemomechanics, you are talking to the exact issue that I have seen before in this forum. The OP clearly stated that he is a high school student. This means that he is learning about thermodynamics for the first time. I taught high school physics for 13 years, and I can tell you unequivocally that the normal amount of "polishing" that this forum would want to see would quickly get to be too abstract for the OP to follow. However, note that I did modify my statement about hot objects cooling down and cold objects warming up. – David White Jul 29 '20 at 02:41
  • Putting aside the problematic issue of using "heat" and "work" as nouns, it takes so little effort to simply say: "We cannot turn heat entirely into work." – Chemomechanics Jul 29 '20 at 02:58
4

No, the second-law-of-thermodynamics isn't a hard law. Nothing they teach in school really is. For example, that stuff about Newton's laws of motion isn't hard-law, either.

Historically, engineers discovered Classical Thermodynamics. The field itself is just how they made their machines work. Academics came and formalized stuff as time went on.

Academics were confused because they had two successful theories of physics: mechanical theories (like Newton's laws) and classical thermodynamics. These were very different theories, but somehow they both seemed to work. How can they be combined into one, coherent philosophy?

The answer was Statistical Mechanics. Turns out that Classical Thermodynamics can be seen as Mechanical Physics applied at a grand scale, to tons and tons of little particles. For example, the second-law-of-thermodynamics – which had previously been believed in just because it seemed to hold true in the lab – was now almost a mathematical truth of the universe.

This mathematical-ish justification elevated the second-law-of-thermodynamics from an empirical law to a meta-physical truth behind how physics must work at larger scales. This is why it's often trusted with such confidence, beyond that afforded to even the most highly respected empirical laws:

The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

Arthur Eddington, as quoted by Wikiquote, in "The Nature of the Physical World" (1928), Chapter 4

Our confidence in the second-law-of-thermodynamics is so strong that it's beyond even our confidence in gravity. For example, if we were to wake up and discover that this entire world was merely a Matrix-like scenario, where everything we thought that we knew about physics was just an illusion, the second-law-of-thermodynamics would still hold – the outer universe would have to obey it, even if forces like gravity were entirely fictional.

Now to address the confusion:

Despite our extreme confidence in the second-law-of-thermodynamics, we don't actually expect the naive, Classical Thermodynamics version of it to be perfect. In fact, given our understanding of it now, we expect that it's not.

This isn't a contradiction, just a matter of needing to be precise: we're extremely confident in the general principle and stuff like it holding statistically; that's what all of the fuss is about! However, we don't expect the naive, Classical Thermodynamics notion of the second-law-of-thermodynamics to be absolute; that was never a mainstream position.


Regarding the Poincaré recurrence theorem.

Yes, the Poincaré recurrence theorem demonstrates that the second-law-of-thermodynamics, as imagined in Classical Thermodynamics in the context of the physics posited in Statistical Mechanics, can't be absolute.

Nat
  • Basically, what you're trying to say is that entropy may fluctuate in a system, but in general it will tend to increase to the maximum possible value. Yes? – Manit Agarwal Jul 29 '20 at 10:02
  • @ManitAgarwal-Elpsycongroo: Yup, that's probably a good perspective to focus on. Entropy's a huge topic that may require a good bit of study to appreciate, so the picture kinda evolves. – Nat Jul 29 '20 at 10:34
  • If entropy fluctuates it will have expansions and contractions without any privileged position for expansions – Marco Disce Mar 06 '21 at 21:25
3

I want to clear up for you the meaning of "laws" in physics, and to do that we have to understand what a theory (as the theory of thermodynamics we are discussing here) means in physics.

From ancient times, physics, mathematics, and philosophy were tied together. It took until Newton's time for physics theories to be clearly separated from the rest.

At present, physics is the gathering of numerical data and observations from nature, tabulating them, and looking for the best mathematical formulae and equations that can not only describe the given data but also successfully predict future measurements.

In mathematics there exist axioms from which all theorems can be proven, and these determine the form of the particular mathematical theory. The axioms are assumed to be true; they cannot be proven. At most, a theorem can be raised to the position of an axiom, and the former axiom then derived as a theorem. It is a closed system once the axioms are assumed.

When physics uses mathematics, the data automatically have to obey the mathematical axioms, but the mathematical formulae and solutions (for example, when using differential equations) are an enormous multitude, most of them not fitting any useful physics data. This brings us to the need for laws in physics. They have the power of extra axioms, picking out those solutions that describe the data and observations and are also predictive of new ones. These laws are chosen so that the particular mathematical solutions work with present and future data.

When you study physics further you will see that these extra axioms are sometimes called postulates or principles. They are a distillation from observations that allows us to pick out those mathematical solutions that are useful in describing the data (while ignoring the multitude of other mathematical solutions to the same equations).

The laws, postulates, and principles are not as strict as axioms in mathematics, because they depend on the context. In general, physics theories aim for consistency at the boundary of the phase space between two descriptions. General relativity, for example, is consistent with Newtonian physics for low masses and low velocities. Thermodynamics emerges from classical statistical mechanics when a many-particle system can be assumed, and the thermodynamic quantities emerge from the statistical behavior.

anna v
1

Harvey Brown, a philosopher of physics, puts it this way (paraphrasing):

The second law is a specific case of a more general observation of our universe, that systems out of equilibrium spontaneously tend toward equilibrium.

So why does this happen and what is the mechanism behind the above? If the physical laws are entirely symmetric at the deepest level (CPT invariance), which they are, where does the asymmetry in the arrow of time or entropy come from?

The first component is the physical laws themselves. They are symmetric and don't have any preferred direction in time, yet the vast majority of systems evolve in only one direction, increasing entropy. This happens because, while the physical laws we know and love work equally well in either direction, once a system is "large" enough, they act upon it in a way that increases the multiplicity exponentially as time goes on. (Briefly, multiplicity is the idea behind particles in the corner of a box having very few initial "moves" due to confinement, but more "moves" as they spread out. You likely won't ever again witness them all in the corner once released.) And the universe was "large" enough at the Big Bang for multiplicity to kick in. (There isn't any multiplicity or change in entropy in extremely small assemblages.) This is the second part: prior conditions.
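The multiplicity idea in the parenthetical above can be illustrated with a toy count: split a box into two halves and ask how many ways $N$ labeled particles can realize each occupancy of the left half. This is only a sketch with a deliberately tiny $N$:

```python
from math import comb

# Toy multiplicity count: N gas particles in a box split into two halves.
# The macrostate "k particles in the left half" has multiplicity C(N, k).
N = 20
all_in_corner = comb(N, N)        # every particle crowded into one half: 1 way
spread_out = comb(N, N // 2)      # evenly spread: the most ways
print(all_in_corner, spread_out)  # prints: 1 184756
```

Even at $N = 20$ the spread-out macrostate outnumbers the crowded one by nearly $2 \times 10^5$; at realistic particle counts the ratio is so lopsided that the "all in the corner" arrangement is effectively never revisited.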

So, even with perfectly symmetric equations of physics, with the right kind of initial conditions, you get increasing multiplicity from the get-go, with exponentially decreasing odds of reversing the arrow. This is what we observe today. The law is a statement about the current condition of our universe. It is perfectly valid in that regime. But yes, fundamentally that arrow could reverse for the entire universe; the odds are just so ridiculously low. The fluctuation theorem can give you those odds. The Poincaré recurrence theorem is not expected to apply to our universe, because we suspect we live in a one-shot universe that is unbounded and infinite, although the horizon of the observable universe does complicate things a bit.

J Kusin