
Conceptually, I've always understood entropy to be a statistical idea. For example, if you have an evacuated box and place a handful of gas molecules on one side, it is statistically far more likely for them to spread out than to remain in one concentrated spot. Therefore, on average, they will spread out and entropy will increase.

Of course there's a more elaborate definition involving macrostates and microstates, where entropy is greater when a macrostate has more microstates. However, in my understanding, the idea is still the same: increasing entropy is really just a statistical likelihood that holds true on average.
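To make this concrete, here is a toy count of my own (not from the course, and the numbers are arbitrary): each of $N$ molecules is independently in the left or right half of a box, so the macrostate "$n$ molecules on the left" has $\binom{N}{n}$ microstates, and Boltzmann's $S = k_B \ln \Omega$ turns that count into an entropy.

```python
from math import comb, log

N = 100  # toy number of gas molecules

# Each molecule is independently in the left or right half, so the
# macrostate "n molecules on the left" has Omega(n) = C(N, n) microstates,
# out of 2**N equally likely microstates in total.
for n in [0, 25, 50]:
    omega = comb(N, n)
    print(f"n_left={n:3d}  Omega={omega:10.3e}  P={omega / 2**N:.3e}  S/k_B={log(omega):6.2f}")
```

The even split has about $10^{29}$ times as many microstates as the all-on-one-side macrostate; "entropy increases" is just the statement that the system is overwhelmingly likely to be found in the macrostates with the most microstates.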

There are many formulas that relate entropy to energy. For example, Gibbs Free Energy: $$ΔG = ΔH - TΔS$$

Now, before I explain my confusion, I'd like to share an example from my course: hydrogen bonding (a non-covalent, largely electrostatic interaction).

We describe the formation of hydrogen bonds as "stabilizing" - that is, they lower the overall free energy $G$. However, when hydrogen bonds form, the entropy actually decreases ($ΔS < 0$); there is more order when molecules align to form H-bonds. But this entropy cost is offset by a greater decrease in enthalpy, bringing the overall $ΔG$ down (so it's negative), and therefore making the bond "stabilizing".
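To see how the offset works numerically, here is a back-of-the-envelope check with illustrative magnitudes I made up (typical orders of magnitude, not data for any specific hydrogen bond):

```python
T = 298.0       # temperature, K
dH = -20.0e3    # enthalpy change, J/mol (negative: bond formation releases energy)
dS = -40.0      # entropy change, J/(mol*K) (negative: the molecules become more ordered)

dG = dH - T * dS
print(f"dG = {dG / 1000:+.1f} kJ/mol")  # -8.1 kJ/mol: negative, hence "stabilizing"
```

The $-TΔS$ term contributes $+11.9$ kJ/mol (the entropy penalty), but the enthalpy term wins.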

That's where my confusion begins. I'm having a difficult time understanding how we can relate the idea of entropy to energy. A paper that my professor once showed me makes it clear that entropy isn't a measure of energy density/distribution... rather, it's more of a statistical observation.

So, assuming that entropy is a statistical idea (which is where I think my gap in understanding may be), how does forming a hydrogen bond, or being ordered in general, "destabilize"/increase the energy? Why does being ordered - which is simply a low-probability event actually occurring - impact the energy of a system? It's not like some outside force is putting energy into the system to align and order the molecules... so how does entropy, a statistical idea, play a role in the energy?

F16Falcon
  • Equilibrium is attained when each part of your system is identical (in the thermodynamic sense) to every other part. And this happens by spreading any excess energy a subsystem has to the rest of the system. This, when looked at in terms of the possible distributions that your subsystem can have, will increase with an increase in energy. Thus the spreading of energy causes an increase in entropy. This, thermodynamically, is linked with heat. Is this in the direction of what you're looking for? – Superfast Jellyfish Feb 07 '20 at 04:47
  • @FellowTraveller Hi, thanks for your reply. It is kind of what I'm looking for; however, could you further explain what you mean by "this... will increase with increase in energy"? I'm not sure I understand what you are referring to when you say "this". Maybe you can write it as an answer to allow voting and marking as best! :) – F16Falcon Feb 07 '20 at 13:11
  • There are two definitions of entropy, which are actually the same; see my answer here: https://physics.stackexchange.com/questions/519293/does-the-fact-that-there-are-two-different-mathematical-definitions-of-entropy-i/519691#519691 – Mr Anderson Feb 08 '20 at 03:14

2 Answers

2

Here is a mathematically precise real-world example: Consider a monoatomic ideal gas of $N$ atoms in a box of constant volume $V$, in contact with an environment of temperature $T$.

Now imagine increasing the temperature by a small amount $dT$, which will increase the average total energy $U$ of the gas by an amount $dU = \frac{3}{2} N k_B dT$. The first law $dU = T\,dS - p\,dV = T\,dS$ says that the entropy will increase by $dS = \frac{dU}{T} = \frac{3}{2} N k_B \frac{dT}{T}$.
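For a finite temperature change this integrates to $ΔS = \frac{3}{2} N k_B \ln(T_2/T_1)$. Here is a quick numerical sketch (particle number chosen arbitrarily as one mole):

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23         # one mole of atoms (arbitrary choice)
T1, T2 = 300.0, 301.0

# Integrated form of dS = (3/2) N k_B dT / T for the monoatomic ideal gas
dS = 1.5 * N * k_B * log(T2 / T1)
print(f"Entropy increase: {dS:.4f} J/K")  # ~0.0415 J/K for this 1 K step
```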

Now, as you know, entropy is $S = -k_B \sum_i p_i \ln p_i$ where the sum goes over all possible microstates $i$ of the gas and $p_i$ is the probability of microstate $i$. It is a measure of the unpredictability of the current microstate: How difficult is it to guess the microstate if you only know the macro variables?
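As a tiny illustration of that unpredictability reading (my own made-up probability distributions, nothing specific to the gas): the more evenly the probability is spread over microstates, the larger $-\sum_i p_i \ln p_i$ becomes.

```python
from math import log

def entropy(ps):
    """Gibbs entropy in units of k_B: -sum_i p_i ln p_i."""
    return -sum(p * log(p) for p in ps if p > 0)

print(entropy([1.0]))                  # microstate known with certainty: 0.00
print(entropy([0.7, 0.1, 0.1, 0.1]))  # peaked distribution: ~0.94
print(entropy([0.25] * 4))            # uniform over 4 microstates: ln 4 ~ 1.39
```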

How should we interpret the entropy increase? The first thing to notice is that there are more microstates with high energy than with low energy. For example, there are fewest microstates with energy $0$: all particles standing still with $0$ energy. If you increase the energy, there are more ways to spread the energy across the $N$ particles, so it is more difficult to guess the actual microstate. If the average energy $U = \sum_i p_i E_i$ increases, the microstates with higher energy become more likely, and since there are more of them, the entropy should increase.
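You can make the counting explicit in a quantized toy model (an Einstein-solid-style count, not the ideal gas itself, and the sizes are arbitrary): there are $\binom{q + N - 1}{q}$ ways to distribute $q$ indistinguishable energy quanta among $N$ particles, and the count grows steeply with the energy.

```python
from math import comb

N = 10  # toy number of particles

# "Stars and bars": ways to distribute q indistinguishable quanta among N particles
for q in [0, 1, 5, 20]:
    print(f"q={q:2d} quanta -> {comb(q + N - 1, q):>8d} microstates")
```

With zero energy there is exactly one microstate (everything at rest); as the energy grows, guessing the actual microstate becomes hopeless.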

Apologies if that was a bit rambling...


Edit: Regarding your example of hydrogen bonding and Gibbs free energy. The big picture is that you have to consider both the system (the molecules forming the hydrogen bond) and the environment (the rest of the universe, including all the other molecules of the gas).

The basic problem of thermal equilibrium is like this: We know that the total energy in the universe is constant: $E_\mathrm{sys} + E_\mathrm{env} = E_\mathrm{tot}$. Furthermore, both the system and the environment have greater entropy the more energy they have. Now, the Second Law says that all processes will tend to maximise the total entropy of the universe: $$ \text{In equilibrium, } S_\mathrm{tot} = S_\mathrm{sys}(E_\mathrm{sys}) + S_\mathrm{env}(E_\mathrm{tot} - E_\mathrm{sys}) \text{ is maximised}. $$

What will $E_\mathrm{sys}$ be in equilibrium? If the system gets less energy, the entropy of the system decreases. But the environment gets more energy and can therefore have more entropy. That is why the system tends to minimise energy: because it increases the overall entropy!

But there is of course a tradeoff: If the system gives away energy, it will also lose some entropy. *Equilibrium occurs when the entropy the system would lose by giving away a little more energy exactly equals the entropy the environment would gain by receiving it.* So from the system's point of view, there is a tension between minimising energy and maximising entropy. Minimising the Helmholtz or Gibbs free energy is a clever way to find the solution to this problem.

As an aside, we can also express the sentence in italics as $$ \frac{dS_\mathrm{sys}}{dE_\mathrm{sys}}(E_\mathrm{sys}) = \frac{dS_\mathrm{env}}{dE_\mathrm{env}}(E_\mathrm{tot} - E_\mathrm{sys}). $$ In equilibrium, the system and the environment should have the same value of $\frac{dS}{dE}$. Actually this is a definition of temperature: $\frac{dS}{dE} = \frac{1}{T}$. So it just says that at equilibrium, $T_\mathrm{sys} = T_\mathrm{env}$.
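Here is a sketch of that maximisation with the same toy quantized model as above (sizes and energies made up): split a conserved number of quanta between a small system and a larger environment and see where $S_\mathrm{tot}$ peaks.

```python
from math import comb, log

def ln_omega(q, n):
    """ln of the number of ways to put q quanta into n particles."""
    return log(comb(q + n - 1, q))

N_sys, N_env = 10, 100   # toy sizes: small system, larger environment
q_tot = 200              # total quanta, conserved

# Total entropy (in units of k_B) for each possible split of the energy
S_tot = [ln_omega(q, N_sys) + ln_omega(q_tot - q, N_env) for q in range(q_tot + 1)]
q_best = max(range(q_tot + 1), key=lambda q: S_tot[q])
print(f"S_tot is maximised when the system holds {q_best} of {q_tot} quanta")
```

The maximising split is the one where (up to discreteness) both sides have the same $\frac{dS}{dE}$, i.e. the same temperature; giving the system even less energy than that would cost the system more entropy than the environment gains.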

  • Thanks for the reply! The idea that more microstates are possible when energy is higher makes a lot of sense to me now - upvoted! So, when molecules interact (i.e., get closer) due to electrostatic attractions, energy is lowered, and thus entropy is always lowered? Is that assumption correct? If so, is it just that oftentimes (as in the case with intermolecular forces like hydrogen bonds), a MUCH greater decrease in enthalpy (due to potential energy?) offsets the decrease in entropy, thus making the Gibbs free energy negative and the attraction spontaneous? – F16Falcon Feb 08 '20 at 00:01
  • Without having done the actual analysis, I think that sounds more or less right! You get a kind of tension between minimising energy and maximising entropy, which I believe is what minimising the Gibbs free energy (at constant $p$, or the Helmholtz free energy at constant $V$) encodes. But I am skeptical when you say "when energy is lowered, entropy is always lowered". It is usually the case, but in principle, I think it depends on the system. In your case, it is because of geometry: there are more ways for molecules to be far apart and moving quickly than close together and moving slowly. – Elias Riedel Gårding Feb 08 '20 at 00:21
  • I think I know what I should say now, I will make an edit to my question! – Elias Riedel Gårding Feb 08 '20 at 00:25
  • See the edit, maybe it will make it clearer. But it also came out more rambling than I had hoped! – Elias Riedel Gårding Feb 08 '20 at 01:03
  • Great answer! Makes a lot more sense now. Just one minor question: when a hydrogen bond is formed and the system's energy decreases, how does the surrounding's energy increase? Where is the energy for the surrounding coming from? – F16Falcon Feb 08 '20 at 16:07
1

Say you have a box, and you have $n$ balls numbered from $1$ to $n$. Your goal is to put the balls inside the box. However, the more balls there are inside the box, the harder it is to add another. That's just how the box is. With one box there is not much choice: we have to brave through and put all $n$ balls inside.

However, if there are now two such boxes, it is easier to put $n/2$ in each box than to put all $n$ in one (or any other configuration, for that matter). And similarly if you have more boxes. The least energy is expended when we share the balls equally among all the boxes.
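A hedged sketch of this analogy (the cost rule is invented purely for illustration): suppose putting a ball into a box that already holds $k$ balls takes $k + 1$ units of effort, so filling a box to $k$ balls costs $1 + 2 + \dots + k$. Equal sharing then minimises the total effort.

```python
def cost(counts):
    """Total effort if the next ball into a box holding k balls costs k + 1."""
    return sum(k * (k + 1) // 2 for k in counts)  # 1 + 2 + ... + k per box

n = 12  # toy number of balls
print(cost([n]))            # all in one box:   78
print(cost([n // 2] * 2))   # split over two:   42
print(cost([n // 4] * 4))   # split over four:  24
```

Spreading the balls out minimises the total effort, which plays the role of energy in the analogy.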

Notice, however, that if someone now closes all the boxes and asks you where ball number $k$ is, you can't say where it is. But if you had put all of them in one box, then you could tell exactly where it is (by lifting the box, say). In this sense, you have reduced the energy required at the expense of losing information about where each ball is. In order to minimise energy, you are spreading your system out.

This is a crude analogy for how energy and entropy are linked. The system tends towards the lowest-energy configuration, and this happens to be the configuration where the system is spread out.

Hope this helps.