
Entropy is famously defined as

$S = k_B \ln(\Omega(E))$,

where $\Omega(E)$ is the number of microstates of a system with energy $E$. If I consider a closed system with energy $E$, this immediately fixes the number of microstates accessible to the system that realize that energy. The number of microstates is therefore indifferent to the current "energy configuration" and is an invariant. How, then, can entropy increase as the system approaches its equilibrium state?
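To make the puzzle concrete, here is a minimal sketch (my own toy model, not part of the original question): for $N$ two-level spins, the energy fixes the number of excited spins, so $\Omega(E)$ is a binomial coefficient and $S$ is a single number once $E$ is given.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n_up):
    """Boltzmann entropy S = k_B * ln(Omega(E)) for N two-level spins.

    The energy E = n_up * epsilon fixes only the number of excited
    spins, so Omega(E) = C(N, n_up)."""
    omega = comb(N, n_up)
    return k_B * log(omega)

# For a closed system at fixed energy, Omega(E) -- and hence S -- is a
# single number, which is exactly the puzzle raised above.
print(entropy(100, 30))  # same value regardless of how the system evolved
```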

  • The total number of configurations $N$ is the sum of the configurations associated with all possible energies. $N$ doesn't depend on a specific energy but is a property of the system per se, and is not related to the entropy of the system's current state. Now, each energy $E$ has a certain number of configurations realizing it: this is $\Omega(E)$. The system evolves towards the state of energy $E$ that maximizes $\Omega(E)$; therefore the entropy $S[\Omega(E)]$ increases. – J.A Dec 05 '18 at 10:55
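One way to read the comment above is that $E$ there is the energy of a subsystem of a closed composite. A minimal sketch under that assumption (the two-block setup and the numbers are mine, purely for illustration): two weakly coupled blocks of spins share a fixed total energy, and the composite overwhelmingly favours the split that maximizes the product of the two microstate counts.

```python
from math import comb

# Two weakly coupled blocks of two-level spins with N1 and N2 spins,
# sharing a fixed total number of excitations n_tot (fixed total energy).
# For a given split, the composite microstate count is
#     Omega(n1) = C(N1, n1) * C(N2, n_tot - n1),
# and equilibrium sits at the split n1 that maximizes this product.
N1, N2, n_tot = 60, 40, 30

def omega(n1):
    return comb(N1, n1) * comb(N2, n_tot - n1)

best = max(range(n_tot + 1), key=omega)
print(best)  # 18: excitations shared in proportion to block size
```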

1 Answer


This is all about what we mean by a 'state' of a system, and over the years different experts have had different views on it. I expect there will be different opinions in the replies here. There is a good related question here: Does entropy depend on the observer? There is also a helpful paper by E. T. Jaynes: https://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf

One way to deal with your question is to say that the claim is not that $S$ has to increase, only that it can't decrease, and your observation is consistent with that. I feel that this reply would be inadequate, however. It seems to me that $S$ does increase as a system comes to thermal equilibrium. But there you have the hint of the answer to your question: we only want to claim that $S$ increases if the system was not already in thermal equilibrium to begin with. But how did we know that it was not in thermal equilibrium at the start? We can only know that if we know something in addition to the energy and other fixed external constraints. So we are claiming to know that, initially, the system state is somewhere within a bunch of microstates which is a subset of the states counted by $\Omega(E)$. Thus the initial entropy is $S_0 = k_B \ln ( \Omega_0(E) )$ where $\Omega_0(E) < \Omega(E)$.
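A standard illustration of $\Omega_0(E) < \Omega(E)$ (my example, not the answer's): $N$ gas particles initially known to occupy only the left half of a box. Lifting that constraint multiplies the count of accessible microstates by $2^N$, so the entropy rises by $N k_B \ln 2$ as the system equilibrates.

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# N particles confined to the left half of a box: each particle's
# accessible positions double when the partition is removed, so
#     Omega / Omega_0 = 2**N
# and the entropy increase on reaching equilibrium is
#     Delta S = k_B * ln(Omega / Omega_0) = N * k_B * ln(2).
N = 6.022e23  # one mole, chosen just for illustration
delta_S = N * k_B * log(2)
print(delta_S)  # ~5.76 J/K
```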

The above largely settles the matter. But I can't help feeling that a little more is needed. The difficulty is that we want to make physics as far as possible about things which can be agreed, and which are not dependent on subjective things such as the prior knowledge that a given observer may have. That means we want to define entropy as far as possible as a property of a system, not a statement about knowledge of an observer. There is disagreement about how far this can be achieved, and, for myself, I have not yet settled the matter completely in my own mind.

For anyone else who wishes to answer, I would like to encourage them to keep in mind phenomena such as spin echo, in which, owing to correlations in the initial state, the evolution of an isolated system can appear to decrease entropy (there are many examples). It's ultimately up to us how we choose to define technical terms in physics, but the consensus is that the 2nd law of thermodynamics is so useful that it would be a mistake to define entropy in such a way that spin echo counts as a refutation of the 2nd law. In the above-mentioned article Jaynes offers the following statement as a good way to state the 2nd law:

'The experimental entropy cannot decrease in a reproducible [adiathermal] process that starts from a state of complete thermal equilibrium.'

I put [adiathermal] in brackets because he used the word adiabatic, but I think he means simply 'no heat transfer', not 'constant entropy' (the word has had a changing meaning over the years).
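Returning to the spin-echo point above, here is a toy dephasing-and-rephasing sketch (my own illustration, not from the answer or from Jaynes): spins precessing at random frequencies lose their net transverse magnetization, which looks like relaxation to equilibrium, yet reversing the accumulated phases brings the order back, because the correlations were never destroyed.

```python
import numpy as np

rng = np.random.default_rng(0)
M, T = 10_000, 5.0
omega = rng.normal(0.0, 1.0, size=M)  # random precession frequencies

def magnetization(t, flipped=False):
    """Net transverse magnetization |<exp(i*phase)>| at time t."""
    phase = omega * t
    if flipped:  # phases conjugated at t = T (the echo pulse), then evolved on
        phase = -omega * T + omega * (t - T)
    return abs(np.exp(1j * phase).mean())

print(magnetization(0.0))          # 1.0: fully aligned
print(magnetization(T))            # ~0:  dephased, looks equilibrated
print(magnetization(2 * T, True))  # ~1.0: the echo -- the order returns
```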

Andrew Steane