
A few dumb questions about entropy. (I apologize if the answers are already available.)

We usually associate entropy with the disorder/randomness of a body, but I don't understand what this "order" is that we talk about. What is this ordered state that we compare our system to and then decide that our system has more or less entropy than it?

Also, does it make sense to say things like "this system has this much entropy"? Should we not be talking only about the "change" in the entropy of a system?

Qmechanic
  • 201,751
ryan1
  • 189
  • Entropy is the average amount of information per datum. – shawn_halayka Oct 05 '21 at 01:18
  • Entropy is also a measure of the degrees of freedom in a physical system. Imagine a substance where the connectivity is only with nearest neighbours. Concrete is a good example. The number of degrees of freedom per particle is like n. Now imagine a substance that has been smashed to dust, exposing all of the innards. The connectivity is higher than n, more like n^2. – shawn_halayka Oct 05 '21 at 01:40

3 Answers


Probably the easiest way to understand entropy is to think of it not as a property of a system, but rather as a property of our knowledge of the state of the system, or, better, of the probability distribution of the system over its different internal states. If a system can be in states $1,2,...,n$ with probabilities $p_1,p_2,...,p_n$, then entropy is usually defined (up to a constant multiplier) as $$ S=-\sum_i p_i \ln p_i.\tag{1} $$ Notice that if the system is in a definite state (i.e. all $p_i$ except one are zero), the entropy is zero. Also, the entropy $S$ is largest when all the $p_i$ are equal; then $S=\ln(n)$. The definition gets a bit more complicated for continuous distributions, where the sum becomes an integral over the degrees of freedom, such as all the positions and momenta of the particles.
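Here is a minimal numerical sketch of definition $(1)$ (my own illustration, not part of the answer above; the `entropy` helper is just a made-up name), using natural logarithms and dropping zero-probability terms, since $p\ln p \to 0$:

```python
import math

def entropy(p):
    """S = -sum_i p_i * ln(p_i); terms with p_i = 0 contribute nothing."""
    return sum(-pi * math.log(pi) for pi in p if pi > 0)

# A definite state (all probability on one outcome) has zero entropy
print(entropy([1.0, 0.0, 0.0, 0.0]))   # 0.0

# A uniform distribution over n = 4 states maximizes S at ln(4)
print(entropy([0.25] * 4))             # 1.3862943611198906
```

Any distribution in between these two extremes gives a value between $0$ and $\ln(n)$.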

Over time, a system, due to external noise and internal dynamics, spreads over all accessible states within the imposed constraints (e.g. the gas must remain in the box, the average energy must remain constant, etc.). The more states it spreads over, the larger the entropy. When a material is in the gas phase, each molecule can travel anywhere and is barely constrained by the others unless it comes very close to them, so the entropy is high. In contrast, when a material is in the solid phase, each molecule is strongly constrained by the others and has only a little space available around its site in the lattice, yielding a low entropy.

Pavlo. B.
  • 2,605

"Order" is kind of an imprecise term. What we mean by entropy is "how many microstates could contribute to this macrostate."

The way this is expressed mathematically is:

$S = k \ln(W)$

$S$ is the entropy of the macrostate of the system, $W$ is the number of microstates that could produce that macrostate, and $k$ is Boltzmann's constant.

Boltzmann actually has this equation inscribed on his grave, believe it or not.

A classic example is flipping 100 coins. A possible macrostate (macroscopic observable configuration of the system) is 100 heads and no tails.

A possible corresponding microstate (an exhaustive delineation of the Heads or Tails state of every coin) is coins 1 through 100 all landing on heads. This macrostate has only one microstate, so $W=1$ and $S=0$. It is a low-entropy state of the system.

On the other hand, there are many, many microstates that would achieve the macrostate of 50 heads and 50 tails. You could have the even-numbered coins heads and the others tails; you could have the first 50 heads and the next 50 tails; you could have the first two heads, then 50 tails, then 48 heads... there are about $10^{29}$ total possibilities, believe it or not. So that macrostate, because it has so many options and because each microstate is equally likely, has higher entropy per the formula.
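To make the counting concrete, here is a short sketch (my own addition, not part of the answer) that counts the microstates of both macrostates with Python's `math.comb` and compares the entropies in units of Boltzmann's constant, $S/k = \ln W$:

```python
from math import comb, log

N = 100

W_all_heads = comb(N, N)     # exactly one way to get 100 heads
W_half = comb(N, 50)         # ways to choose which 50 coins land heads

print(W_half)                # ~1.0089e29 arrangements
print(log(W_all_heads))      # S/k = ln(1) = 0.0
print(log(W_half))           # S/k = ln(W) ~ 66.8
```

The ratio of microstate counts, roughly $10^{29}$ to $1$, is why the 50-50 macrostate is overwhelmingly more likely.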

So that is entropy. The concept of "order" comes in when, for example, you start with a box of coins showing 100 heads and then shake the box or perturb it in some way: you will end up with a state much closer to 50 heads. The system is said to have evolved from a low-entropy state to a high-entropy state, and it will essentially never spontaneously evolve back to lower entropy. The 50-50 mix of heads and tails roughly corresponds to more "disorder" in this sense.

RC_23
  • 9,096

Edit: This answer is basically a mathematical supplement to the other two answers provided, which both make sense to me.

How do we define entropy?

  1. Thermodynamically: studying an ideal gas in a Carnot cycle led to the realization that the heat absorbed isothermally at some temperature $T_1$, divided by $T_1$, equals the heat surrendered isothermally at some temperature $T_2$, divided by $T_2$ $(1)$. This eventually led to the Clausius inequality $(2)$ and ultimately to a definition of entropy $(3)$; a quick worked example follows the equations below.

$$\frac{\Delta Q_1}{T_1}=\frac{\Delta Q_2}{T_2} \tag{1}$$ $$\oint\frac{dQ}{T} \leq 0 \tag{2}$$ $$dS \geq \frac{dQ}{T} \tag{3}$$
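As a worked example of definition $(3)$ (my own illustration, not part of the original answer): for a reversible process the inequality in $(3)$ becomes an equality, so for an isothermal expansion of $n$ moles of an ideal gas from $V_1$ to $V_2$ at temperature $T$, where the internal energy is unchanged and the heat absorbed equals the work done,

$$\Delta S = \int \frac{dQ_{\mathrm{rev}}}{T} = \frac{1}{T}\int_{V_1}^{V_2} P\,dV = \frac{1}{T}\int_{V_1}^{V_2} \frac{nRT}{V}\,dV = nR\ln\frac{V_2}{V_1}$$

Expanding the gas ($V_2 > V_1$) increases the entropy, consistent with the statistical picture below.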

  2. Through statistical mechanics: imagine two separate boxes of gas that can only conduct heat between them, not exchange volume or particles. We discover that the thermodynamic beta, $\beta$, is the same for the two systems once they are allowed to reach equilibrium $(4)$. In addition, the conservation of energy $(5)$ may be restated in terms of the entropy given in $(3)$, assuming a reversible process. Combining these two facts and invoking Boltzmann's postulate that the number of microstates, $\Omega$, is maximized at equilibrium, we arrive at the Boltzmann entropy $(6)$.

$$\frac{\partial \ln \Omega}{\partial U} = \beta \tag{4}$$ $$dU = -PdV +dQ=-PdV+TdS \tag{5}$$

$$\frac{\partial S /\partial U}{\partial \ln \Omega / \partial U}= \frac{1}{\beta T} = k_B$$

Upon integrating this equation with $dV=0$ and ignoring the additive constant,

$$S=k_B \ln \Omega \tag{6}$$

This answers one of your questions:

Should we not be talking only about the "change" in the entropy of a system?

If we use the Boltzmann entropy equation, then its absolute value does have meaning, because we have set the additive constant to zero. As pointed out in other answers, when there is only one microstate the Boltzmann entropy is zero, which clearly defines the "floor" of the entropy; the most ordered system possible corresponds to the lowest entropy.

but I don't understand what this "order" is that we talk about.

In terms of the Boltzmann entropy, this "order" is defined by counting microstates: the fewer microstates compatible with a macrostate, the more ordered it is, and the entropy measures that disorder through the microstate count.

Consider a different scenario, where a container of gas is in contact with a heat bath. The following probability distribution over energy levels can be derived in a variety of ways (the canonical ensemble):

$$\rho_i = \exp(\beta(F-E_i)) \tag{7}$$

We discover a new entropy equation by performing the following operation (noting that $U=\langle E_i \rangle$):

$$-k_B \langle \ln \rho_i \rangle = -k_B \langle \beta (F-E_i) \rangle = -k_B \beta F + k_B \beta U = (U - F)/T \tag{8}$$

By recognizing $(8)$ as a Legendre transform of the internal energy $U$, we can find a new form for the entropy. It is apparent that $U \equiv U(S,V)$ from $(5)$, so, holding $V$ constant, we need only imagine a function $F \equiv F(T,V)$ which transforms the energy function from $S \rightarrow T$,

$$U(S,V) + (-F(V,T)) = ST \tag{9}$$

Therefore, from $(8)$ and $(9)$, and identifying $F$ as the Helmholtz free energy,

$$S = -k_B \langle \ln \rho_i \rangle = -k_B \sum_{i} \rho_i \ln \rho_i \tag{10}$$

This is the Gibbs entropy. It can also easily be shown that this more general entropy equation reduces to $(6)$ when the probability distribution is simply $\rho_i = 1/\Omega$. There are numerous ways of validating that the Gibbs entropy equation is self-consistent.
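As a numerical sanity check of $(10)$ (my own sketch, not part of the answer; the energy levels and $\beta$ are made up), one can build a canonical distribution $(7)$, compute its Gibbs entropy, and confirm that the uniform case $\rho_i = 1/\Omega$ reproduces $\ln\Omega$:

```python
import math

def gibbs_entropy(rho):
    """S / k_B = -sum_i rho_i * ln(rho_i), equation (10) in units of k_B."""
    return sum(-p * math.log(p) for p in rho if p > 0)

beta = 1.0                                     # 1/(k_B T), arbitrary units
E = [0.0, 1.0, 2.0, 3.0]                       # made-up energy levels

Z = sum(math.exp(-beta * e) for e in E)        # partition function
rho = [math.exp(-beta * e) / Z for e in E]     # canonical distribution (7)

print(gibbs_entropy(rho))          # < ln(4), since the distribution is not uniform

# Uniform case rho_i = 1/Omega recovers the Boltzmann entropy ln(Omega)
Omega = 4
print(gibbs_entropy([1 / Omega] * Omega), math.log(Omega))   # both ~1.386
```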

but I don't understand what this "order" is that we talk about.

The problem has now been reframed in terms of the probability distribution over the underlying energy levels. In principle these can be the underlying "anything" whose disorder we are trying to determine. The entropy is then (minus) the expectation of the logarithm of the probability; recalling that probabilities always fall in $[0,1]$, the logarithm of a probability is negative and becomes more negative as the probability shrinks toward zero, so $-\langle \ln \rho_i \rangle$ grows when the typical probabilities are small. Therefore:

  • Lower average probability of a state occurring = high entropy
  • Higher average probability of a state occurring = low entropy

What is this ordered state that we compare our system to and then decide that our system has more or less entropy than it?

This is the famous "arrow of time"; you can find a recent Physics SE post with many answers discussing it and the apparent irreversibility of systems as $N \rightarrow \infty$.

Finally, $(10)$ is related to the Shannon entropy by the factor of $k_B$, which simply fixes the units required by the thermodynamic definition of entropy. For non-physical systems there is nothing fundamental about this factor, and the entropy might as well be

$$\mathcal{S} = - \sum_{i} \rho_i \ln \rho_i$$

This helps cement the idea of entropy as it relates to probability, as described in the bullet points above.

michael b
  • 782