
In simple words, what is the conceptual difference between the Gibbs and Boltzmann entropies?

Gibbs entropy: $S = -k_B \sum p_i \ln p_i$

Boltzmann entropy: $S = k_B \ln\Omega$

3 Answers


The Gibbs entropy is the generalization of the Boltzmann entropy and holds for all systems, while the Boltzmann entropy gives the entropy only when the system is in global thermodynamic equilibrium. Both measure the microstates available to a system, but the Gibbs entropy does not require the system to be in a single, well-defined macrostate.

This is not hard to see: for a system that occupies microstate $i$ with probability $p_i$, the Gibbs entropy is

$$ S_G = -k_B \sum_i p_i \ln(p_i)$$

and, in equilibrium, all microstates belonging to the equilibrium macrostate are equally likely, so, for $N$ states, we obtain with $p_i = \frac{1}{N}$

\begin{align} S_G &= -k_B \sum_i \frac{1}{N} \ln\left(\frac{1}{N}\right) \\&= -k_B N \frac{1}{N} \ln\left(\frac{1}{N}\right) \\ &= k_B \ln(N)\end{align}

by the properties of the logarithm, where the latter term is the Boltzmann entropy for a system with $N$ microstates.
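The reduction above can be spot-checked numerically. This is a minimal sketch in Python with $k_B = 1$; the function name `gibbs_entropy` is mine, not from the answer:

```python
import math

def gibbs_entropy(probs, k_B=1.0):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i (terms with p_i = 0 contribute 0)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

N = 1000
uniform = [1.0 / N] * N                   # equiprobable microstates, p_i = 1/N
S_gibbs = gibbs_entropy(uniform)          # Gibbs entropy of the uniform distribution
S_boltzmann = math.log(N)                 # Boltzmann entropy k_B ln N (k_B = 1)
print(abs(S_gibbs - S_boltzmann) < 1e-9)  # prints True
```

For the uniform distribution the two formulas agree to floating-point precision, exactly as the derivation predicts.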

ACuriousMind
  • I've been looking for a justification of the connection between these two concepts for a while, and your point about Boltzmann's formula necessitating thermal equilibrium to ensure the equiprobability of the microstates really illuminated this for me. – Matt Hanson May 09 '23 at 03:47

It is true that "In the special case all (micro)states are equally probable and $p = 1/N$, then $S=\ln N$." This, however, is not equivalent to the Boltzmann entropy, as is often written. The equivalence is mathematical but not physical: the Boltzmann entropy is for an isolated (microcanonical) system, whereas the Gibbs entropy is for a canonical system, which exchanges energy with its surroundings.

Mass

The expression $$ I=-\sum_i p_i\ln p_i $$ is a function of the probabilities $p_i$, also called the information entropy. It can be calculated for any values of $p_i$, even those that are not appropriate for an equilibrium state; in that case it has nothing to do with the equilibrium thermodynamic entropy $S$. When some equilibrium state $X$ is prescribed, however, the appropriate probabilities $p_i^*$ for this state are those that maximize $I$ (a choice that can be justified on probabilistic grounds). Only then does $k_B I[p^*]$ give a value that corresponds to the thermodynamic entropy $S(X)$ of the state $X$.
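With no constraint other than normalization, the maximizer of $I$ is the uniform distribution, which can be checked numerically. A minimal sketch with $k_B = 1$; the function and variable names are mine:

```python
import math
import random

def info_entropy(probs):
    """Information entropy I = -sum_i p_i ln p_i (terms with p_i = 0 contribute 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

N = 8
uniform = [1.0 / N] * N
I_max = info_entropy(uniform)  # equals ln N

# no random normalized distribution exceeds the uniform one's entropy
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(N)]
    total = sum(weights)
    p = [w / total for w in weights]  # a random normalized distribution
    assert info_entropy(p) <= I_max + 1e-12
```

Every trial distribution has entropy at most $\ln N$, consistent with the maximization principle described above.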

In the special case when all states are equally probable, $p_i=\frac{1}{N}$, where $N$ is the number of available states, and $$ I = \ln N. $$

So the last expression is a special case of the first one. Since the probability of a state of thermodynamic equilibrium is believed to depend only on that state's energy, this simplification is appropriate when the system has a fixed and known energy $E$: then all states with this energy are equally probable, and it suffices to count them as a function of $E$ and calculate the entropy $S$ as $k_B\ln N(E)$.
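Counting states at fixed energy can be illustrated with a toy model. This is a hedged sketch, not from the answer: $M$ two-level units, where each excited unit contributes one energy quantum, so a macrostate of energy $E$ (in quanta) has $\binom{M}{E}$ microstates ($k_B = 1$):

```python
import math

def microcanonical_entropy(M, E, k_B=1.0):
    """Toy model: M two-level units, E of them excited.
    Number of microstates at energy E is Omega(E) = C(M, E),
    so the Boltzmann entropy is S = k_B ln Omega(E)."""
    omega = math.comb(M, E)  # count of equally probable microstates
    return k_B * math.log(omega)

# entropy peaks when half the units are excited (most microstates)
print(microcanonical_entropy(100, 50) > microcanonical_entropy(100, 10))  # prints True
```

The entropy here is a function of $E$ alone, exactly the $S(E) = k_B \ln N(E)$ dependence described above.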