
In the context of Thermodynamics and Statistical Mechanics we encounter, basically, three different definitions of entropy:
First definition:
Consider an isolated macroscopic system: it has a macrostate and a microstate; the macrostate is the collection of its macroscopic properties, while the microstate is the precise condition that the constituent parts of the system are in. The entropy $S$ is then defined as the following quantity: $$S= k_B\ln{\Gamma} \tag{1}$$ where $k_B$ is the Boltzmann constant and $\Gamma$ is the number of possible microstates compatible with the macrostate the system is in.
Second definition:
Same setup as before, but now the entropy is defined as: $$S=-k_B\sum _i p_i \ln{p_i} \tag{2}$$ where $p_i$ is the probability of the system being in the $i$-th microstate (again, taken from the pool of compatible microstates).
Third definition:
We completely change the setup: consider a reversible cyclic thermodynamic process; then we simply define entropy as a function of the state variables (i.e. a state function) with the following property: $$dS=\frac{dQ}{T} \tag{3}$$ where $T$ is the temperature of the system in which the process occurs and $dQ$ is the infinitesimal amount of heat reversibly transferred to the system.
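For concreteness, the standard textbook example of how $(3)$ is used: for the reversible isothermal expansion of $n$ moles of an ideal gas from $V_1$ to $V_2$, the internal energy stays constant, so $dQ = p\,dV$ and $$\Delta S=\int\frac{dQ}{T}=\int_{V_1}^{V_2}\frac{p\,dV}{T}=nR\int_{V_1}^{V_2}\frac{dV}{V}=nR\ln\frac{V_2}{V_1}.$$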

So you can see for yourself that, at least at first glance, the definition of entropy is quite fragmented and confusing.
Fortunately we can easily unify the first and second definitions by invoking the second fundamental postulate of Statistical Mechanics, the Principle of Indifference, which tells us that: $$p_i=\frac{1}{\Gamma}$$ and then, with a little bit of work, we can show that the two definitions are equivalent.
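Explicitly, substituting $p_i = 1/\Gamma$ into $(2)$ gives $$S=-k_B\sum_{i=1}^{\Gamma}\frac{1}{\Gamma}\ln\frac{1}{\Gamma}=-k_B\,\Gamma\cdot\frac{1}{\Gamma}\left(-\ln\Gamma\right)=k_B\ln\Gamma,$$ which is exactly $(1)$.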

But, even with this improvement, the picture remains fragmented, due mainly to two distinct problems:

  1. Fragmentation between Classical and Quantum Mechanics: How do we count the number of possible microstates $\Gamma$ from a non-quantum, classical perspective? What I mean is: in the first two definitions we have sort of taken for granted that the number of compatible microstates is finite, but of course this can only be true in a quantum scenario, where the phase space of position and momentum is quantised. What if this is not the case? What if we want to define entropy statistically in a classical, non-quantum scenario? How do we define $\Gamma$ in the classical phase space of position and momentum? Can statistical entropy only be defined quantum mechanically?

  2. Fragmentation between Classical Thermodynamics and Statistical Mechanics: How do we show that the first two definitions are compatible with the third? This seems challenging, especially because the third definition appears to be really vague. This is sort of a bonus question, because I found this previous question tackling the same problem; but I think the answer provided there is rather poor and excessively concise, and it would also be nice to complete this picture all in one question. Feel free to ignore this part if you think it would be a useless repetition of the answer I linked.

Noumeno

2 Answers


Not sure if this is what you are looking for, but I'll give it a go.

In a classical system you are working in phase space, and you have to subdivide this space into volume elements in order to do any kind of counting. Assume the energy of the system under consideration depends on $f$ generalized coordinates and $f$ conjugate momenta: $$E(q_1,\dots,q_f,\,p_1,\dots,p_f)$$ Phase space is then divided into volume elements of size $h_0^f$, where this 'size' $h_0$ remains undefined.

The partition function is now calculated as: $$Z=\int\!\cdots\!\int e^{-\beta E(q_1,\dots,p_f)}\,\frac{dq_1\cdots dp_f}{h_0^f}$$ You can now use this to calculate many thermodynamic properties, including the entropy $S=k(\ln Z+\beta \bar E)$.
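A quick way to see where that entropy expression comes from, assuming the standard canonical-ensemble relations $F=-kT\ln Z$, $S=-\partial F/\partial T$ and $\bar E=-\partial\ln Z/\partial\beta$: $$S=-\frac{\partial F}{\partial T}=k\ln Z+kT\frac{\partial\ln Z}{\partial T}=k\ln Z+kT\left(-\bar E\right)\left(-\frac{1}{kT^{2}}\right)=k\left(\ln Z+\beta\bar E\right).$$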

If you perform this calculation for an ideal gas (with proper counting to avoid the Gibbs paradox), you get: $$S=kN\left(\ln\frac{V}{N}+\frac{3}{2}\ln T + \sigma_0\right)$$ where $$\sigma _0=\frac{3}{2}\ln\frac{2\pi mk}{h_0^2}+\frac{5}{2}$$ Note that our undefined phase-space volume sits in $\sigma_0$. If you do the exact same calculation quantum mechanically, you end up with the exact same formula, except that now $h_0$ is Planck's constant $h$, as it should be.
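As a rough numerical sketch (the parameter choices below are illustrative assumptions, not part of the original answer), one can evaluate this formula with $h_0$ set to Planck's constant for one mole of helium at 300 K and 1 atm:

```python
import math

# Physical constants (SI units)
k  = 1.380649e-23      # Boltzmann constant, J/K
h  = 6.62607015e-34    # Planck constant, J*s (used here as the cell size h_0)
NA = 6.02214076e23     # Avogadro's number

# Illustrative choice: one mole of helium gas at 300 K and 1 atm
m = 4.0026e-3 / NA     # mass of one helium atom, kg
T = 300.0              # temperature, K
P = 101325.0           # pressure, Pa
N = NA                 # number of atoms
V = N * k * T / P      # ideal-gas volume, m^3

# sigma_0 = (3/2) ln(2*pi*m*k / h0^2) + 5/2, with h0 -> h
sigma0 = 1.5 * math.log(2 * math.pi * m * k / h**2) + 2.5

# S = k N ( ln(V/N) + (3/2) ln T + sigma_0 )
S = k * N * (math.log(V / N) + 1.5 * math.log(T) + sigma0)

print(f"Entropy of 1 mol He at 300 K, 1 atm: {S:.1f} J/K")
# Prints roughly 126 J/K, close to the tabulated molar entropy of helium.
```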

So to summarize: statistics requires counting. When working classically, you have to subdivide phase space into volume elements of an undefined size that gets carried through your calculations. This size becomes properly defined as Planck's constant when the same calculation is done quantum mechanically.

With regard to the second question, I'll only say this. In the statistical definitions the equation defines the entropy itself, while in the thermodynamic one it defines an infinitesimal change in entropy. One derives the first by bringing two systems of roughly the same size into contact and determining the conditions for equilibrium (the temperatures must be equal). One derives the second by having a smaller system exchange heat with a much larger heat reservoir and then calculating the resulting change in the entropy of the reservoir, assuming its temperature remains constant.
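A minimal sketch of that second step, in the notation of the question: if the reservoir has $\Gamma(E)$ accessible microstates and absorbs a small amount of heat $Q$, then, using the statistical definition of temperature $\partial\ln\Gamma/\partial E = 1/k_BT$, $$\Delta S_\text{res}=k_B\ln\Gamma(E+Q)-k_B\ln\Gamma(E)\simeq k_B\,\frac{\partial\ln\Gamma}{\partial E}\,Q=\frac{Q}{T},$$ which is exactly the content of definition $(3)$ for infinitesimal, reversible heat exchange.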

As a final note, I'll just point out, in case you don't know (apologies if you do), that your first entropy above is Boltzmann's definition, while your second is Gibbs's definition. There is a difference, which is discussed here.

CGS

In thermodynamics, entropy is a state quantity, so it is used to define a specific state of the system in configuration space. The first definition you gave is the Boltzmann entropy, which is only applicable to isolated systems, i.e. the microcanonical ensemble. In the microcanonical ensemble, state variables such as the energy, the number of particles and the volume are fixed, so all microstates have the same energy and are equally probable. Thus, for every microstate $j$, $P_j = \frac{1}{\Omega}$, where $\Omega$ is the number of all possible microstates.

This notion of entropy can be extended to other ensembles by the associated Legendre transformations. From that, you arrive at the second definition you gave, the Gibbs entropy. That is a generalized version of entropy which is applicable to all ensembles/systems. This definition of entropy is equivalent to the third definition you gave (https://doi.org/10.1119/1.1971557).

For a semi-qualitative understanding, consider the following situation. You have $N$ boxes, $N_1$ blue balls and $N_2$ red balls, and balls of the same colour are indistinguishable. Also, $N_1$ and $N_2$ can fluctuate as long as $N_1 + N_2 = N$ holds. Say you're looking for the equilibrium state of this system. Using the Boltzmann entropy you can prove that the equilibrium state is $N_1 = N_2 = \frac{N}{2}$, because this macrostate is the most probable one and corresponds to maximum entropy. Another way to understand this, without doing any calculation, is to look for the macrostate that has the most microstates, i.e. the highest degeneracy: if $N_1 = N_2$, you can form the maximum number of combinations when distributing the balls among the $N$ boxes. So, if this system is ergodic, it will spend most of its time visiting microstates of that macrostate.

In addition, this macrostate requires the maximum amount of information to describe: the information you need to describe this state is greater than the information you need to describe any other state. Hence, Gibbs entropy is the physical analogue of Shannon entropy. However, the fact that you need more information doesn't imply anything about disorder, since you would need a definition of order for that. On the contrary, one could find the state with equal numbers of balls quite plausible and ordered.
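A small sketch of that counting argument (the value of $N$ below is an arbitrary illustrative choice): the multiplicity of the macrostate with $N_1$ blue balls is $\binom{N}{N_1}$, which peaks at $N_1 = N/2$.

```python
import math
from math import comb

N = 20  # total number of boxes (and balls); arbitrary illustrative choice

# Multiplicity of the macrostate with N1 blue balls (and N - N1 red):
# the number of ways to choose which of the N boxes hold the blue balls.
multiplicities = {N1: comb(N, N1) for N1 in range(N + 1)}

most_probable = max(multiplicities, key=multiplicities.get)
print(f"Most probable macrostate: N1 = {most_probable} out of N = {N}")  # N1 = 10
print(f"Its multiplicity: {multiplicities[most_probable]}")              # C(20, 10) = 184756

# Boltzmann entropy (in units of k_B) of a few macrostates: S/k_B = ln(multiplicity)
for N1 in (0, 5, 10, 15, 20):
    print(f"N1 = {N1:2d}  ->  S/k_B = {math.log(multiplicities[N1]):.2f}")
```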

Long story short, entropy is mostly (if not entirely) about redistributing the available energy until equilibrium is reached for all describable systems; only then can the overall equilibrium be reached.

See https://en.wikipedia.org/wiki/Heat_death_of_the_universe.

Ef00