
This question suggests that for the microcanonical ensemble, in addition to the "usual" definition of entropy \begin{align} \omega(E)&=\operatorname{Tr}\,\delta(E-H) \\ S_B&=\ln \omega(E) \end{align} (called the Boltzmann entropy), there is also a so-called "Gibbs entropy": $$ \Omega(E)=\int_0^E\omega(e)\,\mathrm{d}e \\ S_G=\ln \Omega(E) $$
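To fix ideas, if I compute both for the standard textbook case of a classical ideal gas of $N$ particles of mass $m$ in a volume $V$, I get $$ \Omega(E)=\frac{V^N}{N!\,h^{3N}}\,\frac{(2\pi m E)^{3N/2}}{\Gamma\!\left(\tfrac{3N}{2}+1\right)}, \qquad \omega(E)=\frac{\partial\Omega}{\partial E}=\frac{3N}{2E}\,\Omega(E), $$ so that $S_G-S_B=\ln\!\left(\frac{2E}{3N}\right)$ is of order one (at fixed energy per particle), while both entropies are of order $N$. The two definitions therefore seem to differ appreciably only for small systems.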

I'm confused about these two definitions. First and foremost, why is the second definition also called the "Gibbs entropy", when the usual entropy formula given by Gibbs, $-\sum_i p_i \ln p_i$, reduces to $S_B$ for an equilibrium state in the microcanonical ensemble?

Which of the two is the more natural one to use? One can find equilibrium states by maximizing $-\sum_i p_i \ln p_i$, and in the microcanonical ensemble this yields $S_B$. Can we maximize some other quantity to obtain $S_G$ instead?
On the other hand, $S_G$ is the right candidate for defining a temperature that satisfies the equipartition theorem. Which of these arguments weighs more?
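To be explicit about what I mean by that: writing $T_G=(\partial S_G/\partial E)^{-1}$ and $T_B=(\partial S_B/\partial E)^{-1}$ for the two temperatures, the microcanonical form of the equipartition theorem is, as far as I understand, the exact identity $$ \left\langle x_i\,\frac{\partial H}{\partial x_j}\right\rangle = \delta_{ij}\,\frac{\Omega(E)}{\omega(E)} = \delta_{ij}\,T_G , $$ where the $x_i$ are phase-space coordinates and $k_B=1$. So it is $T_G$ rather than $T_B$ that makes equipartition exact: for the ideal gas above, $E=\tfrac{3}{2}N\,T_G$ exactly, whereas $E=\left(\tfrac{3N}{2}-1\right)T_B$.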

Quantumwhisp
  • Contrary to what is claimed in the OP, the various definitions of entropy are not really equivalent. See discussion and further references in this answer – Roger V. Jan 18 '22 at 08:41
  • @RogerVadim where do I claim that the definitions are equivalent? – Quantumwhisp Jan 18 '22 at 08:43
  • You seem to suggest that they follow from the information-theoretical definition of entropy (which is yet another kind of entropy, if you have already looked at the link). I didn't mean to offend you; I just wanted to stress the point that entropy can mean different things. This is by no means a trivial point. – Roger V. Jan 18 '22 at 08:47
  • @RogerVadim none taken, I just wanted to clarify. Whether or not $S_G$ can follow from the information-theoretical definition is not a claim, but rather the question I pose.

    Or do you want to say that $S_B$, too, isn't equivalent to the information-theoretical definition of entropy? That I did indeed claim.

    – Quantumwhisp Jan 18 '22 at 08:50
  • Jaynes claims that there are 6 different kinds of entropy. I did not study the question deeply enough to distinguish them all, but, e.g., Gibbs defined entropy phenomenologically, as a function that always increases, whereas Boltzmann defined it as the logarithm of the number of states. It is shown in statistical physics that the latter can serve as the former, but I am not sure they must be the same. The information-theoretical entropy equals the Boltzmann entropy in equilibrium, under the assumption that the states are equally probable. – Roger V. Jan 18 '22 at 09:05
  • Then there is the Boltzmann entropy and the entropy as the extremum value (see here). I am not sure they are the same, even though it is suggested, e.g., in this thread. – Roger V. Jan 18 '22 at 09:06

0 Answers