This question suggests that for the microcanonical ensemble, in addition to the "usual" definition of entropy \begin{align} \omega(E) &= \operatorname{Tr}\,\delta(E-H), \\ S_B &= \ln \omega(E), \end{align} (called the Boltzmann entropy), there is also a so-called "Gibbs entropy": $$ \Omega(E)=\int_0^E \omega(e)\,\mathrm{d}e, \qquad S_G=\ln \Omega(E). $$
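To make the difference concrete (a standard textbook sketch I'm adding for reference, assuming a classical ideal gas of $N$ particles in three dimensions, with prefactors suppressed): the phase-space volume scales as $\Omega(E) \propto V^N E^{3N/2}$, so $$ \omega(E) = \frac{\partial \Omega}{\partial E} \propto V^N E^{3N/2-1}, \qquad S_G - S_B = \ln \frac{\Omega(E)}{\omega(E)} = \ln \frac{2E}{3N}, $$ i.e. the two entropies differ only by a subextensive term and agree in the thermodynamic limit.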
I'm confused by these two definitions. First and foremost, why is the second definition also called the "Gibbs entropy", when the usual entropy formula given by Gibbs, $-\sum_i p_i \ln p_i$, reduces to $S_B$ for an equilibrium state in the microcanonical ensemble?
Which one is the more natural to use? One can find equilibrium states by maximizing $-\sum_i p_i \ln p_i$, and this yields $S_B$ (sketched below). Can we maximize a different quantity to obtain $S_G$ as well?
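For reference, the maximization I have in mind is the standard one (a minimal sketch, assuming the shell $[E, E+\Delta E]$ contains $W = \omega(E)\,\Delta E$ microstates): extremizing $-\sum_i p_i \ln p_i - \lambda\left(\sum_i p_i - 1\right)$ with a Lagrange multiplier $\lambda$ gives $-\ln p_i - 1 - \lambda = 0$, hence the uniform distribution $p_i = 1/W$ and $$ S_{\max} = \ln W = \ln \omega(E) + \ln \Delta E, $$ which is $S_B$ up to an additive constant.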
On the other hand, $S_G$ is the right candidate for defining a temperature that satisfies the equipartition theorem (spelled out below). Which of these arguments carries more weight?
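To spell out the equipartition point (continuing the ideal-gas sketch above, with $k_B = 1$ as in the formulas at the top): defining $1/T = \partial S/\partial E$ for each entropy gives $$ T_G = \frac{\Omega(E)}{\omega(E)} = \frac{2E}{3N}, \qquad T_B = \frac{\omega(E)}{\omega'(E)} = \frac{2E}{3N-2}, $$ so equipartition, $E = \tfrac{3}{2} N T$, holds exactly for $T_G$ but only up to $O(1/N)$ corrections for $T_B$. More generally, the microcanonical equipartition theorem reads $\left\langle x_i\, \partial H/\partial x_j \right\rangle = \delta_{ij}\, \Omega(E)/\omega(E)$, which is why $T_G$ is the equipartition temperature.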
Or are you saying that $S_B$, too, isn't equivalent to the information-theoretic definition of entropy? That is indeed what I claimed.
– Quantumwhisp Jan 18 '22 at 08:50