  • Boltzmann's entropy: $$S_B=k_B \ln(\Omega)$$ where $k_B$ is the Boltzmann constant and $\Omega$ is the number of accessible microstates of the system.
  • Statistical entropy: $$S_s=-k\sum_i P_i \ln(P_i)$$ where $k$ is a positive constant and $P_i$ is the probability of being in the accessible microstate $i$.

Observation: if $k=k_B$ and $P_i=1/\Omega$, the statistical entropy becomes equal to the Boltzmann entropy, $S_s=S_B$. So it seems that the statistical entropy equals the Boltzmann entropy only if the probability distribution is the uniform distribution $P_i=1/\Omega$.
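For concreteness, writing the substitution out (the sum runs over $\Omega$ equally probable microstates):

$$S_s=-k_B\sum_{i=1}^{\Omega}\frac{1}{\Omega}\ln\!\left(\frac{1}{\Omega}\right)=k_B\ln(\Omega)=S_B.$$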

How is it possible that, in the canonical ensemble at equilibrium, $S_B=S_s$ while, at the same time, $P_i \neq 1/\Omega$?

The only answer I have come up with is that $S_B=S_s$ holds only for isolated systems, because for an isolated system the distribution is indeed $P_i=1/\Omega$. However, reading online, it seems that $S_B=S_s$ holds for every equilibrium state, so I am not sure.
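To make the puzzle concrete: for the canonical distribution $P_i=e^{-\beta E_i}/Z$, taking $k=k_B$, the statistical entropy evaluates to

$$S_s=-k_B\sum_i P_i\,\ln(P_i)=-k_B\sum_i P_i\left(-\beta E_i-\ln Z\right)=k_B\beta\langle E\rangle+k_B\ln(Z),$$

which is not obviously of the form $k_B\ln(\Omega)$, even though this is an equilibrium distribution.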

SimoBartz
  • Related https://physics.stackexchange.com/q/98948/226902 https://physics.stackexchange.com/q/545714 https://physics.stackexchange.com/q/722051 and possible duplicate: https://physics.stackexchange.com/q/155351/226902 – Quillo Oct 12 '22 at 21:58
  • You will find there is a lot of confusion on this site regarding entropy, and this is a reflection of the even greater confusion of the world at large regarding entropy. It is not helped by the fact that we use the word "entropy" to mean a lot of different things. – hft Oct 12 '22 at 22:37
  • 1
    Have a look at Eq (18) and the surrounding explanation from this paper: https://www.informationphilosopher.com/solutions/scientists/jaynes/Jaynes_Gibbs_Boltzmann.pdf – hft Oct 12 '22 at 22:37
  • Also: https://physics.stackexchange.com/questions/172717/entropy-s-for-canonical-nvt-and-isobaric-npt-ensemble – hft Oct 12 '22 at 22:52

0 Answers