
Reif's book on Statistical Physics is one of the most prescribed books in the subject. For the canonical ensemble, he derives that the formula for entropy is $$S_{\rm can}=k_B\ln\Omega(\bar E)$$ which differs from the microcanonical Boltzmann formula (or definition) $$S_{\rm mic}=k_B\ln\Omega(E)$$ i.e. $E$ has been changed to $\bar{E}$ in the argument of $\Omega$, the number of microstates of the system. Now, there is another expression for entropy for the canonical ensemble which is given by $$S_{\rm can}=-k_B \sum_i p_i\ln p_i,~~{\rm where}~~p_i=\frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}}.$$

  • I want to show that these two different-looking formulae for $S_{\rm can}$ are identical, by starting from either one and reducing it to the other.

Please note that my question is not the same as this one: the argument of $\Omega$ in my formula is $\bar E$, not $E$ (see Reif's book), so it is in no way a duplicate. yu-v's answer seems correct.

2 Answers


(I use $k_B=1$)

Reif gives, in Eq. (6.6.7), the following: $$ \ln(Z) = \ln \Omega(\bar{E}) - \beta \bar{E},$$ which he derives from $Z=\sum_E \Omega(E) \exp(-\beta E)$ and thermodynamic arguments.

So we have

$$ - \sum_i p_i \ln p_i = \frac{1}{Z}\sum_i \beta E_i e^{-\beta E_i} + \sum_i p_i \ln(Z) = \frac{\beta}{Z}\sum_i E_i e^{-\beta E_i}+\ln(Z) = \beta \bar{E}+\ln(Z) = \ln\Omega(\bar{E}),$$ where the first equality uses $\ln p_i = -\beta E_i - \ln(Z)$ and the last is Reif's relation above.
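This identity is easy to check numerically. A minimal sketch (the toy spectrum and the value of $\beta$ are my own hypothetical choices; $\ln\Omega(\bar E)$ is represented via Reif's relation $\beta\bar E+\ln Z$, with $k_B=1$):

```python
import numpy as np

# Hypothetical toy spectrum (any finite set of levels works for this check)
E = np.array([0.0, 1.0, 1.0, 2.5, 3.0])
beta = 0.7

Z = np.sum(np.exp(-beta * E))        # canonical partition function
p = np.exp(-beta * E) / Z            # canonical probabilities p_i
E_bar = np.sum(p * E)                # mean energy

S_gibbs = -np.sum(p * np.log(p))     # -sum_i p_i ln p_i  (k_B = 1)
S_reif = beta * E_bar + np.log(Z)    # beta*E_bar + ln Z = ln Omega(E_bar)

assert np.isclose(S_gibbs, S_reif)
```

The agreement is exact up to floating-point error, since the equality is an algebraic identity for any spectrum and any $\beta$.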

We can also further connect the sum over probabilities in general to the partition function and from it to the free energy $$ - \sum_i p_i \ln p_i = \frac{1}{Z}\sum_i \beta E_i e^{-\beta E_i} + \sum_i p_i \ln(Z) = \frac{\beta}{Z}\sum_i E_i e^{-\beta E_i}+\ln(Z)=$$

$$ -\frac{\beta}{Z}\partial_\beta Z + \ln(Z) = -\beta \partial_\beta \ln(Z) + \ln(Z) = -\beta^2 \partial_\beta \frac{\ln(Z)}{\beta} = \partial_T T\ln(Z)$$

For the canonical ensemble $Z=\exp(-F/T)$, so we get $S_{\rm can}=-\partial_T F$, which is consistent with $F=U-TS$.
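The thermodynamic form $S=\partial_T\, T\ln(Z)$ can likewise be checked against $-\sum_i p_i \ln p_i$ using a finite-difference derivative (the energy levels and temperature below are hypothetical choices, with $k_B=1$):

```python
import numpy as np

# Hypothetical toy spectrum; k_B = 1 throughout
E = np.array([0.0, 1.0, 2.0, 4.0])

def lnZ(T):
    """ln of the canonical partition function at temperature T."""
    return np.log(np.sum(np.exp(-E / T)))

T = 1.3

# S = d/dT [T ln Z], estimated by a central finite difference
h = 1e-6
S_thermo = ((T + h) * lnZ(T + h) - (T - h) * lnZ(T - h)) / (2 * h)

# Gibbs entropy -sum_i p_i ln p_i at the same temperature
p = np.exp(-E / T) / np.sum(np.exp(-E / T))
S_gibbs = -np.sum(p * np.log(p))

assert np.isclose(S_thermo, S_gibbs, atol=1e-5)
```

The finite-difference estimate agrees with the Gibbs entropy to within the step-size error, as the derivation above predicts.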

  • You did not exactly derive this but defined $\beta\bar{E}+\ln(Z)$ to be equal to $\ln \Omega(\bar{E})$. – Solidification Apr 22 '20 at 15:39
  • as I wrote, this is taken from Reif (I even referenced the exact equation number), which is also where you took your $S = \ln \Omega(\bar{E})$. The derivation is outlined there. If you have specific questions about it you can ask but I am not going to copy from the book verbatim. I assume that you have it, as your question originated from it (I also have no way of knowing exactly which steps you assume to be obvious and which are not, as you didn't give any context) –  Apr 22 '20 at 15:54

I think that the comments/answers to the cited question completely answer yours: in statistical physics the probabilities of all the microstates are assumed to be equal, that is $$p_i = \frac{1}{\Omega},$$ where $\Omega$ is the number of microstates (i.e. the number of terms in the sum for the entropy): $$S = -k_B\sum_ip_i\log p_i = -k_B\sum_i\frac{1}{\Omega}\log \frac{1}{\Omega} = -k_B\,\Omega\,\frac{1}{\Omega}(-\log \Omega) = k_B\log\Omega.$$
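For the equal-probability case (which, per the comments below, applies to the microcanonical ensemble rather than the canonical one), this algebra can be verified directly; a minimal sketch with a hypothetical $\Omega$ and $k_B=1$:

```python
import numpy as np

Omega = 10                        # hypothetical number of microstates
p = np.full(Omega, 1.0 / Omega)   # equal probabilities p_i = 1/Omega

S = -np.sum(p * np.log(p))        # -sum_i p_i ln p_i with k_B = 1
assert np.isclose(S, np.log(Omega))
```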

Roger V.
  • "In statistical physics the probabilities of all the microstates are assumed to be equal" No. This is true only for isolated systems. You can put $p_i=1/\Omega$ only for the microcanonical ensemble, for any microstate $i$. For the canonical ensemble, clearly, all microstates are not equally probable. They are given by $p_i=e^{-\beta E_i}/Z$. – Solidification Apr 22 '20 at 04:19
  • @mithusengupta123 Can you use the first formula for the canonical ensemble? The last equation is the general definition of entropy in any circumstances. – Roger V. Apr 22 '20 at 04:25
  • The first formula in my question is what Reif derives in his book for the canonical ensemble. Please note that unlike the microcanonical case where $S=k_B\ln \Omega(E)$, the first formula in my post is $S_{\rm can}=k_B\ln \Omega(\bar E)$. I know, $S=-k_B\sum_i p_i \ln p_i$ is the general definition and the second part of your derivation is correct. But I gave that subscript "can" because I wrote $p_i=e^{-\beta E_i}/Z$ which refers to canonical ensemble. – Solidification Apr 22 '20 at 04:30