
In the Gibbs ensemble, we assume that along with energy exchange (as heat), there is also exchange of an extensive parameter $x$ conjugate to a field $f$ (so that $dE = dQ + f\,dx$), such that $E_{\text{sys}}+E_{\text{env}}=E_0$ and $x_{\text{sys}}+x_{\text{env}}=x_0$, where $E_0$ and $x_0$ are both constants.

Then we observe that the probability $P(\mu)$ of finding the system in a microstate $\mu$ is proportional to the number of environment microstates $\mathcal N_{\text{env}}(E_0-E_\mu,\, x_0-x_\mu)$.

Then, using that the system is much smaller than the environment (so that higher-order derivatives can be neglected) and that $\partial S/\partial E = 1/T$, we get that $P(\mu)$ is proportional to $$ \exp\left(-\frac{E_\mu}{k_BT} -\frac{1}{k_B}\frac{\partial S}{\partial x}x_\mu\right). $$
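Explicitly, the step I am using is the first-order expansion of $\ln\mathcal N_{\text{env}}$, with $S_{\text{env}} = k_B\ln\mathcal N_{\text{env}}$: $$ \ln\mathcal N_{\text{env}}(E_0-E_\mu,\, x_0-x_\mu) \approx \ln\mathcal N_{\text{env}}(E_0, x_0) - \frac{1}{k_B}\frac{\partial S}{\partial E}E_\mu - \frac{1}{k_B}\frac{\partial S}{\partial x}x_\mu, $$ where the $\mu$-independent first term gets absorbed into the normalization.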

Now all that remains is to show that $$ \frac{\partial S}{\partial x}= -\frac{f}{T}, $$ and we'll get the celebrated $e^{-\beta(E_\mu-fx_\mu)}$. But I have no clue how to prove this.


I tried considering two systems labeled 1 and 2, isolated from the rest of the world, exchanging energy and $x$ such that \begin{align} E_1 + E_2 &= \text{const}, \quad\text{and}\\ x_1+x_2 &=\text{const}. \end{align}

Then we need to extremize $$ S_1(E_1, x_1) + S_2(E_2, x_2)-\lambda(E_1+E_2)-\mu(x_1+x_2), $$ which yields \begin{alignat}{2} \frac{\partial S_1}{\partial E_1} &= \frac{\partial S_2}{\partial E_2} &= \lambda &\stackrel{\text{def}}{=}\frac{1}{T},\quad\text{and}\\ \frac{\partial S_1}{\partial x_1} &= \frac{\partial S_2}{\partial x_2} &= \mu. \end{alignat}

But this was a dead end as I could not show $\mu=-f/T$ here.

Atom
  • I have not studied statistical mechanics yet. But in classical thermodynamics, we can find $\frac{\partial S}{\partial x}$ as follows. If $S(E,x)$, then by the first law of thermodynamics, $TdS=dE-fdx\implies dS=\frac{dE}{T}-\frac{fdx}{T}$. Hence, $\frac{\partial S}{\partial x}=-f/T$. I have not studied ensembles etc., but in classical thermodynamics this identity is a consequence of the first law when the fundamental relation for the entropy is given. – Iti Apr 22 '21 at 10:14
  • @Iti I would have been content with that if I knew why $dQ = TdS$ holds, which I don't, at least from statistical mechanics. :( – Atom Apr 22 '21 at 10:29
  • @Atom It is much simpler to work with the thermodynamic identity than trying to prove with statistical mechanics. In thermodynamics $dQ = T dS$ directly comes from the definition of entropy. – GiorgioP-DoomsdayClockIsAt-90 Apr 22 '21 at 13:37
  • @GiorgioP Isn't $dQ = TdS$ valid only when heat alone is exchanged (and nothing else, like volume, changes)? I'm rusty on my thermodynamics. – Atom Apr 22 '21 at 15:47
  • @Atom Not really. Think of a Carnot cycle. Heat is exchanged along the two isotherms. At the same time, some work is done. – GiorgioP-DoomsdayClockIsAt-90 Apr 22 '21 at 20:06
  • @Atom In your proof, what you are trying to use in the last paragraph is not statistical mechanics but thermodynamics. Indeed, you can prove that the parameter $T$ that comes up from $\frac{\partial E}{\partial S} = T$ has the usual properties of temperature. I don't think you can prove the equation you would like within the framework of Stat. Mech. The same goes for $fdx$. Hope I explain myself correctly. – Mark_Bell Apr 23 '21 at 01:28
  • @GiorgioP Yes, I was wrong. But now, to use $dQ=TdS$, I need to show that the thermodynamic entropy is the same as the one defined in StatMech. Can that be done? – Atom Apr 23 '21 at 04:43
  • @Atom Yes, that can be done, although, depending on the level of generality and rigor, it can be a more or less easy task. The most general proof requires establishing the existence of the so-called thermodynamic limit. An apparently simple proof, as found in many textbooks, is given for instance here: https://physics.stackexchange.com/questions/301477/how-does-the-statistical-definition-of-entropy-reduce-to-heat-engine-entropy . Notice, however, that if you need a completely rigorous proof, some assumptions in such an approach have not been explicitly justified. – GiorgioP-DoomsdayClockIsAt-90 Apr 23 '21 at 05:33

1 Answer


I think @Iti, @GiorgioP, and @Mark_Bell are right in their comments. The connection $$ \beta\rightarrow\frac{1}{T}, $$ $$ S_{\textrm{ensemble}}\rightarrow S_{\textrm{Carnot}} $$ is not obvious. It's genius on the part of Boltzmann and Gibbs. There's a reason why people write things in stone, and Boltzmann's intuition is written on his tombstone:

https://www.atlasobscura.com/places/boltzmanns-grave

You need thermodynamics to make this connection. To be more specific, that kind of "circular calculus" that characterises it. For your system, $$ dE=TdS+fdx. $$ Then it must be that $$ dE=\left.\frac{\partial E}{\partial S}\right|_{x}dS+\left.\frac{\partial E}{\partial x}\right|_{S}dx, $$ so $$ \left.\frac{\partial E}{\partial x}\right|_{S}=f. $$

But also, $$ dS=\frac{1}{T}dE-\frac{f}{T}dx, $$ and then $$ \left.\frac{\partial S}{\partial E}\right|_{x}=\frac{1}{T}, \qquad \left.\frac{\partial S}{\partial x}\right|_{E}=-\frac{f}{T}. $$

If you're not convinced, given that your system is entirely analogous to a $P$, $V$, $T$ system, go to your most reliable book on statistical mechanics and see how they derive the concept of pressure from a "purely statistical" point of view. You will find the same thermodynamic argument with just a change of a minus sign: $f$ will be $-P$, and $x$ will be $V$.
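If a concrete check helps: here is a minimal sketch (my own illustration, not part of the argument above) that takes the standard Sackur-Tetrode entropy of a monatomic ideal gas as $S(E,V)$ and verifies symbolically that $\left.\partial S/\partial V\right|_{E} = P/T$, which is $-f/T$ with $f=-P$ and $x=V$:

```python
import sympy as sp

# Symbols: particle number N, volume V, internal energy E, particle mass m,
# Planck constant h, Boltzmann constant k_B (all taken positive)
N, V, E, m, h, kB = sp.symbols('N V E m h k_B', positive=True)

# Sackur-Tetrode entropy of a monatomic ideal gas, used here only as a
# concrete S(E, V) on which to test the identity
S = N*kB*(sp.log((V/N)*(4*sp.pi*m*E/(3*N*h**2))**sp.Rational(3, 2))
          + sp.Rational(5, 2))

# (dS/dE)_V = 1/T  ->  3*N*k_B/(2*E), consistent with E = (3/2) N k_B T
invT = sp.diff(S, E)

# (dS/dV)_E  ->  N*k_B/V
dSdV = sp.simplify(sp.diff(S, V))

# Ideal-gas equation of state P V = N k_B T, so P/T = N k_B/V
P_over_T = N*kB/V

print(sp.simplify(invT))             # 3*N*k_B/(2*E)
print(sp.simplify(dSdV - P_over_T))  # 0, i.e. (dS/dV)_E = P/T = -f/T with f = -P
```

The same check works for any other explicit $S(E,x)$ whose equation of state you know.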

joigus