
Computing the entropy of a single particle has a number of pitfalls; however, there seems to be a classical definition that should work.

This answer provides a definition of temperature for $N$ classical particles by using a probability distribution $D$ on phase space, setting up the maximization problem with Lagrange multipliers, and finding the $D$ that maximizes the entropy $S=-k\int D\log D$. The result is the Boltzmann-Gibbs distribution $D=\frac{1}{Z}e^{-\beta u}$, where $u(x,p)=\sum_i p_i^2/(2m)$, $\beta = \frac{1}{kT}$, and $Z = \frac{1}{h} \int e^{-\beta u} \, dx \, dp$. Setting $N=1$ gives the temperature of a single particle.
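To make the maximization claim concrete, here is a small discretized sanity check (my own toy construction, not from the linked answer): on a finite set of energy levels, perturbing the Boltzmann distribution in any direction that preserves normalization and mean energy strictly lowers the entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete "phase space": 40 energy levels (arbitrary units)
u = np.linspace(0.0, 5.0, 40)
beta = 1.0

# Boltzmann distribution at inverse temperature beta
D = np.exp(-beta * u)
D /= D.sum()

# Random direction, projected so that sum(v) = 0 and v @ u = 0,
# i.e. the perturbation preserves normalization and mean energy
v = rng.normal(size=u.size)
A = np.vstack([np.ones_like(u), u])
v -= A.T @ np.linalg.solve(A @ A.T, A @ v)

q = D + 3e-5 * v      # a nearby distribution with the same mean energy
assert q.min() > 0    # still a valid probability distribution

S = lambda p: -np.sum(p * np.log(p))
assert S(q) < S(D)    # the perturbation lowers the entropy
```

Because the first-order variation of $S$ vanishes exactly along such constraint-preserving directions, the entropy drop here is second order in the perturbation size.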

Since entropy is the quantity being maximized, we can just plug in the Boltzmann distribution. Let's assume the particle is in a box of length $L$ in the $x$ direction (say $x \in [-L/2, L/2]$, so $\int dx = L$):

$$S(D) = -k\int \frac{dx\, dp}{h}\, \frac{1}{Z}e^{-\beta u} \log\left(\frac{1}{Z}e^{-\beta u}\right)$$ $$=\frac{-kL}{hZ}\int dp\, e^{-\beta u}(-\beta u - \log Z)$$ $$=\frac{\beta k L}{hZ}\int dp\, \frac{p^2}{2m}\exp\left(\frac{-\beta p^2}{2m}\right)+\frac{k L \log Z}{hZ}\int dp\, \exp\left( \frac{-\beta p^2}{2m}\right)$$ $$=\frac{k}{2}\left(1 + \log\left(\frac{2\pi m L^2}{\beta h^2}\right) \right)$$

where $Z=\frac{L}{h}\sqrt{\frac{2\pi m}{\beta}}$.
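The algebra above can be checked symbolically. A quick sketch with sympy (symbol names are mine), reproducing both $Z$ and the final entropy:

```python
import sympy as sp

# All parameters positive so the Gaussian integrals converge
p, m, beta, L, h, k = sp.symbols('p m beta L h k', positive=True)

u = p**2 / (2*m)        # kinetic energy of the free particle
boltz = sp.exp(-beta*u)

# Partition function: Z = (1/h) * Int dx dp e^{-beta u}, with Int dx = L
Z = (L/h) * sp.integrate(boltz, (p, -sp.oo, sp.oo))
assert sp.simplify(Z - (L/h)*sp.sqrt(2*sp.pi*m/beta)) == 0

# Entropy S = -k Int (dx dp / h) D log D, with log D expanded by hand
D = boltz / Z
logD = -beta*u - sp.log(Z)
S = -k*(L/h) * sp.integrate(D*logD, (p, -sp.oo, sp.oo))

# Compare against k/2 * (1 + log(2 pi m L^2 / (beta h^2)))
S_expected = (k/sp.Integer(2)) * (1 + sp.log(2*sp.pi*m*L**2/(beta*h**2)))
vals = {m: 1, beta: 2, L: 3, h: sp.Rational(1, 2), k: 1}
assert abs(float((S - S_expected).subs(vals))) < 1e-10
```

Note that $Z$ is dimensionless ($L\sqrt{2\pi m/\beta}$ has units of length times momentum, i.e. action, which the $1/h$ cancels), so $\log Z$ is well defined.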

EDIT 1: The phase space measure $dx\, dp$ should be $dx\, dp/h$, as suggested in the comments; this applies to the $S$ integral as well.

EDIT 2: I forgot parentheses around $2m$ when integrating, which gave the wrong value for the Gaussian integral $Z$.
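For what it's worth, the parenthesis slip in EDIT 2 is easy to reproduce. A short sympy check (my own illustration) contrasts the intended exponent with the mis-parsed one:

```python
import sympy as sp

p, m, beta = sp.symbols('p m beta', positive=True)

# Intended integrand: exp(-beta p^2 / (2m))
good = sp.integrate(sp.exp(-beta*p**2/(2*m)), (p, -sp.oo, sp.oo))

# What "-beta p^2/2m" becomes without parentheses: (-beta p^2/2) * m
bad = sp.integrate(sp.exp(-beta*p**2/2*m), (p, -sp.oo, sp.oo))

assert sp.simplify(good - sp.sqrt(2*sp.pi*m/beta)) == 0   # m upstairs, as it should be
assert sp.simplify(bad - sp.sqrt(2*sp.pi/(beta*m))) == 0  # m ends up downstairs
```

Multiplying the mis-parsed result by $L/h$ gives exactly the $\sqrt{\frac{2\pi L^2}{m \beta h^2}}$ quoted in the comments.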

  • There seems to be an issue with units. What is wrong with this calculation? Check-my-work questions are off-topic on this site. – Ghoster Jan 04 '24 at 20:19
  • @Ghoster Is there a way to modify the question that would make it on-topic? – Jackson Walters Jan 04 '24 at 23:26
  • I'm only skimming what you wrote, but I get the sense that you might be falling into the trap of not carefully distinguishing probabilities and probability densities (the former are unitless and appear in sums, while the latter have dimensions and appear in integrals). Note that there are subtleties in defining the entropy for a continuous distribution. – Andrew Jan 05 '24 at 02:54
  • @Andrew Perhaps. Here $D$ is a probability and one could consider $D dx dp$ to be a probability density. The core issue is just with $Z$. Note $\beta u$ is unitless (inverse energy times energy), but when you do the integral, you get $\beta$ and $m$ factors popping out, under square roots even. This is a problem since we then take $\log Z$. – Jackson Walters Jan 05 '24 at 03:07
  • I agree with @Andrew's comment. It is not only $Z$ requiring a $1/h$ factor but every integral on the one-particle phase space. You can understand it by thinking of $dx\, dp/h$ as a measure counting the number of states. – GiorgioP-DoomsdayClockIsAt-90 Jan 05 '24 at 06:50
  • Related to what @GiorgioP-DoomsdayClockIsAt-90 is saying, I think essentially you want to set $\Delta=\hbar$ in this piece describing how to define entropy for continuous distributions (for the reasons Giorgio is mentioning about how the uncertainty principle means you can think of the $x$ and $p$ space as effectively discretized in volumes of order $\hbar$). – Andrew Jan 05 '24 at 15:51
  • @GiorgioP-DoomsdayClockIsAt-90 Ah! Okay, that makes sense and is intuitive. Implementing that fix removes a factor of $h$, so it looks like $k/2$+log correction. Now I need to figure out why $Z$ has units, and therefore $\log(Z)$ doesn't make sense. – Jackson Walters Jan 05 '24 at 19:26
  • Why do you think $Z$ has such value or such units? Did you try to calculate the integral $Z= \frac{1}{h}\int dqdp~ e^{-\beta H}$? It is obviously dimensionless... – Ján Lalinský Jan 05 '24 at 19:59
  • @JánLalinský It should be! I got $Z = \sqrt{\frac{2\pi L^2}{m \beta h^2}}$. It's just a Gaussian integral, but the parameters $m$ and $\beta$ control the variance. Oh wait ... I did it in Mathematica and forgot to put parenthesis around the $2m$. – Jackson Walters Jan 05 '24 at 20:05

0 Answers