
As I understand it, statistically this means that a fall from a high temperature (say $300\text{ K}$) to a middle temperature ($200\text{ K}$) and a rise from a low temperature (say $100\text{ K}$) to the same middle temperature ($200\text{ K}$) by heat exchange do not result in equal and opposite entropy changes: the first loses less entropy than the second gains. So this implies that the former loses fewer microstates than the latter gains.

But how can this be so? Surely every increase of, say, $1\text{ K}$ increases the number of available microstates by the same multiplicative factor, and thus adds the same amount of entropy?

Edit:

I've been reading a text which states quite clearly that $\frac{\partial\log{\Omega}}{\partial U} \approx -\beta$ so the rate at which accessible microstates become available would approximately fall with increased energy. I'm trying to get a handle on why the rate of increase would typically fall with higher values of $U$, and by implication $T$.

As you say, the log is a concave function, which answers a lot of why this behaviour happens, but I am trying to understand it from a physical rather than a mathematical perspective. The use of the log is generally presented as a way to keep the statistical definition consistent with the state-variable definition of entropy, which is fine. But that leads to a corollary question: why is entropy defined in such a way that it is a concave function of the number of states, rather than the number of states per se? In other words, why is the increase in the number of states available to a system with few initial states, versus the decrease in states for a system with many initial states, sufficient to drive spontaneous processes statistically?

Sam Keays
  • If you want to use the statistical mechanics definition of entropy, then you have to calculate the probability of states and use $S=-k_B \sum_i p_i\ln{p_i}$. Entropy is not simply proportional to the number of states, and the probability of states is not a linear function of temperature. Depending on the system it can be very hard to calculate $S$ from first principles. – CuriousOne Sep 07 '14 at 02:18

3 Answers


I've been reading a text which states quite clearly that $\frac{\partial\log{\Omega}}{\partial U} \approx -\beta$ so the rate at which accessible microstates become available would approximately fall with increased energy. I'm trying to get a handle on why the rate of increase would typically fall with higher values of $U$, and by implication $T$.

The minus sign in $\frac{\partial\log{\Omega}}{\partial U} \approx -\beta$ does not belong there. Nor does the "approximately equal to" sign. A better way to write this is $$\frac 1 T \equiv \frac{\partial S}{\partial U} = k \frac{\partial\log{\Omega}}{\partial U}$$ where $k$ is the Boltzmann constant. This is the statistical mechanics definition of temperature. You are misinterpreting what the above says (and very much so).

As an example, consider a monatomic ideal gas comprising $N$ atoms that occupies a volume $V$ and is at some temperature $T$. For an ideal gas, the energy $U$ is a function of $N$ and $T$: $U = \frac 3 2 N k T$. Thus $\frac 1 T = \frac 3 2 N k \frac 1 U$ for an ideal gas.

Combining this with the definition of temperature yields $\frac 3 2 N \frac 1 U = \frac{\partial\log{\Omega}}{\partial U}$, or $\frac {\partial \Omega}{\Omega} = \frac 3 2 N \frac{\partial U}{U}$. Treating this as an ordinary differential equation yields $\Omega = c_1 U^{\frac 3 2 N}$, where $c_1$ is a constant of integration. Since the number of microstates (and hence the entropy) is a function of energy, number of atoms, and volume, the constant of integration can at most depend on $V$ and $N$: $$ \Omega(U,V,N) = f_1(V,N)\, U^{\frac 3 2 N}$$ In other words, the number of microstates increases drastically as energy increases.

I derived the above just using the statistical mechanics definition of temperature and the relation between energy and temperature for an ideal gas. A more rigorous counting of microstates yields the Sackur-Tetrode equation, $$S= Nk \left( \log \left( \frac V N \left( \frac {4\pi m U}{3Nh^2} \right)^{\frac 3 2} \right) + \frac 5 2 \right)$$
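As a quick numerical sanity check (my own sketch, not part of the derivation above; the particle mass and gas parameters below are arbitrary illustrative choices), a central difference of the Sackur-Tetrode entropy with respect to $U$ recovers $1/T$:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
m = 6.6464731e-27   # mass of a helium-4 atom, kg (illustrative choice)

def sackur_tetrode(N, V, U):
    """Sackur-Tetrode entropy S(U, V, N) of a monatomic ideal gas."""
    return N * k * (math.log(V / N * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N, V, T = 1e22, 1e-3, 200.0   # 1e22 atoms in one litre at 200 K
U = 1.5 * N * k * T           # ideal-gas energy U = (3/2) N k T
dU = 1e-6 * U                 # small step for the central difference

dS_dU = (sackur_tetrode(N, V, U + dU) - sackur_tetrode(N, V, U - dU)) / (2 * dU)
print(dS_dU)    # ~0.005 K^-1
print(1.0 / T)  # 0.005 K^-1, matching dS/dU = 1/T
```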

To illustrate that entropy does increase, suppose there are two containers separated by a wall, each having a volume $V$ and each holding $N$ atoms of the same monatomic ideal gas. The gas in one container is at a temperature $T_1$ while the gas in the other is at a different temperature $T_2$. Now remove the wall. The gases will eventually merge and come into thermal equilibrium, forming a gas of $2N$ particles that occupies a volume $2V$ and has a temperature $T_f = (T_1+T_2)/2$. I'll leave it up to you to show that this final entropy is always higher than the original entropy. One way you can do this is to show that the original entropy is equivalent to that of a gas of $2N$ particles that occupies a volume $2V$ and has a temperature $T_i = \sqrt{T_1T_2}$. Since the geometric mean of two different positive numbers is always less than their arithmetic mean, the entropy increases.
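Here is a short numerical check of that exercise (again my own sketch; the gas parameters are arbitrary):

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
m = 6.6464731e-27   # helium-4 atom mass, kg (illustrative)

def sackur_tetrode(N, V, U):
    return N * k * (math.log(V / N * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

def energy(N, T):
    return 1.5 * N * k * T    # monatomic ideal gas: U = (3/2) N k T

N, V = 1e22, 1e-3             # atoms and volume of each container
T1, T2 = 100.0, 300.0         # the two initial temperatures, K

S_before = sackur_tetrode(N, V, energy(N, T1)) + sackur_tetrode(N, V, energy(N, T2))
S_after  = sackur_tetrode(2*N, 2*V, energy(2*N, (T1 + T2) / 2))       # arithmetic mean
S_geom   = sackur_tetrode(2*N, 2*V, energy(2*N, math.sqrt(T1 * T2)))  # geometric mean

print(S_before - S_geom)   # ~0: the initial state matches the geometric-mean gas
print(S_after - S_before)  # > 0: the merged gas has strictly more entropy
```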

David Hammen

Why "Surely every increase of say 1K increases the number of available microstates by the same multiplicative factor". It's not obvious to me why you say that.

Generally, the more energy a system has, the more states become available for the addition of small amounts of additional energy. Consider an ideal gas where each particle contributes $p^2/2m$ to the total energy of the system. For three particles, the total energy is $E=p_1^2/2m + p_2^2/2m + p_3^2/2m$. That is, in "$p$-space" surfaces of constant energy are spheres. As you increase the energy of the system, the area of the sphere increases. There are more places for any additional energy to go.

States of a system occupy a finite volume in phase ($x,p$) space. It's easier to get a handle on this within the framework of quantum mechanics, where we can make some sense of the fact that a state in phase space occupies a volume $\Delta x\,\Delta p \approx h$. The same fact falls out of classical statistical mechanics, but there the value of $h$ is not specified at all; it appears simply as a parameter.

So the number of states available to accommodate more energy is proportional to the volume of a shell of radius $r = \sqrt{2mE}$ and thickness set by $\Delta E$. As $E$ increases, the volume of that shell increases.
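A crude numerical sketch of this counting (my addition, in arbitrary units with a made-up momentum grid): tally discretized momentum states in a thin shell of radius $r$ and compare with the total inside the sphere.

```python
import numpy as np

# Count discretized momentum states (crude stand-ins for the
# phase-space cells above) for the three-particle gas.
p = np.linspace(-30.0, 30.0, 121)                # one momentum axis, arbitrary units
p1, p2, p3 = np.meshgrid(p, p, p, sparse=True)   # 3-D momentum grid
radius = np.sqrt(p1**2 + p2**2 + p3**2)

dr = 1.0
for r in (5.0, 10.0, 20.0):
    n_shell = np.count_nonzero((radius >= r) & (radius < r + dr))  # states in the shell
    n_total = np.count_nonzero(radius < r)                         # states inside the sphere
    print(r, n_shell, n_total)   # n_shell grows ~ r^2, n_total ~ r^3
```

Doubling $r$ roughly quadruples the shell count and multiplies the total by eight, which is the $r^2$ versus $r^3$ scaling discussed in the comments below.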

garyp
  • So if I read you correctly, the number of new energy states available (say $n_{e}$) is proportional to $r^2$ but the total number of states is proportional to $r^3$, so that the ratio of new states to total states is proportional to $r^{-1}$? – Sam Keays Sep 07 '14 at 14:40
  • Yes, that's what I'm trying to say, for my three-particle ideal gas. – garyp Sep 07 '14 at 18:52

The statistical definition of temperature is

$$\frac{1}{kT}=\frac{d S}{d E}$$

So if $T_1 < T_2$, then $kT_1 < kT_2$ so $\frac{1}{kT_2} < \frac{1}{kT_1}$.

Therefore $$\frac{d S_2}{d E}=\frac{1}{kT_2}< \frac{1}{kT_1}=\frac{d S_1}{d E}.$$

Assuming all the temperatures involved are positive, suppose the two systems exchange an amount of energy $|\Delta E|$ that flows out of system 2. Then the change in entropy of system 2 is $\Delta S_2 \approx -\frac{d S_2}{d E}|\Delta E|$, whereas the change in entropy of system 1 is $\Delta S_1 \approx \frac{d S_1}{d E}|\Delta E|$. Which is larger in magnitude? Both have a factor of $|\Delta E|$, and $\frac{d S_2}{d E}< \frac{d S_1}{d E}$, so the hotter one ($T_2$) loses less entropy than the colder one gains. OK, that was definitions, logic, and math. But it's the lead-up to the real answer:

Hot things decrease their entropy less than cold things increase them when they exchange equal amounts of energy because that is the literal definition of temperature.
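To make this concrete with the temperatures from the question: if a small amount of heat $Q$ flows from the $300\text{ K}$ body to the $100\text{ K}$ body, then $$\Delta S_{\text{total}} = \frac{Q}{100\text{ K}} - \frac{Q}{300\text{ K}} = \frac{Q}{150\text{ K}} > 0,$$ i.e. the cold body gains three times the entropy the hot body loses.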

Timaeus