
Could someone provide me with a mathematical proof of why a system with an absolute negative Kelvin temperature (such as that of a spin system) is hotter than any system with a positive temperature, in the sense that if a negative-temperature system and a positive-temperature system come into contact, heat will flow from the negative- to the positive-temperature system?

Nat

8 Answers


Arnold Neumaier's comment about statistical mechanics is correct, but here's how you can prove it using just thermodynamics. Let's imagine two bodies at different temperatures in contact with one another. Let's say that body 1 transfers a small amount of heat $Q$ to body 2. Body 1's entropy changes by $-Q/T_1$, and body 2's entropy changes by $Q/T_2$, so the total entropy change is $$ Q\left(\frac{1}{T_2}-\frac{1}{T_1}\right). $$ This total entropy change must be positive (according to the second law), so if $1/T_1>1/T_2$ then $Q$ has to be negative, meaning that body 2 can transfer heat to body 1 rather than the other way around. It's the sign of $\frac{1}{T_2}-\frac{1}{T_1}$ that determines the direction that heat can flow.

Now let's say that $T_1<0$ and $T_2>0$. Now it's clear that $\frac{1}{T_2}-\frac{1}{T_1}>0$ since both $1/T_2$ and $-1/T_1$ are positive. This means that body 1 (with a negative temperature) can transfer heat to body 2 (with a positive temperature), but not the other way around. In this sense body 1 is "hotter" than body 2.
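As a numerical sanity check on the sign argument above, here is a short Python sketch (with illustrative temperature values assumed) that evaluates the coefficient of $Q$ in the total entropy change and reads off the allowed direction of heat flow:

```python
# Sanity check of the entropy argument above: a transfer of heat Q from
# body 1 to body 2 changes the total entropy by dS = Q*(1/T2 - 1/T1),
# and the second law requires dS > 0.

def allowed_direction(T1, T2):
    """Which body can give up heat, from the sign of 1/T2 - 1/T1."""
    coldness_diff = 1.0 / T2 - 1.0 / T1  # coefficient of Q in dS
    # dS = Q * coldness_diff > 0, so Q must have the same sign as coldness_diff
    return "1 -> 2" if coldness_diff > 0 else "2 -> 1"

print(allowed_direction(300.0, 400.0))  # both positive: hotter body 2 gives heat, "2 -> 1"
print(allowed_direction(-50.0, 300.0))  # negative T1 acts hotter than any positive T2: "1 -> 2"
```

Note that the function never compares $T_1$ and $T_2$ directly; only the inverse temperatures matter, which is exactly the point of the argument.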

N. Virgo

From a fundamental (i.e., statistical mechanics) point of view, the physically relevant parameter is coldness = inverse temperature $\beta=1/k_BT$. This changes continuously. If it passes from a positive value through zero to a negative value, the temperature changes from very large positive to infinite (with indefinite sign) to very large negative. Therefore systems with negative temperature have a smaller coldness and hence are hotter than systems with positive temperature.
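The continuity of coldness through $T=\pm\infty$ can be seen numerically. A small Python sketch (in assumed units with $k_B=1$), listing temperatures from coldest to hottest:

```python
# Coldness beta = 1/(k_B T), in assumed units with k_B = 1.
# Listing systems from coldest to hottest, T runs from small positive values
# through +infinity (beta = 0) into negative values; beta simply decreases.
temps = [0.1, 1.0, 100.0, float("inf"), -100.0, -1.0, -0.1]
betas = [1.0 / T for T in temps]
# Hotter system <=> smaller coldness: the beta sequence is strictly decreasing.
assert all(b1 > b2 for b1, b2 in zip(betas, betas[1:]))
print(betas)
```

The temperature jumps from $+\infty$ to $-\infty$ along this sequence, but the coldness passes smoothly through zero.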

Some references:

D. Montgomery and G. Joyce. Statistical mechanics of “negative temperature” states. Phys. Fluids, 17:1139–1145, 1974.
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19730013937_1973013937.pdf

E.M. Purcell and R.V. Pound. A nuclear spin system at negative temperature. Phys. Rev., 81:279–280, 1951.

L.D. Landau and E.M. Lifshitz. Statistical Physics: Part 1, Section 73.

Example 9.2.5 in my online book Classical and Quantum Mechanics via Lie algebras.

Glorfindel
  • "From a fundamental (i.e., statistical mechanics) point of view, the physically relevant parameter is coldness". I am afraid that is not correct. It is energy, as shown in this paper. For instance, (inverse) temperature does not in general allow one to determine the direction of heat flow, because it is only a derivative of $S$. – jkds Oct 15 '18 at 11:26
  • @jkds: Of course, internal energy, temperature, pressure, etc. are all physically relevant. What I meant is that coldness (inverse temperature) is more relevant than temperature itself. – Arnold Neumaier Oct 15 '18 at 12:18
  • Sure, but what the authors showed was that temperature is not in one-to-one correspondence with a system's macrostate. The same system can have the same temperature at completely different internal energies. So temperature, unlike $E/N$, can be a misleading descriptor of the system. – jkds Oct 22 '18 at 09:36
  • @jkds: In the canonical ensemble, the macrostate is determined by the temperature; in other ensembles (such as the grand canonical one), one of course needs additional parameters. Then temperature and internal energy are no longer in 1-1 correspondence but related by an equation of state involving the other parameters. But my answer is in any case independent of heat flow. – Arnold Neumaier Oct 22 '18 at 13:29
  • No that's not the problem. The problem is that for a non-convex DoS (think a staircase) $\frac{\partial S}{\partial E}$ can take the same value and therefore the same temperature at different energies. That's all. You only need the microcanonical ensemble to see it. – jkds Oct 23 '18 at 14:10
  • @jkds: Temperature is a property of the thermodynamic limit, where the microcanonical ensemble is equivalent to the canonical ensemble. In the canonical ensemble the 1-1 correspondence is self-evident. Moreover, one can prove convexity. Thus if you assume a non-convex entropy functional, you are in the thermodynamic situation only after performing the Maxwell construction (corresponding here to taking the convex envelope). – Arnold Neumaier Oct 23 '18 at 16:40
  • I think it might be easier to explain if I wrote an answer myself, with figures. But the TL;DR is: (i) the microcanonical ensemble is at the basis of stat. mech.; if something fails there, you can't repair it later. (ii) Convexity of $S(E)$, while true for normal systems like ideal gases etc., is not generally the case. Textbook counterexample: $N$ spins on a line in 1D: $S(E)$ looks like this $\bigcap$, concave. (iii) Ensembles are generally not equivalent. But that's a more subtle point. – jkds Oct 24 '18 at 07:34

Take a hydrogen gas in a magnetic field. The nuclei can be aligned with the field (low energy) or against it (high energy). At low temperature most of the nuclei are aligned with the field, and no matter how much I heat the gas I can never make the population of the higher-energy state exceed that of the lower-energy state. All I can do is make them almost equal, as described by the Boltzmann distribution.

Now I take another sample of hydrogen where I have created a population inversion, maybe by some method akin to that used in a laser, so there are more nuclei aligned against the field than with it. This is my negative temperature material.

What happens when I mix the samples? Well, I would expect the population-inverted gas to "cool" and the normal gas to "heat", so that my mixture ends up with the Boltzmann distribution of aligned and opposed nuclei.
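A small Python sketch of this two-level picture (in assumed units with $\Delta E = k_B = 1$) shows that no positive temperature can invert the populations, while a formal negative temperature describes exactly the inverted sample:

```python
import math

# Two-level nuclei in a field: population ratio n_against/n_with = exp(-dE/(k_B T)).
# Hypothetical units with dE = k_B = 1. For any T > 0 the high-energy state stays
# less populated; a formal T < 0 describes the inverted population.
def excited_fraction(T):
    r = math.exp(-1.0 / T)   # Boltzmann factor for the high-energy state
    return r / (1.0 + r)     # fraction of nuclei aligned against the field

assert excited_fraction(0.5) < 0.5    # cold gas: mostly aligned with the field
assert excited_fraction(1e6) < 0.5    # arbitrarily hot, still just under 50%
assert excited_fraction(-1.0) > 0.5   # negative T: population inversion
```

As $T \to +\infty$ the fraction approaches but never exceeds one half, which is the "temperature ceiling" described above.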

John Rennie

Ah, but who says that negative absolute temperatures exist at all? This is not without controversy. There's a Nature paper here which challenges the very existence of negative absolute temperatures, arguing that negative temperatures come about due to a poor method of defining the entropy, which in turn is used to calculate the temperature.

Other people insist that these negative temperatures are "real".

So, depending on which side of this debate you align yourself with, these systems can be described with positive temperatures (and behave accordingly), or negative temperatures which have very exotic properties.

Matt Thompson
  • This does not answer the question (the proof that is asked for does not rely on whether such systems actually exist or not). – ACuriousMind Jun 30 '15 at 10:26
  • The one thing that everyone agrees on is that their behavior is a bit surprising, and that is to be expected as we don't encounter systems with temperature ceilings in day-to-day life. In any case, that paper is cited in the comments on most of our "negative absolute temperature" questions. I can assure you that most of the answer authors are aware of it. But the question presupposes the definition of temperature which generates 'negative' values, and this post doesn't really address it. – dmckee --- ex-moderator kitten Jul 01 '15 at 03:04
  • @ACuriousMind: What of $E=-mc^2$? Matt Thompson's answer is to claim that negative temperatures are a similar beast of spurious mathematical solutions and have no meaning whatsoever. – Joshua May 22 '16 at 16:20
  • @matt-thompson: you are spot on. In fact, "temperature", as opposed to energy, is only a derived quantity (a derivative of $S$) and nowhere near as fundamental. By looking at non-monotonically growing densities of states it is easy to construct paradoxes, like systems in which heat flows from the colder to the hotter bath regardless of which entropy definition is used; see the authors' follow-up paper – jkds Oct 15 '18 at 06:36
  • For negative temperature, you require a thermal equilibrium in which dS/dU < 0. This can happen, but only in a metastable sense. However, much of equilibrium thermal physics can apply to long-lived metastable equilibria. The concept of negative temperature is consistent with this. (And by the way, if it were true that someone had found a way for heat to flow from a colder to a hotter bath (correctly defined) without entropy increasing elsewhere, then we would all know about it, because they would be rich and our energy problems would be over.) – Andrew Steane Oct 30 '18 at 10:13

For the visually inclined, this article explains it simply. The maximum-hotness state is the middle image, not the right-hand image one might expect:

[image: absolute zero, infinite hot, and beyond infinite hot]

Due to the unintuitive definition of temperature, a sample that contains only high-energy particles is at negative kelvin temperature, i.e. beyond "infinitely hot", and, as is clear from the image, it would give energy to colder particles.

Cees Timmerman

None of the answers above are correct. Matt Thompson's answer is close.

The OP asks for a mathematical proof that

if a negative-temperature system and a positive-temperature system come in contact, heat will flow from the negative- to the positive-temperature system

There is no proof of this statement, because it is incorrect.

In statistical mechanics temperature is defined as \begin{equation} \frac{1}{T} = \frac{\partial S}{\partial E} \end{equation}

i.e. a derivative of $S$. For $\it normal$ systems, like ideal gases, etc. $S(E)$ is a highly convex function of $E$ and there is a 1-to-1 relation between the system's macrostate and its temperature.

However, in cases where $S$ is not a convex function of $E$, $\frac{\partial S}{\partial E}$ can take the same numerical value at different energies $E$, and therefore the same temperature can occur at different energies. In other words, $T$, unlike $E$, does not in general uniquely describe a system's macrostate. This situation occurs in systems that have a negative Boltzmann temperature (detail: for a negative Boltzmann temperature, $S$ needs to be non-monotonic in $E$).
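The standard textbook illustration of a non-monotonic $S(E)$ is $N$ two-level spins. A short Python sketch (with $k_B=1$ and energy counted as the number of excited spins, both conventions assumed for illustration) shows the Boltzmann temperature changing sign:

```python
import math

# Boltzmann entropy of N two-level spins (energy E = number of excited spins,
# k_B = 1): S(E) = ln C(N, E). This S is non-monotonic in E, so the Boltzmann
# temperature, defined by 1/T = dS/dE, changes sign past E = N/2.
N = 100
S = [math.log(math.comb(N, E)) for E in range(N + 1)]
dSdE = [S[E + 1] - S[E] for E in range(N)]  # finite-difference estimate of 1/T

assert dSdE[10] > 0   # low energy: 1/T > 0, positive Boltzmann temperature
assert dSdE[90] < 0   # above E = N/2: 1/T < 0, negative Boltzmann temperature
```

The entropy peaks at $E=N/2$; adding energy beyond that point reduces the number of microstates, which is precisely the negative-temperature regime.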

An isolated system 1 with a negative Boltzmann temperature $T_B<0$ can have either higher or lower internal energy $E_1/N$ than another isolated system, system 2, that it gets coupled to.

Depending on which system has the higher $E_i/N$, $i=1,2$, heat flows either from system 1 to system 2 or vice versa, regardless of the temperatures of the two systems before coupling. For details, see

Below I have attached Fig. 1, taken from the arxiv version of this work to illustrate this fact.


PS

  1. I am not an author of any of the cited papers.

  2. Thermodynamics is compatible with the use of the Gibbs entropy, but not with the Boltzmann entropy. Showing this is a four-line proof; see the Nature Physics paper Consistent thermostatistics forbids negative absolute temperatures. The Gibbs temperature (unlike the Boltzmann temperature) is always positive, $T>0$.

  3. The attempt above by @Nathaniel at a purely thermodynamic proof of the OP's statement relies on the premise that $T<0$ is compatible with thermodynamics. This is not the case (see point 2), so the proof given is invalid.

  4. For normal systems the distinction between Gibbs and Boltzmann temperature is practically irrelevant. The difference becomes drastic, though, when edge cases are considered, e.g. truncated Hamiltonians or systems with non-monotonic densities of states. In fact, in most calculations in statistical mechanics textbooks the Gibbs entropy is used instead of the Boltzmann entropy. Remember calculating "all states up to energy $E$" instead of "all states in an $\epsilon$ shell at energy $E$"? That's all the difference.

  5. There is a whole series of attempts to publish comments on the Nature Physics article by Dunkel and Hilbert, but all got rejected. These all follow the pattern of trying to create a contradiction, but none were able to punch a hole into Dunkel and Hilbert's short mathematical argument.

jkds
  • It is not necessary for $S$ to be nonconvex in order to have a negative temperature. The canonical ensemble for a simple 2-state system has a negative temperature regime, but $S(E)$ is convex in that case. It is surely the case that if you move to the microcanonical ensemble then nonconvexity can make things more complicated, but that's tangential to this question. – N. Virgo Oct 30 '18 at 09:51
  • I had a quick look at the paper just in case, but I didn't change my mind. The proof in my answer really is a mathematical proof: it says that (i) if temperature is defined as $1/T=\frac{\partial S}{\partial E}$, and (ii) if the first and second laws hold, then (iii) heat must always flow from lower $1/T$ to higher $1/T$. If it doesn't, then you're using the wrong ensemble or have made some other mistake; there is no other possibility. Neither non-convexity of the entropy nor non-uniqueness of $E(T)$ can change this. – N. Virgo Oct 30 '18 at 10:13
  • @Nathaniel the research result I quoted here, including a specific example, is precisely that temperature (regardless of which entropy is used) does not allow one to deduce the direction of heat flow. My answer is specific to the OP's question and short, because I did not want to go into all the details. Please see the linked paper and others by the same authors for answers to your questions. – jkds Oct 30 '18 at 11:10
  • Yes, I read the paper, albeit briefly, as I said. They review multiple statistical definitions of the entropy and temperature, and claim that for some of them the temperature doesn't predict the direction of heat flow. But that implies a violation of the second law, so it just means those definitions are not the correct ones for the system in question. I do agree with them that the temperature doesn't uniquely determine the thermodynamic state if the entropy isn't convex, but they seem to say this implies it can't predict the direction of heat flow, which doesn't actually follow at all. – N. Virgo Oct 30 '18 at 14:13
  • @Nathaniel "But that implies a violation of the second law ". Not correct. The second law is discussed in the paper in Sec. V. Of the reviewed entropy definitions only one --the Gibbs entropy-- satisfies the second law strictly. – jkds Nov 02 '18 at 08:42
  • Look, if $T$ is defined via $1/T = \frac{\partial S}{\partial E}$ then for two coupled systems we have $\frac{\partial S}{\partial E_1} = \frac{\partial (S_1+S_2)}{\partial E_1} =\frac{\partial S_1}{\partial E_1} - \frac{\partial S_2}{\partial E_2} = 1/T_1-1/T_2$, and the entropy increases if and only if heat flows from the system with lower $1/T$ to the system with higher $1/T$. This is a really simple, completely incontrovertible consequence of the definition. If your statistical definition of entropy contradicts this then it contradicts the second law, even if you have a Nature paper. – N. Virgo Nov 02 '18 at 09:21
  • Regarding the Nature Physics paper by Dunkel and Hilbert, it's baffling to me that they fail to mention the Gibbs-Shannon or the von Neumann entropy, those being the statistical definitions from which the Boltzmann distribution is derived in the first place. However, it's not surprising to me that the thing they call the Gibbs entropy (which is actually Boltzmann's definition of the entropy) is a better approximation than the thing they call the Boltzmann entropy. So I don't disagree with them on that point. – N. Virgo Nov 02 '18 at 09:44
  • Their argument about negative temperatures is not convincing, however, since it's really just an assertion and not a mathematical argument at all. They say "In particular, such an analysis has to account for the peculiar fact that, when the heat engine is capable of undergoing population inversion, both a hot and cold bath may inject heat into the system" as if that constitutes a criticism, but in fact it's the entire point! – N. Virgo Nov 02 '18 at 09:45
  • Incidentally, for the avoidance of any doubt, the down vote is not from me. – N. Virgo Nov 02 '18 at 09:52
  • @Nathaniel. we could discuss forever, but this is probably not the best place to do so. Just regarding your first remark about heat flow: what Dunkel and Hilbert show in the Nat. Phys. paper is that satisfying $dE=T dS-p dV$ leads directly to the Gibbs entropy and not the Boltzmann entropy. All this takes 12 Eqs., including 8 Eqs. of definitions. As a corollary, $T>0$ in thermostatistics. So if you want to come up with a thermodynamic proof you cannot simply plug in $T<0$. This restriction in TD also follows from being able to switch between $E(S,V,N)$ and $S(E,V,N)$. They must be monotonic. – jkds Nov 02 '18 at 11:27
  • @Nathaniel perhaps you two can discuss this in chat. This is going to be interesting. – Shing Nov 02 '18 at 12:47
  • I don't understand why you keep telling me what their paper says. I have read it. But what you say about it in your last comment is false. It may be true that the relation $dE=TdS-pdV$ is not satisfied by what D&H call "the Boltzmann entropy", and it may also be true that it is also satisfied by what they call "the Gibbs entropy", I am not arguing against any of that. But that relation is surely also satisfied by what we now call the Shannon entropy, so it isn't true that it "leads directly to the Gibbs entropy". – N. Virgo Nov 02 '18 at 13:41
  • More importantly: it is of course true that if you can switch between E(S) and S(E), then the temperature must be always the same sign (positive or negative) because those functions have to be monotonic in order for this to be the case. But for systems with bounded energy levels you can't, and they aren't. There's no principle that says you have to be able to switch between them in that way, and the inability to do so for some systems does not affect the definition or behaviour of temperature or heat flow. – N. Virgo Nov 02 '18 at 13:45
  • @Nathaniel: OK, (i) to get the lingo right: if there are $W$ microstates at energy $E$, all equally probable with $p_i=1/W$, then $S = -k_B\sum_{i=1}^W p_i \log(p_i) = k_B \sum_{i=1}^W \log(W)/W = k_B \log(W)$. So for us physicists the Boltzmann and Shannon entropies are the same thing here, because we use the fundamental postulate of stat. mech. that all microstates are equally probable. Agreed? – jkds Nov 02 '18 at 16:44
  • @Nathaniel: "There's no principle that says you have to be able to switch between them in that way". Let me remind you of the third postulate of thermodynamics: the entropy of a composite system is additive over the constituent subsystems; the entropy is continuous and differentiable and is a monotonically increasing function of the energy (see e.g. Callen, http://cvika.grimoar.cz/callen/callen_01.pdf). So in thermodynamics (no stat. mech. yet) $T$ is positive. The question is then: which entropy definition is consistent with TD? – jkds Nov 02 '18 at 21:25
  • Regarding Shannon vs. Boltzmann entropy, you're right that they're equal under that assumption, but for any given system that assumption may or may not be a correct one. This is another of those things that tends not to matter for large systems (because of the asymptotic equipartition property) but tends to matter rather a lot for small ones. Even if it's satisfied initially, it's unlikely to continue to be satisfied once the system starts to exchange heat with another system. – N. Virgo Nov 03 '18 at 03:47
  • Note also that what you're calling "the Boltzmann entropy" is what D&H call "the Gibbs entropy", and what they call "the Boltzmann entropy" is some kind of ideal gas ansatz. – N. Virgo Nov 03 '18 at 03:48
  • Regarding Callen's postulates, those are a phenomenological description of classical macroscopic thermodynamics - it's not so surprising that they would break down for microscopic systems. What's interesting is that you can keep pretty much all of them except for the statement that the entropy is monotonically increasing. – N. Virgo Nov 03 '18 at 03:50
  • I'm sorry, but none of what you wrote is correct. You write so many wrong things I can't even answer all of them. Gibbs entropy: let $\Omega(E)$ be the sum of all microstates up to energy $E$ starting at 0; then the Gibbs entropy is $S_G=k_B \ln \Omega$. Boltzmann entropy: $S_B=k_B \ln (\Omega'(E)\epsilon)$, where $\epsilon$ is a small energy needed to make the argument of the log dimensionless. – jkds Nov 03 '18 at 08:41
  • Next: thermodynamics is what we want to explain with stat. mechanics. These are not just Callen's postulates, but the foundation of TD. We want stat. mech. so we can explain why your engine works. – jkds Nov 03 '18 at 08:49
  • So on purely TD grounds, if entropy did not increase monotonically with $E$, we could not go from $dE=T dS - p dV - \sum_i a_i dA_i$ to $dS=\frac{1}{T} dE + \frac{p}{T} dV + \sum_i \frac{a_i}{T} dA_i$. This is called the fundamental relation. The first requires $E(S,V,N)$, the second $S(E,V,N)$. So this is the purely thermodynamic reason why entropy needs to increase with $E$. In TD there is no $T<0$. But that also follows from requiring that stat. mech. fulfill the fundamental relation, and that's what Dunkel and Hilbert showed. – jkds Nov 03 '18 at 10:26

Negative temperature: yes, I encountered that once. I seem to recall that it's the state that arises when, say, you have a system of magnetic dipoles in a magnetic field which have arrived at an equilibrium distribution of orientations, and then the magnetic field is suddenly reversed, so that the distribution is momentarily backwards: basically the distribution given by substituting a negative value of T. Other scenarios can probably be thought of, or actually brought into being, that would similarly occasion this notion. I think possibly the answer is that the system is utterly out of thermodynamic equilibrium, whence the 'temperature' is just the variable that formerly was truly a temperature, and is now merely an artifact that gives this non-equilibrium distribution when rudely plugged into the distribution formula. So heat is transferred because you now have a highly excited system, utterly out of equilibrium, impinging upon a system that approximates a heat reservoir. I think there's no question really of accounting for the heat transfer by the usual method used when both temperatures are positive, i.e. of introducing the temperature difference as that which drives the transfer.

And would it even be heat transfer at all if the energy is proceeding from a source utterly out of thermodynamic equilibrium? It's more that the transferred energy is becoming heat, I would say.

  • Just to say, in the spin example the system is not "utterly out of equilibrium". Surprising as it may seem, the situation with spins more "up" than "down" is a metastable equilibrium, because the second derivative of the entropy is negative. This means that after a small fluctuation the system will move back or 'relax' to the negative temperature state, and this is the sense in which we can speak of thermal equilibrium here. – Andrew Steane Oct 30 '18 at 10:20
  • Really!? It's metastable, is it? That's really quite remarkable! I feel a need to look at that more closely. Thank you. – AmbretteOrrisey Oct 30 '18 at 10:23

Here is a simple argument from first principles.

First, please review the following points from statistical-thermodynamics (please excuse me for glossing over some minor details):

  1. Definition of entropy: Entropy is (up to exponentiation) the number of microstates (i.e. microscopic configurations) a system in equilibrium can be in, for a given total energy.

  2. Definition of temperature: When the energy of the system is increased, the entropy changes. In its most fundamental form, temperature is defined by how much the entropy increases when the energy is increased (assuming no other transformation is done on the system). Typically temperature is positive, meaning that the entropy increases when the energy increases. But negative temperature is also allowed, whereby the entropy decreases when the energy increases.
    (For simplicity, I'm actually thinking about the inverse of the temperature, usually denoted by $\beta$, but this isn't important here).
    (Very often people describe temperature as a measure of the average motion energy of the ingredients that the system is composed of, but this is a non-fundamental description, and it doesn't apply well to negative temperature situations)

  3. Second law of thermodynamics: When two subsystems are brought into thermal contact, energy will flow between them. This is heat. As the energy of each subsystem changes, the number of accessible microstates of each subsystem changes as well. Heat will keep flowing until the number of microstates of the combined system is the maximum possible with the current total energy. This is equilibrium. At this point, there will be no more net exchange of energy (at long time scales). This is statistically very reasonable, because now the number of accessible microstates vastly outnumbers the number of microstates for any other distribution of energy between the two subsystems, and so the probability that the combined system will move to a microstate whose energies correspond to a different entropy, and keep doing so, is ridiculously tiny. In other words, the system will equilibrate at a configuration that maximizes the entropy.

Now to the argument.

Let's assume that system $P$ and system $N$ are brought into thermal contact. System $P$ has positive temperature, and system $N$ has negative temperature. Let's try to figure out the direction of the heat flow. Let's first guess that heat flows from system $P$ to system $N$. In this case, the energy of system $P$ decreases, and therefore its entropy decreases as well (since it has positive temperature). The energy of system $N$ increases, and so its entropy decreases as well (since it has a negative temperature). In total, we get a decrease in entropy. This contradicts the second law of thermodynamics. The conclusion is that heat must flow from system $N$ to system $P$, i.e. heat will flow from a negative-temperature system to a positive-temperature system.
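The same bookkeeping can be written out numerically. In this sketch the entropy functions are hypothetical linear stand-ins, chosen only to have the right signs of $\partial S/\partial E$:

```python
# Numerical version of the bookkeeping above (hypothetical entropy curves, k_B = 1):
# system P has dS/dE > 0 (positive T), system N has dS/dE < 0 (negative T).
def S_P(E):  # entropy of the positive-temperature system, increasing in E
    return 2.0 * E

def S_N(E):  # entropy of the negative-temperature system, decreasing in E
    return -0.5 * E

E_P, E_N, q = 10.0, 10.0, 1.0  # initial energies and a small heat packet

# Guess 1: heat flows P -> N. Total entropy change:
dS_PtoN = (S_P(E_P - q) - S_P(E_P)) + (S_N(E_N + q) - S_N(E_N))
# Guess 2: heat flows N -> P. Total entropy change:
dS_NtoP = (S_P(E_P + q) - S_P(E_P)) + (S_N(E_N - q) - S_N(E_N))

assert dS_PtoN < 0  # forbidden by the second law: both entropies drop
assert dS_NtoP > 0  # allowed: heat flows from negative-T to positive-T
print(dS_PtoN, dS_NtoP)
```

Any pair of curves with those derivative signs gives the same conclusion; the linear form is just the simplest choice.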

Lior
  • 3,309