22

What is the most fundamental definition of temperature? Is it the one concerning average energy, the number of microstates, or something else?

By "fundamental" I mean one that applies in cases as general as black-hole temperature, the radiation seen in an accelerated frame, and so on.

Qmechanic
  • 201,751
anonymous67
  • 1,493

2 Answers

24

It's the differential relationship between internal energy and entropy: \begin{align} dU &= T\,dS + \cdots \\ \frac{\partial S}{\partial U} &= \frac 1T \end{align} As energy is added to a system, its internal entropy changes. Remember that the (total) entropy is $$ S = k \ln\Omega, $$ where $\Omega$ is the number of available microscopic states that the system has. The second law of thermodynamics is simply probabilistic: entropy tends to increase simply because there are more ways to have a high-entropy system than a low-entropy system. The logarithm matters here. If you double the entropy of a system (by, say, combining two similar but previously-isolated volumes of gas) you have squared $\Omega$.
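As a toy numerical check (the microstate count below is a made-up number, not data), the relation $S = k \ln\Omega$ makes entropy additive when microstate counts multiply, so doubling $S$ squares $\Omega$:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega: float) -> float:
    """Boltzmann entropy S = k * ln(Omega)."""
    return k * math.log(omega)

# Two similar but previously isolated volumes, each with Omega microstates
# (hypothetical value, chosen only for illustration).
omega = 1e20
s_one = entropy(omega)

# Combining them: microstate counts multiply, so entropies add.
s_combined = entropy(omega * omega)

# Doubling the entropy corresponds to squaring Omega.
assert math.isclose(s_combined, 2 * s_one)
```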

Consider two systems with different $U,S,T$ that are in contact with each other. One of them has small $\partial S/\partial U$: a little change in internal energy causes a little change in entropy. The other has a larger $\partial S/\partial U$, and so the same change in energy causes a bigger change in entropy. Because they're in contact with each other, random fluctuations will carry tiny amounts of energy $dU$ from one system to the other. But because of the internal differences that lead to different numbers of internal states, it becomes overwhelmingly more likely that energy will flow from the system with small $\partial S/\partial U$ (reducing its entropy by a little) and into the system with larger $\partial S/\partial U$ (increasing its entropy by a lot). So we call the first one "hot" and the second one "cold."
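The argument above can be checked numerically in a toy model. Assuming an ideal-gas-like entropy $S(U) = C\ln U$ (so $\partial S/\partial U = C/U$; the constants below are hypothetical), total entropy rises only when energy flows from the small-$\partial S/\partial U$ ("hot") system to the large-$\partial S/\partial U$ ("cold") one:

```python
import math

# Toy model: entropy S(U) = C * ln(U), so dS/dU = C/U (units suppressed).
def entropy(C: float, U: float) -> float:
    return C * math.log(U)

C_hot, U_hot = 1.0, 100.0    # small dS/dU = 0.01 -> "hot"
C_cold, U_cold = 1.0, 10.0   # large dS/dU = 0.1  -> "cold"

dU = 0.001  # a tiny random fluctuation of energy

# Total entropy change if energy flows hot -> cold:
dS_hot_to_cold = (entropy(C_hot, U_hot - dU) - entropy(C_hot, U_hot)
                  + entropy(C_cold, U_cold + dU) - entropy(C_cold, U_cold))

# Total entropy change if energy flows cold -> hot:
dS_cold_to_hot = (entropy(C_hot, U_hot + dU) - entropy(C_hot, U_hot)
                  + entropy(C_cold, U_cold - dU) - entropy(C_cold, U_cold))

# Only the hot -> cold direction raises total entropy,
# which is why that flow is overwhelmingly more likely.
assert dS_hot_to_cold > 0 > dS_cold_to_hot
```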

This definition even extends to the case where entropy decreases as energy is added, in which case the absolute temperature is negative. It also explains why those negative temperatures are "hotter" than ordinary positive temperatures: in that case adding energy to the positive-temperature system increases its entropy, and removing energy from the negative-temperature system also increases its entropy.
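A standard example of negative temperature (a sketch, with toy units) is a collection of $N$ two-level spins: with $n$ spins excited, $\Omega = \binom{N}{n}$, and $\partial S/\partial U$ changes sign once more than half the spins are excited:

```python
import math

k = 1.0  # Boltzmann constant in toy units

def entropy(N: int, n: int) -> float:
    """S = k * ln(Omega) for n of N two-level spins excited; Omega = C(N, n)."""
    return k * math.log(math.comb(N, n))

N = 1000
# Approximate dS/dU by a one-spin-flip difference (energy in units of one flip).
low = entropy(N, 101) - entropy(N, 100)   # low energy: adding energy raises S
high = entropy(N, 901) - entropy(N, 900)  # high energy: adding energy lowers S

assert low > 0   # dS/dU > 0: positive temperature
assert high < 0  # dS/dU < 0: negative temperature
```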

rob
  • 89,569
  • It seems like the process "adding energy to the positive-temperature system increases its entropy, and removing energy from the negative-temperature system also increases its entropy" will go on until the negative-temperature system runs out of energy? – anonymous67 Jul 07 '14 at 06:56
  • 1
    No, because $\partial S/\partial U$ changes with internal energy. The one-way flow of energy goes on until both systems have the same $\partial S/\partial U$, at which point energy flows in both directions are equally likely and we say that the two systems are at the same temperature. – rob Jul 07 '14 at 12:38
  • "But because of the internal differences that lead to different numbers of internal states, it becomes overwhelmingly more likely that energy will flow from the system with small $\partial S/\partial U$ (reducing its entropy by a little) and into the system with larger $\partial S/\partial U$". Can you explain this? I don't get how the conclusion follows from the premise here. I mean, I know energy flows from hotter systems to colder ones, but I don't get how this follows from systems having different entropies. – gardenhead Feb 28 '20 at 15:04
  • @gardenhead I don't know if I can do better in a comment. Ask a follow-up question that stands alone and I'll give it a shot. You can link your question to this one so that the context is clear. – rob Feb 28 '20 at 15:22
  • @rob - Question of curiosity: Is this totally general? That is, would this still apply for a non-equilibrium, non-thermodynamic system? – honeste_vivere Jul 29 '20 at 14:32
  • 1
    @honeste_vivere I don't know as much as I would like to know about systems where the statistical assumptions that underlie thermodynamics are invalid. I think that might make a good follow-up question. – rob Jul 29 '20 at 14:41
-3

There is no mathematically exact (infinite number of decimal places) definition of temperature. Statistically speaking, temperature is the parameter of the Boltzmann distribution: if you pick a particle of gas from a thermalized volume (itself a loosely defined notion), then at temperature $T$ the probability for it to have energy $E$ is proportional to $e^{-E/kT}$, where $k$ is the Boltzmann constant. So, in an experiment, you can start picking gas molecules, measure the energy of each, create a histogram (energy on the $x$ axis, number of occurrences on the $y$ axis), and fit it with the function $e^{-E/kT}$, treating $T$ as a free parameter. The value $T_{opt}$ giving the optimal fit is the temperature.

However, temperature is not a number "written in the sky": gas particles do not gain their energy by obeying some objectively existing distribution; they gain it in chaotic microscopic collisions, whose output (we believe) should match the Boltzmann distribution with high precision. There is no "objective" temperature and no way to make it objective. If you adopt the statistical approach, then in the experiment I described the number $T_{opt}$ is, strictly speaking, the most probable value of the temperature; the "true" temperature could be completely different, though with very small probability. There is a statistical error (the sigma of the distribution) attached to the most probable value. In systems of standard size (one mole of particles) this error sits somewhere around the 13th decimal place; decimal places beyond that cannot be defined.
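The fitting procedure described above can be sketched in Python. This is a toy version (the "true" temperature and toy units are assumptions for illustration): for an exponential energy distribution $P(E) \propto e^{-E/kT}$, the maximum-likelihood fit is simply $T_{opt} = \langle E \rangle / k$, and the estimate scatters around the true value with a finite statistical error:

```python
import random
import statistics

k = 1.0         # Boltzmann constant in toy units
T_true = 300.0  # hypothetical "true" temperature of the gas

random.seed(42)
# Energies drawn from P(E) ~ exp(-E/kT): an exponential distribution
# with mean k*T, standing in for "picking gas molecules one by one".
energies = [random.expovariate(1.0 / (k * T_true)) for _ in range(100_000)]

# Maximum-likelihood fit of T for an exponential: T_opt = <E> / k.
T_opt = statistics.fmean(energies) / k

# T_opt is only the most probable value; it carries a statistical
# error of order T / sqrt(N) (~1 here), so it is close to but not
# exactly equal to T_true.
assert abs(T_opt - T_true) < 5.0
```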

EDIT: I somehow forgot to make a point about the "generality" of my answer. Whatever is supposed to have a temperature (a black hole, the vacuum as seen from an accelerated frame, ... anything, as long as it behaves as a black body) simply has to have an associated (black-body) radiation, i.e. an outgoing thermalized gas of photons, to which the definition I propose can be applied (the photon energies follow the black-body distribution).

F. Jatpil
  • 358