10

How would you prove that $1/T$ is the most suitable integrating factor to transform $\delta Q$ to an exact differential in the second law of thermodynamics: $$dS = \frac{\delta Q}{T}$$

where $dS$ is the infinitesimal change in entropy, $\delta Q$ is the infinitesimal heat added, and $T$ is the equilibrium temperature.

Qmechanic
  • 201,751
Ana S. H.
  • 1,343

6 Answers

5

There's a group I call "thermodynamic purists" who think that thermodynamics is a self-contained system based on semi-mathematical "axioms". I disagree! I think that thermodynamics is fundamentally a consequence of statistical mechanics, and that this is the best way to think about it and understand it. I acknowledge that reasonable people can differ on this, but anyway, here is my non-thermodynamic-purist way to address your question. (Someone else can give a thermodynamic-purist answer, which I bet is what you're looking for.)

What is heat $\delta Q$? It's energy flow (in the form of microscopic kinetic energy etc.). What is entropy change $dS$? It's the increase in disorder (more precisely, the increase in $k_B \log(\text{number of microstates})$). What is temperature $T$? It's a parameter describing how unlikely it is that a high-energy microstate will occur (Boltzmann distribution etc.).

Then the appropriate question is: Why (according to these definitions) does $T dS = \delta Q$? It's a good question without a very obvious answer. This was discussed in a recent question, see here.
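To make these statistical definitions concrete, here is a minimal numerical sketch (not part of the answer; the two-level system, the level spacing $\epsilon$, and all parameter values are illustrative assumptions): count the microstates of $N$ independent two-level units, set $S = k_B \ln \Omega$, and read $T$ off from $1/T = dS/dU$. The resulting $T$ indeed parameterizes the Boltzmann occupation ratio.

```python
# Minimal sketch (illustrative assumptions): N independent two-level units
# with level spacing eps; the energy is U = n*eps when n units are excited.
import math

k_B = 1.380649e-23   # J/K
eps = 1.0e-21        # J, assumed level spacing
N = 10_000           # number of two-level units

def S(n):
    """Boltzmann entropy k_B * ln(Omega), with Omega = C(N, n) microstates."""
    return k_B * math.log(math.comb(N, n))

n = 2_000                      # 20% of units excited
T = eps / (S(n + 1) - S(n))    # finite-difference 1/T = dS/dU
print(f"T ~ {T:.1f} K")

# The same T governs the Boltzmann distribution: the occupation ratio
# n/(N-n) should match exp(-eps/(k_B*T)).
print(n / (N - n), math.exp(-eps / (k_B * T)))
```

For $n/N = 0.2$ the finite-difference temperature reproduces the Boltzmann factor $e^{-\epsilon/k_B T} = n/(N-n)$ to good accuracy.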

Steve Byrnes
  • 16,612
  • Pure thermodynamics is underappreciated. I don't see it as a mere consequence of statistical physics, but rather see the more detailed theory as a sort of model of it. In any case, if you come from the hardcore statistical side, then temperature $(\partial S/\partial E)^{-1}$ is a definition and not a mystery. Here, to me, heat flow is just energy flow which is not due to volumetric change and the like. – Nikolaj-K Apr 26 '13 at 12:09
  • You can of course take this path and define a parameter, suggestively named $T$, via the derivative of the logarithm of the number of microstates with respect to the energy. But then you still owe us a very general explanation of why that thing is identical to what we in ordinary life call temperature... – Markus Deserno May 13 '13 at 11:13
  • Markus, yes the "very general explanation" you're looking for does exist. It can be found in statistical mechanics textbooks---for example, Kittel & Kroemer. – Steve Byrnes Nov 19 '18 at 15:08
  • While I agree with you that the statistical picture is better than the one of the purist, the issue often is that undergraduates learn thermodynamics before learning statistical physics. This leads to many misconceptions that are hard to clarify without introducing advanced topics. – Mauricio Apr 22 '23 at 10:59
5

I apologize, but "basic foundations of thermodynamics" still does not make a lot of sense to me.

Steve B already provided an answer associated with one way to interpret the word "foundation", namely from statistical mechanics. I will play devil's advocate here and assume that you are referring to axiomatic thermodynamics.

As far as I am concerned, the foundations of axiomatic thermodynamics are simply experimental facts, but there are many ways to choose the set of fundamental axioms from which to derive all the other results of thermodynamics, and this choice is related to the question you ask.

The second principle of thermodynamics tells you that there is a quantity called entropy, labeled $S$, that depends only on the state of the system. It also says that for an isolated system evolving spontaneously towards thermodynamic equilibrium, the entropy can only increase.

Let us consider for a minute the case of an isolated system that comprises two identical sub-systems 1 and 2 with different internal energies $U_1^0$ and $U_2^0$ such that, say, $U_1^0 > U_2^0$.

Once these sub-systems are put in thermal contact, the whole system is out of equilibrium and there should be a heat flux from sub-system 1 to sub-system 2.

Let us describe it more formally:

The total entropy of the system at any moment (if we imagine the heat transfer to be slow enough) is given by:

$S_{tot}(t) = S_1(t)+S_2(t) = S(U_1^t,V,N)+S(U_2^t,V,N)$

Note that I used the same function $S$ for both systems since, as said previously, the two sub-systems are identical. However, they are not in the same state, and therefore their entropies differ.

Let us assume that over a time interval $\delta t$, $U_1^t \rightarrow U_1^t + \delta U_1$ and $U_2^t \rightarrow U_2^t + \delta U_2$; then the total entropy has to change by an amount:

$\delta S_{tot}(t) = S_{tot}(t+\delta t) - S_{tot}(t) = \left(\frac{\partial S}{\partial U_1}\right)_{V,N}\delta U_1 + \left(\frac{\partial S}{\partial U_2}\right)_{V,N}\delta U_2$

Note also that, because entropy is extensive, $\delta S_{tot} = \delta S_1 + \delta S_2$.

By identification (this step may not be very rigorous) we then have that:

$\delta S_i = \left(\frac{\partial S}{\partial U_i}\right)_{V,N}\delta U_i$

Since the sub-systems are only in thermal contact, there is no work exchanged between them, and the first principle of thermodynamics then says that $\delta U_i = \delta Q_i$; we thus have:

$\delta S_i = \left(\frac{\partial S}{\partial U_i}\right)_{V,N}\delta Q_i$

Finally, to evaluate $\left(\frac{\partial S}{\partial U_i}\right)_{V,N}$, you can look at two cases:

  • At time $t$ the system is at equilibrium, and therefore $\delta S_{tot}=0$ implies that $\left(\frac{\partial S}{\partial U_1}\right)_{V,N}=\left(\frac{\partial S}{\partial U_2}\right)_{V,N}$; it thus corresponds to an intensive variable that has to "thermalize" between systems at thermal equilibrium... it has to be related to the temperature.

  • At time $t$ the system is still out of equilibrium and $\delta S_{tot} > 0$ implies that $\left(\frac{\partial S}{\partial U_1}\right)_{V,N} < \left(\frac{\partial S}{\partial U_2}\right)_{V,N}$

Since the subsystems are identical, this implies that the higher the temperature of a system, the smaller this intensive variable; the simplest quantity that does the job is $1/T$.

This derivation does not exclude more complicated, always-decreasing functions of $T$, and I do not know if they can be completely excluded at this stage anyway... I guess one could pin it down to $1/T$ by using Schwarz's theorem on the equality of mixed partial derivatives, varying then the volume or the number of particles, but I am too lazy for that now, sorry.
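As a quick numerical illustration of the argument above (a minimal sketch, not part of the original answer, assuming a monatomic-ideal-gas-like entropy $S(U) = \frac{3}{2} N k_B \ln U + \text{const}$ per subsystem, for which $(\partial S/\partial U)_{V,N} = 1/T$ is indeed decreasing in $T$), one can check that a small heat flow from the hotter to the colder subsystem raises the total entropy:

```python
# Sketch with an assumed entropy S(U) = (3/2) N k_B ln(U) + const per
# subsystem, for which (dS/dU)_{V,N} = 3 N k_B / (2U) = 1/T.
k_B = 1.380649e-23    # J/K
N = 1e20              # particles per subsystem (illustrative)

def dS_dU(U):
    return 1.5 * N * k_B / U    # the intensive variable (dS/dU)_{V,N}

U1, U2 = 2.0, 1.0     # J; subsystem 1 is hotter (U1 > U2)
dQ = 1e-6             # J of heat flowing from 1 to 2

# dS_tot = (dS/dU1)*dU1 + (dS/dU2)*dU2 with dU1 = -dQ, dU2 = +dQ
dS_tot = dS_dU(U1) * (-dQ) + dS_dU(U2) * (+dQ)
print(dS_tot > 0)     # True: the hotter system has the smaller dS/dU,
                      # so heat flowing hot -> cold increases S_tot
```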

gatsu
  • 7,202
3

One way to derive the existence of an entropy function and the expression $dS=\frac{dQ}{T}$ (the traditional one, due to Clausius) for a two-dimensional thermodynamic system, where the thermodynamic state is defined by two variables (like a simple uniform fluid with state defined by $T,V$), is to consider the following process. The system undergoes an arbitrary cyclic process $\gamma$ in which the state changes quasistatically, so the system is at some equilibrium state at all times. Its temperature changes and is always the same as the temperature of the heat reservoir it is connected to.

This cyclic process in 2D space can be replaced by a set of infinitesimal Carnot cycles, which tile the region of the space of states enclosed by the cyclic path. Let the temperature $T$ be measured by an ideal gas thermometer (obeying $PV=nRT$).

Since the infinitesimal cycles are Carnot cycles, they obey $$ \frac{dQ_1}{T_1} + \frac{dQ_2}{T_2} = 0 $$ for heat $dQ_1$ accepted from the reservoir at $T_1$ and heat $dQ_2$ accepted from the reservoir at $T_2$. As one goes from one infinitesimal Carnot cycle to another, $T_1$ and $T_2$ change.

When the left-hand side of the last equation is summed over all infinitesimal cycles tiling the region, the contributions from all pairs of isotherms that coincide in the space of states cancel each other, because they have opposite sign of heat $dQ$ and the same $T$. Only the isotherms near the region boundary remain uncancelled and contribute to the sum. We thus obtain a quantity which is a sum of terms $dQ/T$ over all states in the great cyclic process $\gamma$. Using a real number $s$ parameterizing the closed path, we have

$$ \oint_\gamma\frac{dQ/ds}{T(s)}ds = 0. $$ Since this equation is valid for all closed paths $\gamma$, the expression

$$ \int_i^f \frac{dQ}{T} $$ is a function of the states $i$,$f$ only. We can thus define entropy of the state $f$ as

$$ S(f) = S(i) + \int_i^f\frac{dQ}{T}. $$

Thus, using Carnot's results for a set of infinitesimal Carnot cycles tiling the real cycle, we have derived the existence of a function of state $S$ such that $dS=dQ/T$.
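As a sanity check of this result (a minimal sketch, not from the answer, assuming one mole of monatomic ideal gas with $dQ = \frac{3}{2} nR\, dT + \frac{nRT}{V} dV$ and arbitrary corner values), one can evaluate $\oint dQ$ and $\oint dQ/T$ analytically around a rectangular cycle in the $(T,V)$ plane; only the latter vanishes:

```python
# Sketch: one mole of monatomic ideal gas taken around a rectangle in (T, V)
# space, legs (T1,V1)->(T2,V1)->(T2,V2)->(T1,V2)->(T1,V1).
import math

R = 8.314            # J/(mol K)
n = 1.0
T1, T2 = 300.0, 400.0
V1, V2 = 1.0, 2.0    # illustrative corner values

# dQ = (3/2) n R dT + (n R T / V) dV, integrated leg by leg:
Q  = (1.5*n*R*(T2 - T1) + n*R*T2*math.log(V2/V1)
      + 1.5*n*R*(T1 - T2) + n*R*T1*math.log(V1/V2))
# dQ/T = (3/2) n R dT/T + n R dV/V, integrated leg by leg:
QT = (1.5*n*R*math.log(T2/T1) + n*R*math.log(V2/V1)
      + 1.5*n*R*math.log(T1/T2) + n*R*math.log(V1/V2))

print(Q)    # n R (T2 - T1) ln(V2/V1) != 0 : dQ is not an exact differential
print(QT)   # 0 (up to rounding)           : dQ/T is exact
```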

It can be shown mathematically that for 2D systems, assuming $U$ is a function of two variables $x,y$, and $dW$ is an expression of the form $A(x,y)dx + B(x,y)dy$, there always exists an integrating divisor function $D(x,y)$ for the expression

$$ dU - dW $$ that makes the expression $$ \frac{dU -dW}{D} $$ the total differential of some function of $x,y$. This is a mathematical theorem, valid irrespective of the 2nd law. This means the last expression defines a function $S(x,y)$. The fact that $D$ is the gas temperature (or absolute temperature) can't be proven from general mathematics alone, but follows from the 2nd law and properties of the ideal gas (or the absolute temperature).

Deriving the existence of entropy for systems of dimension 3 or higher in this way is more difficult, because it is not clear that any closed path in such a space of states can be tiled using infinitesimal Carnot cycles. Also, in 3 or more dimensions the mathematical theorem on the existence of an integrating divisor does not hold; there are functions $U$ and expressions $dW$ for which no integrating divisor exists. Saying that entropy exists for all thermodynamic systems means that all these systems have to be described by special functions $U$ and $dW$ that do have an integrating divisor. It is believed that the 2nd law of thermodynamics implies the existence of entropy for all thermodynamic systems.
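For reference, here is the standard mathematical statement behind the last paragraph (an addition of mine, not part of the answer): in three or more variables, a differential form $\omega$ admits an integrating divisor only if it satisfies the Frobenius integrability condition $\omega \wedge d\omega = 0$. A classic form that fails the test is

$$ \omega = dz - y\,dx, \qquad d\omega = dx \wedge dy, \qquad \omega \wedge d\omega = dz \wedge dx \wedge dy \neq 0, $$

so no function $D(x,y,z)$ can make $\omega/D$ a total differential. In two variables, $\omega \wedge d\omega$ is a 3-form and vanishes identically, which is why the integrating divisor always exists there.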

3

Yes, it can be proved. But the mathematical rigour below only applies to the ideal gas, so you might feel disillusioned after all the calculations, considering how limited its validity is in reality.

If you insist you want to know, then you may read on. From the first law of thermodynamics,

$$\delta Q = \frac{3}2 nRdT + \frac{nRT}V dV,$$

where the internal energy $U=\frac{3}2 nRT$ comes from the kinetic theory of gases and the pressure $P=nRT/V$ from the ideal gas equation. The differential form is exact if and only if $$\frac{\partial\left(\frac{3}2 nR\right)}{\partial V} = \frac{\partial\left(\frac{nRT}V\right)}{\partial T},$$

which is not the case here; that is why $\delta Q$ is an inexact differential. If we now multiply $\delta Q$ by a function $f(T)$ to make it exact, then $f(T)$ is called an integrating factor (in hindsight, we can only say that an integrating factor exists if $f$ is a function of $T$ alone). Writing $f(T)$ as $f$ for brevity, it must be true that

$$\frac{\partial\left(\frac{3}2 nRf\right)}{\partial V} = \frac{\partial\left(\frac{nRT}V f\right)}{\partial T}$$

The LHS of this equation is zero, since $f$ depends on $T$ alone, so

$$ 0 = \frac{n R}{V} f + \frac{n R}{V} T f' $$

where $f'=df/dT$. The constants can be eliminated; what remains is just a first-order differential equation, which you can easily solve to get $f=1/T$ (up to a multiplicative constant).
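For readers who want to see it done symbolically, here is a small sketch (mine, not part of the answer, using sympy) that solves the exactness condition $f + T f' = 0$ derived above:

```python
# Solve f(T) + T f'(T) = 0, the exactness condition after cancelling nR/V.
import sympy as sp

T = sp.symbols('T', positive=True)
f = sp.Function('f')
sol = sp.dsolve(sp.Eq(f(T) + T * f(T).diff(T), 0), f(T))
print(sol)   # Eq(f(T), C1/T): the integrating factor is 1/T up to a constant
```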

If you want the most general version of the abstract proof you can look into this article, but I doubt you will gain any further insight from it. Just my two cents.

Neoh
  • 348
  • You could probably do the same thing, just positing a general equation of state and thermal equation of state, and find the restrictions. After all, the formula for entropy is general, the first law is general, and so the result had better not be equation-of-state-dependent. – Zo the Relativist Apr 25 '13 at 17:18
  • Perhaps it can be expressed in a more general form, but I have never heard that something like $dS=dQ/T$ can be proven without referring to the first law. Yes, entropy may be alternatively defined in terms of microstates, but that would be off topic. – Neoh Apr 25 '13 at 17:33
  • Sure, you'll have to refer to the first law (which is very, very general), but you won't have to refer to the ideal gas law (which is not at all general). – Zo the Relativist Apr 25 '13 at 17:52
  • I'd love to see if there is a proof without referring to the ideal gas too. – Neoh Apr 25 '13 at 17:55
  • @Neoh Thank you for your answer and for the article, I'll check it. – Ana S. H. Apr 26 '13 at 00:10
2

Another approach is to $\mathbf{define}$ temperature to be what you've just asked it to be: that is almost the approach that pops out of Carnot's original arguments about the efficiency of heat engines and his formulation of the second law. One can rework the following idea, which defines temperature in terms of efficiency, into defining it as a function of the scale factor one needs to multiply $\delta Q$ by to make it an exact differential.

The ideas in question, as with many things, are beautifully explained in the chapter "The Laws of Thermodynamics" (Ch 44) in the first volume of the Feynman lectures.

As soon as you realise that all reversible heat engines working between the same two "big" (see footnote) heat reservoirs at equilibrium have the same efficiency, $i.e.$ that the proportions of heat drawn from the "hotter" reservoir, heat dumped to the "colder" one, and work derived are always the same if the engine is reversible (regardless of the engine's workings), you have proven that this universal, uniquely defined efficiency is a way to compare all heat reservoirs at equilibrium. Thus: we take a particular reservoir as a standard and call its temperature unity. Then, if we run a reversible heat engine between a hotter reservoir and this one, and three units of heat are taken from the hotter one for each unit of heat dumped to our standard reservoir (thus producing two units of work), we shall call the temperature of the hotter one 3 units, by definition. Likewise, if we run a reversible heat engine between our standard reservoir and a colder one and find that our reservoir delivers three units of heat to the engine for every unit dumped to the colder reservoir, then the colder one is by definition at a temperature of $\frac{1}{3}$ units.

From this definition, it then follows that $\frac{\delta Q}{T}$ is an exact differential, because $\int_a^b \frac{\delta Q}{T}$ between positions $a$ and $b$ in phase space must be independent of path (otherwise one could violate the second law). What I've just said is not altogether obvious: you'll have to look at Feynman for the details. The Wikipedia entry for "temperature" also does a fair job of explaining the same ideas, in the section "Second law of thermodynamics".

footnote: $i.e.$ big enough that finite heat flows to and from them do not change their macrostate appreciably.
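To spell out the bookkeeping in this definition, here is a toy sketch (mine, not Feynman's; the function name and the unit-temperature standard are illustrative): for a reversible engine, the ratio of heats exchanged at two reservoirs equals the ratio of their temperatures, so heats measured against a standard reservoir at temperature 1 assign a temperature to any other reservoir.

```python
# Toy sketch: define temperature via heats exchanged by a reversible engine
# run between the reservoir being measured and a standard reservoir (T = 1).
T_STANDARD = 1.0

def temperature(Q_reservoir, Q_standard):
    """Temperature assigned to a reservoir, given the heat Q_reservoir it
    exchanges with a reversible engine per heat Q_standard exchanged at the
    standard reservoir: Q_reservoir / Q_standard = T / T_STANDARD."""
    return T_STANDARD * Q_reservoir / Q_standard

print(temperature(3.0, 1.0))   # the hotter reservoir of the example: T = 3
print(temperature(1.0, 3.0))   # the colder reservoir: T = 1/3
```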

1

In the statistical mechanics approach, this is just the definition of the temperature. We first define $S$ (based on the "number" of microstates) and then define $T$ as:

$$\frac{1}{T}=\left(\frac{\partial S}{\partial U}\right)_V$$

At constant $V$, the only possible energy exchange is heat, so $dU=\delta Q$. There is nothing to prove, since it is a definition. We know that $S$ is a state function from the start. All we can do is relate this $T$ to the intuitive notion of temperature. From this definition of $T$, we can easily prove that heat flows from hotter to colder bodies.
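As a small worked example of this definitional route (mine, not part of the answer; the entropy function is the fixed-$V$ part of the monatomic ideal gas entropy, with constants dropped since they do not affect the derivative):

```python
# Sketch: start from S(U) = (3/2) N k_B ln(U) + const at fixed V and read off
# T from 1/T = (dS/dU)_V.
import sympy as sp

U, N, k_B = sp.symbols('U N k_B', positive=True)
S = sp.Rational(3, 2) * N * k_B * sp.log(U)   # constants drop out of dS/dU
T = 1 / sp.diff(S, U)
print(sp.simplify(T))   # 2*U/(3*N*k_B), i.e. U = (3/2) N k_B T (equipartition)
```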

In a "thermodynamical purist" approach, this is strangely more difficult to do rigorously. It is what Clausius essentially proved in this acclaimed theorem. He proved $\frac{\delta Q_{rev}}{T}$ is an exact differential. But this is hard to read and not fully rigorous by modern standards. I've read several attempts to rewrite Clausius theorem with a more modern approach. Here is an example: https://jfoadi.me.uk/stat_therm.html. A possibility is to define temperature through the efficiency of heat engines as explained in @Selene Routley's answer. The sketch of the proof is:

  • assume the second law as the Kelvin-Planck or Clausius statement (they are equivalent); the definition of temperature is not required, only the notions of hotter / colder
  • define $T$ via the efficiency of Carnot engines
  • prove that the second law implies Clausius' theorem (stated just below this list)
  • thus $\frac{\delta Q_{rev}}{T}$ is an exact differential
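For reference, the theorem mentioned in the sketch is the standard statement (a textbook fact, not tied to any one of the linked sources): for any thermodynamic cycle,

$$ \oint \frac{\delta Q}{T} \leq 0, $$

with equality if and only if the cycle is reversible; the equality case makes $\int_i^f \frac{\delta Q_{rev}}{T}$ path-independent, which is exactly what is needed to define the state function $S$.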

This sketch is not "obvious" at all and requires quite a bit of thought. We can only form an opinion about the rigour and consistency of these approaches by looking at them closely.

The difficulty is essentially about defining or characterizing temperature properly. A thermodynamically purist approach with extreme rigour is given here: https://arxiv.org/pdf/cond-mat/9708200.pdf. It is a global approach that considers the entropy of all systems as a whole and relies heavily on the extensiveness of entropy. But still, the definition of entropy comes first, and temperature is then defined from entropy.

While it seems that temperature is a much more intuitive and tangible concept than entropy, the definition of temperature is at the centre of this problem.

Benoit
  • 551