
In thermodynamics, entropy is defined as $\mathrm{d} S = \dfrac{\delta q_{\rm rev}}{T}$. This definition guarantees that heat transfers from hot to cold, which is the second law of thermodynamics. But why do we define entropy as $\dfrac{\delta q_{\rm rev}}{T}$ rather than as $\dfrac{\delta q_{\rm rev}}{T^2}$, $\dfrac{\delta q_{\rm rev}}{e^T}$, or something else?

Is there an intuitive explanation for this $\dfrac{\delta q_{\rm rev}}{T}$?

maple

3 Answers


The very short and definitive answer is that this is how thermodynamic temperature is defined.

It goes back to the very first formulations of the second law of thermodynamics by Carnot and Clausius, to wit, that it is impossible to build a perpetual motion machine of the second kind or "heat can never pass spontaneously from a colder to a warmer body", and the implications of this law for the efficiencies of heat engines. A perpetual motion machine of the second kind is one whose state undergoes a cycle in phase space and, on returning to its beginning state, has pumped heat from a colder to a hotter body without any input of work.

The Wikipedia page on Temperature under the heading "Second law of Thermodynamics" gives a reasonable summary of these ideas; "The Laws of Thermodynamics" (Chapter 44) in the first volume of the Feynman Lectures on Physics is a much fuller exposition.

It all comes down to the efficiencies of reversible heat engines, which, in Carnot's conception, work either by (i) drawing heat from a hotter ("higher temperature", not yet well defined) reservoir and dumping some of it to another, cooler ("lower temperature", not yet well defined) reservoir whilst outputting the difference as useful work, or (ii) working in the inverse way, taking in mechanical work to pump heat from the cooler to the hotter body. A "reservoir" here is a body that is so big that any amount of heat added to or taken from it does not appreciably change its macrostate.

Now run a thought experiment whereby the work output of one reversible heat engine, taking heat from the hot reservoir and dumping it to the cold reservoir, is used to drive another reversible engine pumping heat in the opposite direction. After a little work with this idea, it readily follows that the efficiencies of the two reversible heat engines must be the same: if one efficiency were greater than the other, we could use the greater-efficiency engine as the heat pump and violate the Carnot / Clausius statement of the second law. So we now have Carnot's theorem:

The efficiencies of all reversible heat engines working between the same two reservoirs must all be the same, and they depend only on those reservoirs and not on the internal workings of the heat engines

Once you understand this, you have a way of comparing different reservoirs from the point of view of ideal heat engines. Namely, we take a particular reservoir as a standard and call its temperature unity, by definition. Then, if we run a reversible heat engine between a hotter reservoir and this one, and $t$ units of heat are taken from the hotter one for each unit of heat dumped to our standard reservoir (thus producing $t-1$ units of work), then we shall call the temperature of the hotter one $t$ units, by definition. Likewise, if we run a reversible heat engine between our standard reservoir and a colder one and we find that our reservoir delivers $t$ units of heat to the engine for every unit dumped to the colder reservoir, then the colder one is by definition at a temperature of $\frac{1}{t}$ units. In general, the amounts of heat flowing between reservoirs at temperatures $T_1$ and $T_2$ ($T_1>T_2$) defined in this way in a reversible heat engine (i.e. heat $Q_1$ is drawn from reservoir 1 and heat $Q_2$ is dumped into reservoir 2, thus producing work $Q_1 - Q_2$) are always in the same proportion, given by:

$$\frac{Q_1}{T_1} = \frac{Q_2}{T_2}$$
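To make the bookkeeping concrete, here is a minimal sketch (in Python, with made-up numbers) of how this ratio definition assigns a temperature once a standard reservoir is fixed at $T = 1$:

```python
def temperature_from_heats(q_hot, q_cold, t_cold=1.0):
    """Temperature of the hot reservoir implied by Q1/T1 = Q2/T2,
    given the heats a reversible engine exchanges with each reservoir."""
    return t_cold * q_hot / q_cold

# A reversible engine draws 3 units of heat from a hot reservoir per unit
# of heat dumped to the standard (T = 1) reservoir:
t_hot = temperature_from_heats(q_hot=3.0, q_cold=1.0)
work = 3.0 - 1.0   # the difference comes out as work

print(t_hot)  # 3.0 -- the hot reservoir is, by definition, at 3 units
print(work)   # 2.0
```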

From this definition, it then follows that $\frac{\delta Q}{T}$ is an exact differential, because $\int_a^b \frac{\delta Q}{T}$ between states $a$ and $b$ in phase space must be independent of path (otherwise one could violate the second law). This last statement is not altogether obvious: you'll have to look at Feynman for details. So we have a new function of state, "entropy", defined to increase by the exact differential $\mathrm{d} S = \delta Q / T$ when a system reversibly absorbs heat $\delta Q$.
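If you want to see the path independence at work numerically, here is a hedged sketch assuming one mole of a monatomic ideal gas, for which $\delta Q = C_V\,dT + \frac{RT}{V}\,dV$: the integral of $\delta Q/T$ along a straight line in the $(T, V)$ plane agrees with the exact value along an isochore-then-isotherm path between the same end states.

```python
import math

R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # monatomic ideal gas, one mole (assumption)

def dQ_over_T(T, V, dT, dV):
    # dQ = Cv dT + p dV with p = R T / V, so dQ/T = Cv dT/T + R dV/V
    return Cv * dT / T + R * dV / V

def integrate_straight_path(T1, V1, T2, V2, n=100000):
    """Midpoint-rule line integral of dQ/T along a straight segment."""
    total = 0.0
    for i in range(n):
        s = (i + 0.5) / n
        T = T1 + s * (T2 - T1)
        V = V1 + s * (V2 - V1)
        total += dQ_over_T(T, V, (T2 - T1) / n, (V2 - V1) / n)
    return total

def integrate_L_path(T1, V1, T2, V2):
    # Exact result for heating at constant V, then expanding at constant T
    return Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

a = integrate_straight_path(300.0, 1.0, 600.0, 2.0)
b = integrate_L_path(300.0, 1.0, 600.0, 2.0)
print(a, b)  # the two paths give the same entropy change
```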

So the entropy expression is the way it is owing to the way we define the thermodynamic temperature, which definition is in turn justified by Carnot's theorem.

Pretty neat, eh?

Anyhow, what happens in practice is the following. Now that we have a definition of the ratio of temperatures in terms of the efficiency $\eta$ of the reversible heat engine running between reservoirs of these temperatures:

$$\frac{T_2}{T_1} = 1-\eta$$
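As a quick sanity check, and purely as an illustration with assumed numbers, this relation can be run both ways:

```python
def carnot_efficiency(t_hot, t_cold):
    """Efficiency of a reversible engine between two reservoirs."""
    return 1.0 - t_cold / t_hot

# A reversible engine between reservoirs at 600 and 300 temperature units:
eta = carnot_efficiency(600.0, 300.0)
print(eta)  # 0.5

# Inverting the relation recovers the temperature ratio from the efficiency:
print(1.0 - eta == 300.0 / 600.0)  # True
```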

one defines a "standard" unit temperature (e.g. that of the triple point of water), and then the full temperature definition follows. This definition can be shown to be equivalent to the definition of temperature for a system:

$$T^{-1} = k\,\partial_U S$$

i.e. the inverse temperature (sometimes quaintly called the "perk") is how much a given system "thermalizes" (increases its entropy) in response to the adding of heat to its internal energy $U$ (how much the system rouses or "perks up"). The Boltzmann constant depends on how one defines one's unit temperature - in natural (Planck) units the unit temperature is defined so that $k = 1$.
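As a hedged numerical illustration (assuming a monatomic ideal gas, whose thermodynamic entropy depends on internal energy as $S = \frac{3}{2} N k \ln U$ up to terms independent of $U$), differentiating $S$ with respect to $U$ recovers the familiar $U = \frac{3}{2} N k T$:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 1e23          # number of particles (illustrative)

def entropy(U):
    # Only the U-dependence matters for the derivative; constants
    # (the volume/mass terms of Sackur-Tetrode) are dropped.
    return 1.5 * N * k * math.log(U)

def temperature(U, h=1e-6):
    """1/T = dS/dU at constant volume, by central difference."""
    dS_dU = (entropy(U * (1 + h)) - entropy(U * (1 - h))) / (2 * U * h)
    return 1.0 / dS_dU

U = 1.5 * N * k * 300.0   # internal energy of the gas at 300 K
print(temperature(U))     # ~ 300.0 K
```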

Now temperature is sometimes said to be proportional to the average energy of a system's thermalized constituent particles. This is true for ideal gases, but it is not the general definition. For example, I do the calculation of the temperature of a collection of thermalized quantum harmonic oscillators in this answer here, and the thermodynamic temperature is only equal to the mean oscillator energy for temperatures $T$ such that $k\,T\gg \hbar \omega$, where $\hbar \omega$ is the photon / phonon (as appropriate) energy of the oscillator.
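For the record, here is a small sketch of that point using Planck's formula for the mean energy of a thermalized oscillator (zero-point energy omitted): the mean energy only approaches $k\,T$ in the limit $k\,T \gg \hbar\omega$.

```python
import math

def mean_oscillator_energy(kT, hbar_omega):
    """Planck mean energy hbar*omega / (exp(hbar*omega/kT) - 1),
    in the same energy units as kT; zero-point term omitted."""
    return hbar_omega / math.expm1(hbar_omega / kT)

hw = 1.0  # oscillator quantum, in arbitrary energy units
for kT in (0.1, 1.0, 100.0):
    print(kT, mean_oscillator_energy(kT, hw))
# At kT = 0.1 the mean energy is exponentially suppressed; only for
# kT >> hbar*omega does it approach the equipartition value kT.
```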

  • I don't think this addresses the question. The question is about why we can't redefine temperature through an arbitrary one-to-one function. Nothing you've said in this answer rules out such a redefinition. –  Oct 23 '13 at 15:42
  • 2
    @BenCrowell The way I read it, it does. This clearly establishes a way to objectively find the temperature ratio between two reservoirs. It might be a little unsatisfactory, by requiring one to run an ideal Carnot engine between the two, but at least theoretically it seems that we have a way to find the temperature ratio of two things - and that does eliminate the possibility of using any one-to-one function. – Alan Rominger Oct 23 '13 at 17:03
  • @BenCrowell I believe I side with the answer here, because the OP asked for an intuitive justification for the equation, and its origins in the basic theory of heat-engine efficiency are the most concrete setting entropy ever had. That's ultimately what made entropy and the 2nd Law so impressive to the founders of thermodynamics: they kept finding it to hold true in increasingly general settings where it had no intuitive interpretation. – David H Oct 23 '13 at 17:28
  • @BenCrowell The point is we could redefine temperature in another way if we want to as long as the mapping were one-to-one - it gets down to a convention and, as you say, - the important thing is the fact of a new function of state - entropy - which fact arises through Carnot's theorem. IMO it is a matter of taste as to whether you get into the statistical definition of entropy in a question like this asking for "intuition" - I'd rather stick with the Clausius / Carnot definition because, as you very well know, the second law is only such if applied macroscopically to things like .... – Selene Routley Oct 24 '13 at 01:25
  • @BenCrowell ...heat engines and "long term" and cannot be proven owing to the Loschmidt argument. As you know, you can dig very deep into informational definitions, but the best one can do is, as you call it, a proof of the "weak" version and even this involves the experimental observation of a past low entropy state. – Selene Routley Oct 24 '13 at 01:28
  • It may help to supply the definition of the efficiency $\eta$, namely the ratio of the output work $Q_1-Q_2$ to the input heat $Q_1$:

    $$ \eta = 1 - \frac{Q_2}{Q_1} = 1 - \frac{T_2}{T_1} $$

    – Papa Smurf Jan 16 '24 at 16:35

First off, temperature is an intensive quantity, i.e., not additive. For example, two cups of coffee don't have twice the temperature of one cup. For an extensive (additive) quantity, such as mass, we can't just redefine $m\rightarrow m'=f(m)$, where $f$ is a nonlinear function, because then $m'$ wouldn't be additive. This constraint doesn't apply to temperature, because temperature is intensive.

The relation $dS=dq/T$ is really the definition of temperature, not entropy. (Entropy is really defined as the log of the number of accessible states.) But this still allows us to take $T\rightarrow T'=f(T)$, where $f$ is some nonlinear function, and then we would just define temperature as $T'=f(dq/dS)$. We would need $f$ to be a one-to-one function, because we don't want objects that aren't in equilibrium to have the same temperature.

There is really nothing wrong with this, and for example this question discusses the possibility $f(x)=1/x$. Some equations (e.g., the partition function) actually come out simpler if written using this definition, although others get more complicated (the heat capacity of an ideal gas is no longer constant). However, most possibilities for $f$ result in all equations looking more complicated.
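As an illustration of the $f(x)=1/x$ case (a sketch, in units where $k = 1$): with $\beta = 1/T$, the partition function of a two-level system is just a sum of exponentials linear in $\beta$, while the $T$ parametrization carries the energies divided by $T$. The two are, of course, the same function of state.

```python
import math

def Z_beta(beta, e0=0.0, e1=1.0):
    """Two-level partition function in the beta = 1/T parametrization."""
    return math.exp(-beta * e0) + math.exp(-beta * e1)

def Z_T(T, e0=0.0, e1=1.0):
    """The same partition function written in terms of T."""
    return math.exp(-e0 / T) + math.exp(-e1 / T)

# Same physics, two parametrizations: beta = 2 corresponds to T = 0.5
print(Z_beta(2.0), Z_T(0.5))  # equal
```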

  • The intensiveness of temperature and the freedom to nonlinearly rescale its measure per convenience has always made intuitive sense to me. But a light-bulb just went off in my head relating this to relativity's non-linear rescaling of velocity by rapidity, and the resulting non-additive velocity-composition formula. If you require $\vec{v}:=\frac{d}{dt}\vec{r}(t)$, then velocity only behaves like an 'intensive' quantity if time behaves as an extensive one, and vice versa, and absolute simultaneity makes time intensive! Is it kosher to apply the adjectives intensive/extensive to kinematic quantities? – David H Oct 23 '13 at 16:36
  • 1
    Lovely succinct answer, but a justification of temperature's well-definedness and intensiveness would be good and this is what the Carnot and Clausius arguments give us: that there is some intensive notion of "relative hotness" of reservoirs that is independent of the workings of heat engines and the relativity of this notion sets which way a heat engine can run. I agree that from the modern standpoint the OP's $dS$ is more a definition of temperature than entropy, but that is assuming we already ken entropy as a primitive notion. – Selene Routley Oct 24 '13 at 01:31

The answer is plain and simple differential calculus: $$ds = \frac{\partial s}{\partial e}\bigg|_v de + \frac{\partial s}{\partial v}\bigg|_e dv$$ What does the differential change $$ds=\frac{\delta q}{T}$$ have to do with the first? For starters, an important question on everyone's mind should be: what is temperature? Is it a physical quantity that we have intuition about? In certain circumstances perhaps, but in general we have no intuition about what temperature truly represents. So what is temperature? It is simply defined by $$T=\frac{\partial e}{\partial s}\bigg|_v$$ If you want to go start your own country and define temperature some other way, feel free to do so, but no one is going to follow you. The first equation is thus $$ds = \frac{de}{T} + \frac{\partial s}{\partial v}\bigg|_e dv$$ So how do you get to the second equation? Two simple assumptions: no volume change occurred (in other words, no physical work), and the internal energy change was strictly due to heat transfer ($dv=0$ and $de=\delta q$). Voila!
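If it helps, here is a hedged numerical check of that chain of substitutions for a concrete equation of state (a monatomic ideal gas is assumed, for which $\partial s/\partial v|_e = p/T = R/v$ by the ideal-gas law):

```python
import math

R = 8.314
Cv = 1.5 * R  # monatomic ideal gas, molar (illustrative assumption)

def s(e, v):
    # molar entropy up to an additive constant
    return Cv * math.log(e) + R * math.log(v)

def T(e):
    return e / Cv  # since e = Cv * T for this gas

# Check the total differential ds = de/T + (p/T) dv for a small step:
e, v = Cv * 300.0, 1.0
de, dv = 0.01, 1e-5
exact = s(e + de, v + dv) - s(e, v)
approx = de / T(e) + (R / v) * dv
print(exact, approx)  # agree to high precision; with dv = 0 only de/T survives
```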

  • I believe this is simply begging the question: all you've done is played with basic definitions of the total differential, but to do this assumes that the things differentiated are indeed functions of state. The only physics you state is that there is a definition of temperature involved. Moreover, there most certainly IS basic intuition for the temperature: simply reasoning carefully about an abstract notion of hotness and coldness begets Carnot's theorem, as well as a proof that the second law implies one can define a new function of state $S$. Richard Feynman certainly thought this was ... – Selene Routley Oct 31 '13 at 23:07
  • ... very deep intuition and I agree with him - it's a stunningly insightful way to organise the 19th century experimental findings that befuddled the best minds. Moreover, if you don't like that intuition, then you can appeal to its being a parameter of the pdf for the grand canonical ensemble, a definition which carries over even to small gatherings of particles and beautifully asymptotes to the macroscopic temperature with increasing particle number. – Selene Routley Oct 31 '13 at 23:13
  • I would disagree that there is intuition for temperature itself. There is intuition for what happens in the presence of a temperature difference, but I would not describe that as intuition for the quantity alone. A better intuitive understanding of thermodynamic equilibrium can be had through entropy and statistical mechanics. I do appreciate what you've written above in terms of the historical origin of entropy. However, unlike Carnot, we now understand what entropy is in a concrete sense and don't need abstract notions of hotness and coldness to imply entropy. – SimpleLikeAnEgg Oct 31 '13 at 23:35
  • If it is only an intuition for temperature difference, why does it yield a concrete, experimentally meaningful definition for $\Delta Q/T$ and not $\Delta Q/\Delta T$ (the former, unlike the latter, is most certainly not invariant with respect to linear shift in the temperature "origin")? I think what you're trying to say is that, strictly speaking, it yields only a ratio definition for temperature, which concept thus becomes fully defined upon our definition of $T=1$. However, the gig is the same for all quantities: e.g. length is a ratio of an observed "length" to the unit metre. – Selene Routley Oct 31 '13 at 23:46
  • I would not call any of that intuition about what Temperature represents. The best intuition of what temperature represents can be found in the definition $$T=\frac{\partial e}{\partial s}|_v$$ This statement provides no intuition alone, but with an understanding of entropy it provides a great deal of intuition. – SimpleLikeAnEgg Oct 31 '13 at 23:57
  • A VERY big maybe. Intuition depends on the beholder. Are the works of E. T. Jaynes wonted to you? Check out http://bayes.wustl.edu/etj/node1.html and be sure to include http://bayes.wustl.edu/etj/articles/theory.1.pdf and http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf if not. You might not think entropy so concrete then. The only "concrete" (i.e. one we can experimentally get at) definition of the entropy is the Boltzmann (what Jaynes calls the "Experimental" one), in general quite different from what notions of probability and statistical mechanics deal with. – Selene Routley Nov 01 '13 at 00:09
  • But yet you want to say that you have a "concrete" intuition for temperature? I think not. In fact, the lack of intuition regarding temperature was the cause of the difficulty in arriving at the second law of thermodynamics. Temperature is, in and of itself, a completely abstract quantity in the absence of some notion of entropy. – SimpleLikeAnEgg Nov 01 '13 at 00:40
  • What is concrete aside from experiment? Read the Jaynes works. – Selene Routley Nov 01 '13 at 00:41
  • -1. Sorry, the only physics in this answer is how we define the temperature, which the other answers have addressed: the rest is basic math assuming everything is a function of state. Moreover, when we downvote, it is in keeping with the knowledge sharing nature of this site to leave a note why: other readers might learn something. I make VERY, VERY few downvotes for the reason that I believe physics needs to be as free as possible flow of ideas and an assuming of as many different, oblique standpoints. Pedantry for what a subjective notion of "intuition" is gainsays that freedom. – Selene Routley Nov 01 '13 at 00:53