45

I am studying entropy and it's hard for me to grasp what exactly entropy is.

Many articles and books write that entropy is a measure of the randomness or disorder of a system. They say that when a gas is allowed to expand, its randomness increases, etc. But they end up saying that $\frac{\mathrm dQ}{T}$ is the measure of the increase in randomness and is called the entropy.

Even if I accept that entropy is a measure of the randomness of a system, I don't understand:

  1. How does $\frac{\mathrm dQ}{T}$ hold the information about the randomness of the system?
  2. How is entropy an independent property of a system? I would suppose that any two parameters in the equation $PV=nRT$ should completely describe the system. Why would we need entropy?

Thank you.

pranphy
  • 724

7 Answers

21

This answer is somewhat hand-wavy, but I do believe it should help to grasp the concepts on an intuitive level.

First of all, entropy is not a measure of randomness. For an isolated system in equilibrium, under the fundamental assumption of statistical mechanics, the entropy is just $$ S=k\ln\Omega $$ where $\Omega$ is the number of microstates - microscopic system configurations - compatible with the given macrostate - the macroscopic equilibrium state characterised by thermodynamic variables.

It follows from the second law that $$ \delta Q = T\mathrm{d}S=T\,\mathrm{d}(k\ln\Omega)=kT\frac1\Omega\mathrm{d}\Omega $$ or equivalently $$ \mathrm{d}\Omega = \Omega\frac{\delta Q}{kT} $$ The energy $kT$ is related to the average energy per degree of freedom, so this formula tells us that the transfer of heat into a system at equilibrium opens up a number of new microstates proportional to the number of existing ones and to the number of degrees of freedom the transferred energy may excite.
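To put rough numbers on that last relation, here is a minimal sketch; the values of $T$ and $\delta Q$ are arbitrary assumptions chosen purely for illustration:

```python
# Minimal numerical sketch of d(Omega) = Omega * dQ / (k T); the values of
# T and dQ below are arbitrary assumptions, chosen only for illustration.
k_B = 1.380649e-23   # Boltzmann constant, J/K

T  = 300.0           # temperature of the system, K (assumed)
dQ = 1.0             # heat transferred into the system, J (assumed)

dS = dQ / T                  # entropy change, dS = dQ/T
d_ln_Omega = dS / k_B        # change in ln(Omega), since S = k ln(Omega)

print(f"dS          = {dS:.3e} J/K")
print(f"d(ln Omega) = {d_ln_Omega:.3e}")
# For this (finite) heat input the microstate count gets multiplied by
# exp(d_ln_Omega) ~ exp(2.4e20): a single joule of heat at room temperature
# opens up an unimaginably larger set of accessible microstates.
```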

Christoph
  • 13,545
  • This answer effectively complements what has already been said, and succinctly shows the relation between added heat and the increase in the number of possible microstates corresponding to the same macrostate; and it does so while assuming no more than an elementary background in statistical mechanics. +1 – Mono Aug 04 '12 at 17:26
20

In my opinion, it isn't strictly correct to say that entropy is "randomness" or "disorder". The entropy is defined in statistical mechanics as $-k_B \sum_i p_i \log p_i$, where $k_B$ is Boltzmann's constant (which is only there to put it into physically convenient units) and $p_i$ is the probability that the system is in state $i$. These probabilities do not mean that the system is "randomly" jumping from one state to another (although quite often it is), they just mean that you, as an experimenter observing the system, don't know exactly which state it is in, but you think some are more likely than others. Since Shannon (1948) and Jaynes (1957), this formula for the entropy has been interpreted in terms of the information that an experimenter has about a system: the less information, the more entropy. (Those links are just for completeness - I wouldn't recommend reading them as your first introduction to the subject.) The amount of information an experimenter has about a system can decrease for many reasons, but the only way it can increase is if the experimenter makes a new measurement. This is the reason for the second law of thermodynamics.
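As a minimal illustration of that formula (toy probabilities of my own choosing, with $k_B$ set to 1 so the entropy comes out in nats): the less sharply the probabilities pin down the state, the larger the entropy.

```python
import numpy as np

def gibbs_entropy(p, k=1.0):
    """S = -k * sum_i p_i ln(p_i), skipping states with p_i = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k * np.sum(p * np.log(p))

print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.000  -> state known exactly
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))      # ~0.940 -> some uncertainty
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386 -> complete ignorance (= ln 4)
```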

It should be noted that there are many different perspectives on the meaning of entropy and the second law, and not everyone agrees with the one I outlined above. However, I will try to answer your two questions from this point of view.

  1. From a modern perspective, it's better to view $dS = \frac{dQ}{T}$ as a definition of $Q$ rather than of $S$. After all, $S$ already has a definition in terms of the probabilities. If we view $dQ$ as being defined as $TdS$ we can see that it's equal to $dU + PdV - \sum_i \mu_i dN_i$ (by rearranging the fundamental equation of thermodynamics), which is equal to the total change in energy minus the energy that's transferred in the form of work. (Here I've defined work as "mechanical work" $PdV$ plus "chemical work" $-\mu_i dN_i$. You can also add terms for electrical work, etc.)

  2. There are several reasons we need to consider the entropy of an ideal gas. One is that $T$, which appears in the ideal gas law, is defined as $T=\frac{\partial U}{\partial S}$, so $S$ comes in that way (see the numerical sketch after this list). Another is that the equation $PV = nRT$ doesn't tell you how the temperature changes when you add energy to the system. For that you need to know the heat capacity, which is closely related to the entropy. Finally, the concept of entropy is extremely useful in understanding why you can't build a perpetual motion machine.
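Here is a hedged numerical sketch of the first point (my own example, using the standard Sackur-Tetrode entropy of a monatomic ideal gas with assumed values for $N$, $V$ and $T$): differentiating $S(U,V,N)$ numerically with respect to $U$ really does return the temperature that appears in $PV=nRT$.

```python
import numpy as np

# Sackur-Tetrode entropy of a classical monatomic ideal gas; differentiating
# it with respect to U gives 1/T, i.e. T = dU/dS, consistent with U = (3/2) N k T.
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s
m = 6.64e-27          # atomic mass, kg (helium assumed)

def sackur_tetrode(U, V, N):
    """Entropy of a classical monatomic ideal gas."""
    return N * k * (np.log((V / N) * (4 * np.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N = 6.022e23          # one mole of atoms (assumed)
V = 0.025             # m^3, roughly one molar volume (assumed)
T = 300.0             # target temperature, K (assumed)
U = 1.5 * N * k * T   # internal energy of a monatomic ideal gas at that T

dU = U * 1e-6
dS_dU = (sackur_tetrode(U + dU, V, N) - sackur_tetrode(U - dU, V, N)) / (2 * dU)
print("T recovered from dU/dS:", 1.0 / dS_dU)   # ~ 300 K
```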

If this point of view sounds like it might make sense to you, it might be worthwhile reading this paper by Jaynes, which takes a historical perspective, explaining how entropy was first discovered (and defined in terms of $dQ/T$), and how it then unexpectedly turned out to be all about information.

N. Virgo
  • 33,913
  • We can't really use this equation to define $Q$, because it cannot be extended to systems which are not in equilibrium. For such systems the heat flux and entropy are well defined, but not necessarily $T$. Temperature is a macroscopic parameter which happens to be equal to the average kinetic energy for systems in equilibrium. For systems out of equilibrium, everything is possible: from multiple temperatures to no temperature at all. I don't see any simple explanation to the question outside of statistical physics. – Shaktyai Aug 04 '12 at 20:09
  • @Shaktyai I disagree that the heat flux can be well defined for systems with no definable $T$. The energy flux is always definable, but if there's no $T$ then there's no meaningful way to partition it into work versus heat. Or at least, I don't know of an example where this can be done. If you can show me one I'll change my answer. – N. Virgo Aug 04 '12 at 22:57
  • In non-equilibrium statistical mechanics, the heat flux is just the third-order moment of the velocity: $q=\int \frac{1}{2}m(V-v)^2\,v\,f(r,v,t)\,\mathrm{d}v$, where $V$ is the average velocity $V=\int v\,f(r,v,t)\,\mathrm{d}v$. If the system is not in LTE, then $f(r,v,t)$ is not a Maxwellian and $T$ is not defined. In collisional-radiative models (stellar atmospheres or fusion plasmas) it is very common to encounter distribution functions with two temperatures or no temperature at all. – Shaktyai Aug 05 '12 at 17:42
14

A wealth of meaningful info is contained in the above answers. However, a short and simple intuitive picture still seems missing.

The bottom line is that temperature measures the energy per degree of freedom, and hence $\frac{dQ}{T}$ measures nothing more than the number of degrees of freedom over which the added energy gets spread. The number of degrees of freedom describes the microscopic complexity of the system (as others have remarked, many consider the term 'randomness' less appropriate) - the amount of information needed to specify the system down to all its microscopic details. This quantity is known as the (statistical) entropy.
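A crude back-of-the-envelope sketch of that statement (my own numbers, assuming equipartition so that each quadratic degree of freedom holds about $\tfrac12 k_B T$):

```python
# dQ/T measured in units of k_B roughly counts how many degrees of freedom
# the added heat can spread over; the values of T and dQ are assumptions.
k_B = 1.380649e-23   # J/K

T  = 300.0           # temperature, K (assumed)
dQ = 1.0             # heat added, J (assumed)

dS_in_units_of_k = dQ / (k_B * T)        # dQ/T expressed in units of k_B
dof_worth        = dQ / (0.5 * k_B * T)  # number of (1/2) k_B T 'slots' filled

print(f"dQ/T in units of k_B               : {dS_in_units_of_k:.2e}")
print(f"degrees-of-freedom worth of energy : {dof_worth:.2e}")
```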

You might like this blog that discusses the subject.

Johannes
  • 19,015
8

You should think of the equation

$$ dS = {dQ\over T}$$

as the definition of temperature, not of entropy. Entropy is more fundamental: it's the size of the phase space, the log of the number of possible states. The temperature is a derivative of this with respect to energy.

To understand why this makes sense, put two systems side by side. If energy flows from hot to cold, the loss of entropy in the hot system is more than compensated by the gain in entropy of the cold system. So energy will flow from hot to cold, statistically, on average.
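A minimal numerical sketch of that bookkeeping (the temperatures and the amount of heat are arbitrary assumed values):

```python
# Transfer a small amount of heat Q from a hot body to a cold one and check
# that the total entropy change is positive; this is why the flow goes that
# way on average.
Q      = 1.0      # J, small enough that both temperatures barely change (assumed)
T_hot  = 400.0    # K (assumed)
T_cold = 300.0    # K (assumed)

dS_hot   = -Q / T_hot     # entropy lost by the hot system
dS_cold  = +Q / T_cold    # entropy gained by the cold system
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.5f} J/K")
print(f"dS_cold  = {dS_cold:+.5f} J/K")
print(f"dS_total = {dS_total:+.5f} J/K (> 0 whenever T_hot > T_cold)")
```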

It is not the properties of temperature which make $\frac{dQ}{T}$ an entropy change; rather, it is the properties of the entropy which make the coefficient $\frac{dS}{dQ}$ an inverse temperature.

4

It's worth noting that your definition of an infinitesimal change in the entropy of a system, namely:

$dS=\displaystyle\frac{\delta Q}{T}$

is valid only for an internally reversible change. This is not a technicality which can be omitted; I think part of your question might be related to the connection between the notion of heat (a measurable amount of energy transferred) and statistical uncertainty (which is, up to alternative and equivalent interpretations, the intrinsic meaning of entropy).

In an internally reversible process which involves heat addition to or subtraction from a system, the $T$ under the (inexact) heat differential must be a uniform temperature across the system's spatial extension, up to its boundaries, so that at every moment the temperature of the system's boundaries is equal to its bulk temperature (and unique). That means that there are no temperature gradients inside the system of interest, and because of that very fact, there aren't any possible heat exchanges inside the system's boundaries. That is because, for a system to exchange heat with something else, there must be a difference in temperature between them, and if the difference is zero (they are equal) then no heat will be transferred. If you think about it, this is a sound argument: a cold glass of water gets increasingly hotter when you leave it in a room, but when it reaches the same temperature as the air around it, there's no more change and it stays there indefinitely.

Going back to the original equation, you can now interpret the RHS as telling you that, in situations where the system's temperature is uniform at every moment, the ratio of the infinitesimally small amount of heat added to or subtracted from the system by its environment, to the unique temperature at every point of the system (which is nothing more than a measure of the mean kinetic energy of the individual molecules which make it up), is equal to its change in entropy. And what is entropy? Well, macroscopically speaking, you can take what I've written above as a definition of entropy, and you can thermodynamically deduce that it is indeed a state function (it only depends on the point properties of the system, like its pressure and temperature) and doesn't depend upon the chain of events by which that state was reached.

On the other hand, statistical mechanics (which is a more recent way of addressing what we see macroscopically as thermodynamic properties, like entropy, starting from a mechanical description at the molecular level) gives us more details on the nature of entropy. I think it's better to think about it not as a measure of randomness but as the (macroscopic) uncertainty about the (microscopic) state of the system.

I'll give you a simple example: imagine you had a pool table with its top totally covered by an opaque fabric, with just one opening for introducing the cue stick. Assume now that you know (by some means) that the eight balls are distributed on the table forming a straight line with equal spacing between them, but you don't know where exactly this line lies within the table's rectangular area; and that, for the purpose of the experiment, the cue ball is just next to the opening (and of course you know it). Now you take the cue stick, introduce it through the opening in the fabric, and strike the cue ball. After a few seconds of (hearing) collisions, you can be sure that the movement under the fabric has stopped. What happened to your knowledge about the system?

Well, you don't know where each ball has gone (we've sealed the pockets, of course!), but you didn't know that before the strike either, did you? Back then, though, you at least knew they were forming a line, and that information is now gone. From your outside point of view, your prior information about the positions of the balls, plus the energy and momentum you introduced into the system through the strike, isn't enough to rule out a huge number of possible actual distributions of the balls. At the beginning of the experiment you could at least write down the number of possible positions of the line of balls (perhaps by drawing a grid over the table's area, with each cell's side length equal to a ball's diameter, and counting the number of longitudinal cell lines), but now the number of possible positions has multiplied. Before and after, you only have partial knowledge of the system's configuration (all you can do is count the possible ones, based on what you know about the system from the outside, which restricts the possibilities), but that knowledge has decreased after the experiment. It has nothing to do with the physics of the collisions between the balls: it has to do with the fact that you can't see the balls from your point of view, and all you can do is retrieve partial information through indirect measurements.
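To put toy numbers on this loss of knowledge (my own counting sketch, with an assumed grid size; the real count of configurations would of course depend on the details):

```python
from math import comb, log

# Toy version of the covered pool table: compare how many configurations are
# compatible with "the eight balls form one of the evenly spaced lines" versus
# "the eight balls could be in any eight cells". Grid size is an assumption.
rows, cols = 16, 32          # ball-sized cells covering the table (assumed)
n_balls = 8

omega_before = rows                        # roughly one line per row position
omega_after  = comb(rows * cols, n_balls)  # any 8 distinct cells after the strike

print("ln(Omega) before the strike:", round(log(omega_before), 1))  # ~ 2.8
print("ln(Omega) after the strike :", round(log(omega_after), 1))   # ~ 39
# The jump in ln(Omega) is precisely the kind of increase in uncertainty
# (decrease in knowledge) that the entropy keeps track of.
```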

The analogy with the example above in a statistical system is that through measurements of macroscopic observables (like temperature, pressure, density, etc.) we only measure mean microscopic properties. For example, temperature is a measure of the mean molecular kinetic energy, and pressure is a measure of the mean rate of momentum transferred per unit area by striking molecules. Measuring them gives us partial knowledge of the system's microscopic configuration (like the original information you held about the positions of the pool balls). Any change in the macroscopic observables is correlated with a change in the possible (i.e. not ruled out) microscopic configurations, and that in turn changes our knowledge about the system. It turns out that those changes can be measured, and that is indeed the entropy variation, in the sense that an entropy increase correlates with an uncertainty increase, or a knowledge decrease. Showing that this relation holds, starting from a mechanical framework, is the whole point of statistical mechanics.

Finally, I hope you can now see that $\displaystyle\frac{\delta Q}{T}$ is just analogous to the energy introduced by the strike in the experiment, in relation to the previous knowledge of the positions of the balls (lower temperatures imply less translational, rotational and vibrational molecular motion, and vice versa, so it is actually a "partial measure" of their positions). So:

  1. It doesn't hold the information about the randomness of the system; it is just a measure of the increase in uncertainty from a macroscopic perspective, and it only holds for reversible processes (in general, entropy can increase without adding energy to a system).

  2. As other answers have stated, entropy is needed to define some of the terms in any state equation (like the ideal gas law), and by the way, state equations are just approximations to the actual behavior of real substances (something made pretty clear by the "ideal" in the law you cite), so it's natural for them to be based on more fundamental concepts (like entropy).

EDIT: As Nathaniel rightly pointed out below, my original statement that the validity of the macroscopic definition of entropy in terms of heat and temperature depends on the (tacitly assumed) total reversibility of the process was flawed. The only requirement for it to be valid is that the heat exchange process must be internally reversible, because this way we are only measuring the change in entropy inside the system (and so external irreversibilities associated with the process are irrelevant).

Mono
  • 447
1

"How does $\frac{dQ}{T}$ hold the information about the randomness of the system"

The answer lies in the microscopic definition of heat. The velocity of any particle can be written $V=V_b+v_i$, where $V_b$ is the bulk velocity and $v_i$ the "random" velocity with $\overline{v_i}=0$. The kinetic energy associated with $v_i$ is the heat. So measuring the heat is nothing other than measuring the degree of randomness of the molecules in the system. If all the molecules fly in the same direction then $v_i=0$ and $V=V_b$: the kinetic energy is the macroscopic kinetic energy $E_c=\frac{1}{2}m{V_b}^2$. If all directions are equiprobable, then $V_b=0$ and the kinetic energy is purely heat.
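A small sketch of this decomposition (my own toy numbers for the mass, drift velocity and velocity spread):

```python
import numpy as np

# Split particle velocities into a bulk part and a random part: the kinetic
# energy then separates into a bulk piece, set by the common drift velocity,
# and a "thermal" piece carried entirely by the random velocities.
rng = np.random.default_rng(0)

m = 6.6e-27                                   # particle mass, kg (assumed)
n = 100_000                                   # number of particles (assumed)
V_b    = np.array([500.0, 0.0, 0.0])          # bulk velocity, m/s (assumed)
v_rand = rng.normal(0.0, 400.0, size=(n, 3))  # zero-mean random velocities, m/s

V = V_b + v_rand                              # total velocity of each particle

ke_total   = 0.5 * m * np.sum(V**2)
ke_bulk    = 0.5 * m * n * np.sum(np.mean(V, axis=0)**2)
ke_thermal = ke_total - ke_bulk               # this is the heat-like part

print("bulk KE    (J):", ke_bulk)
print("thermal KE (J):", ke_thermal)   # ~ 0.5*m*n*3*sigma^2, independent of V_b
```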

" I suppose that any two parameter in the equation $PV=nRT$ should completely describe the system. Why would we need entropy?" Take two gases $(P_1,V_1,T_1)$ and $(P_2,V_2,T_2)$ put them in contact. You can't predict how the temperature evolves without the entropy.

Shaktyai
  • 2,040
0

A microscopic approach to entropy has led to great insight and is explained in detail in the given answers.

To understand the concept of entropy there is an equally valid but macroscopic approach that might complement the given answers. The idea has been developed on the basis of 'adiabatic accessibility', and the authors Elliott H. Lieb and Jakob Yngvason have done an excellent job explaining this concept, although it is a little heavy on the mathematical side (arxiv link). Their work has been summarized in the book The Entropy Principle by André Thess.

So whoever is interested in a different approach to rigorously defining entropy should give this concept a closer look.

Alexander
  • 4,390