10

I read that thermodynamic entropy is a measure of the number of microstates. What is the derivation of $S=k\log N$, where $k$ is the Boltzmann constant and $N$ is the number of microstates?

How is the logarithmic measure justified?

Does thermodynamic entropy have anything to do with information entropy (defined by Shannon) used in information theory?

New Horizon
  • 1,762
  • 2
  • 17
  • 25

6 Answers

13

I think that the best way to justify the logarithm is that you want entropy to be an extensive quantity -- that is, if you have two non-interacting systems A and B, you want the entropy of the combined system to be $$ S_{AB}=S_A+S_B. $$ If the two systems have $N_A,N_B$ states each, then the combined system has $N_AN_B$ states. So to get additivity in the entropy, you need to take the log.
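To make the additivity concrete, here is a minimal Python sketch (the microstate counts are made-up toy numbers) showing that $\log(N_A N_B)=\log N_A+\log N_B$, so $S_{AB}=S_A+S_B$ when $S=k\log N$:

```python
import math

# Boltzmann constant in J/K
k = 1.380649e-23

# Toy (made-up) microstate counts for two non-interacting systems A and B
N_A = 10**6
N_B = 10**9

S_A = k * math.log(N_A)
S_B = k * math.log(N_B)

# The combined system has N_A * N_B microstates
S_AB = k * math.log(N_A * N_B)

print(S_AB, S_A + S_B)                # the two values agree
assert math.isclose(S_AB, S_A + S_B)
```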

You might wonder why it's so important that the entropy be extensive (i.e., additive). That's partly just history. Before people had worked out the microscopic basis for entropy, they'd worked out a lot of the theory on macroscopic thermodynamic grounds alone, and the quantity that they'd defined as entropy was additive.

Also, the number of states available to a macroscopic system tends to be absurdly, exponentially large, so if you don't take logarithms it's very inconvenient: who wants to be constantly dealing with numbers like $10^{10^{20}}$?
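A tiny Python sketch of that practical point, using the example number from above: a state count like $10^{10^{20}}$ cannot even be stored as a floating-point number, but its logarithm is an ordinary quantity to work with.

```python
import math

# A microstate count like N = 10**(10**20) is far too large to store,
# even as a float, but its logarithm is an ordinary number.
log10_N = 1e20                      # log base 10 of N
ln_N = log10_N * math.log(10.0)     # natural log of N, about 2.3e20

k = 1.380649e-23                    # Boltzmann constant, J/K
S = k * ln_N                        # entropy of about 3.2e-3 J/K
print(S)
```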

Ted Bunn
  • 19,793
  • 2
    I think there's also a further argument to be made about the arbitrary cell size chosen in phase space when counting energy states. Is it not true at least to an approximation that using a smaller cell size changes the entropy by only an additive constant? So changes in entropy are independent of the choice of cell size? – Marty Green Jun 28 '11 at 02:08
  • @Marty Green: I once had an interesting discussion about it with one of my professors and we concluded (without proof) that every experimental quantity is independent of the cell size, as every experiment I could come up with that supposedly measured it, was flawed in some very subtle way. – Kasper Jun 28 '11 at 11:15
  • 1
    Good point! Of course, in quantum statistical mechanics the size of the cell is not arbitrary -- that is, you can actually count states. But in situations where you can't, the size of the cell shouldn't matter. In classical thermodynamic situations, that means that entropy is only determined up to an overall additive constant. – Ted Bunn Jun 28 '11 at 14:31
5

The thermodynamic entropy is not what you wrote; that is the statistical (Boltzmann) entropy. Microstates, and the counting of them, are not thermodynamic concepts but statistical-mechanics ones. Thermodynamics proceeds by identifying reversible processes and goes on to study them. A key point is noticing that there are quantities called state variables which are intrinsic to a substance and do not depend on the path used to reach that state, such as temperature or pressure. Heat transfer is explicitly (and experimentally) not such a quantity. However, by considering a perfect reversible heat engine, it is possible to show that the quantity $dQ/T$ is the differential of a state variable. Integrating it gives $$\Delta S = \int \frac{dQ}{T},$$ which is the change of entropy. Thermodynamically one can only talk of changes, and the zero is not particularly well defined: the third law of thermodynamics tries to define it, but it is not always applicable.
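As a numerical illustration of $\Delta S = \int dQ/T$ (a sketch of my own, with an assumed constant heat capacity, not part of this answer): for reversible heating of a body with constant heat capacity $C$, $dQ = C\,dT$ and the integral evaluates to $C\ln(T_f/T_i)$.

```python
import math

C = 100.0                  # assumed constant heat capacity, J/K (arbitrary example value)
T_i, T_f = 300.0, 400.0    # initial and final temperatures, K

# Numerically accumulate dS = dQ/T = C dT / T along the reversible path (midpoint rule)
n = 100_000
dT = (T_f - T_i) / n
dS_numeric = sum(C * dT / (T_i + (j + 0.5) * dT) for j in range(n))

# Closed form: Delta S = C * ln(T_f / T_i)
dS_exact = C * math.log(T_f / T_i)

print(dS_numeric, dS_exact)    # both approximately 28.77 J/K
```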

genneth
  • 8,669
4

The connection between thermodynamic and information-theoretic entropies is a deep one, and it has been an active research area for more than a century.

To get more information, you can look at http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html.

Various research on Maxwell's demon, Szilard's engine, and Landauer's principle can also be helpful.

  • 1
    The web page (http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html) referred to by Frédéric Grosshans no longer seems to exist. – RussAbbott Jan 22 '12 at 17:16
  • One can find it on the wayback machine: http://web.archive.org/web/20100920092146/http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html – Frédéric Grosshans Jan 24 '12 at 10:29
2

The reason for the logarithm in the statistical definition of entropy is that it makes the entropy additive where the number of microstates is multiplicative: if you have subsystems A and B, the total number of microstates is the product of the subsystems' numbers of microstates, and the properties of the logarithm then make the entropy of the whole the sum of the subsystem entropies.

A way to see the relation between thermodynamic entropy and statistical entropy is to picture the volume $V$ occupied by a system of classical particles as partitioned into cells of very small volume $v$, and then count the number of ways to distribute the $N$ particles among the cells. The number of microstates goes like the number of cells raised to the power of the number of particles, $(V/v)^N$. Taking the logarithm and multiplying by $k$ gives an expression for the entropy, $S = kN\log(V/v)$.

For a process in which the volume changes, the change of entropy is obtained by subtracting the initial entropy from the final entropy. The arbitrary cell volume $v$ cancels out of the equation by the properties of logarithms, which is another reason for the logarithm in the expression. What you finally get is $\Delta S = kN\log(V_f/V_i)$: the Boltzmann constant times the number of particles times the logarithm of the final volume over the initial volume. The way this result is obtained makes it clear that the change in entropy is path independent.

The same expression for the change in entropy can be obtained from thermodynamics for the reversible isothermal expansion of an ideal gas, by dividing the heat absorbed by the temperature. If a process is not isothermal, you can break it into nearly isothermal infinitesimal steps and then integrate the change in entropy. This is the easiest way to answer your question. Similar arguments can be used for quantum systems, but they are more complex.
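A minimal Python sketch of the comparison described above (the particle number, temperature, and volumes are arbitrary example values): the statistical result $\Delta S = kN\log(V_f/V_i)$, from which the cell volume drops out, matches the thermodynamic result $Q/T$ for a reversible isothermal ideal-gas expansion, where $Q = NkT\log(V_f/V_i)$.

```python
import math

k = 1.380649e-23           # Boltzmann constant, J/K
N = 6.022e23               # example particle number (about one mole)
T = 300.0                  # temperature, K (it cancels in the comparison)
V_i, V_f = 1.0, 2.0        # initial and final volumes, m^3

# Statistical route: S = k*N*log(V/v); the arbitrary cell volume v cancels in the difference
v = 1e-30                  # arbitrary tiny cell volume, m^3
dS_statistical = k * N * math.log(V_f / v) - k * N * math.log(V_i / v)

# Thermodynamic route: reversible isothermal expansion of an ideal gas,
# heat absorbed Q = N*k*T*log(V_f/V_i), so Delta S = Q / T
Q = N * k * T * math.log(V_f / V_i)
dS_thermodynamic = Q / T

print(dS_statistical, dS_thermodynamic)    # both about 5.76 J/K
assert math.isclose(dS_statistical, dS_thermodynamic, rel_tol=1e-9)
```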

1

Great question about the relationship between thermodynamic entropy and information entropy. I think the most important thing to state is that this is an open scientific question. Jaynes (1957) wrote a paper that is often cited as resolving the two, but many disagree. Read the paper and see for yourself.

One key question is the role of the "alphabet" in information theory. Information entropy is based upon the number of letters in the alphabet; but this makes entropy relative to the lexicon of the observer. This would imply that thermodynamic entropy is relative, which many vehemently disagree with. Thermodynamic entropy is, however, based on a frame of reference (the system being investigated). Further, it is based on distinguishable differences; if two indistinguishable gases are combined, entropy does not change. The number of distinguishable elements might be used as the "alphabet" in thermodynamics; but this still seems to require an observer.
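For reference, a small Python sketch of Shannon's formula $H = -\sum_i p_i \log p_i$ (the probability values are made up): for a uniform distribution over $N$ equally likely alternatives it reduces to $\log N$, which is exactly the form of Boltzmann's $S = k\log N$ up to the constant $k$.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum(p * ln p), skipping zero-probability terms."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over N equally likely alternatives: H = ln(N)
N = 8
uniform = [1.0 / N] * N
print(shannon_entropy(uniform), math.log(N))    # both ln(8), about 2.079

# A non-uniform (made-up) distribution carries less entropy
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print(shannon_entropy(skewed))                  # less than ln(5)
```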

One issue is that entropy can be used in several different ways. Entropy is sometimes used to describe a system state (how much entropy is in the system, as in the Shannon, Boltzmann, and Gibbs equations), a change in state (Clausius' equation), or even a force (see the work of Adrian Bejan at Duke). Schrödinger famously described the idea of "negative entropy" as the objective of all living things in his essay "What Is Life?". That is, the intelligence of life, the information processing that living things must do, must reduce local entropy. This points to the importance of links between thermodynamics and information processing.

One area where thermodynamics and information converge is in artificial intelligence, and specifically, in the idea of the Boltzmann Machine. Using simulated annealing, Boltzmann Machines (from Ackley, Hinton, and Sejnowski, precursors to today's "Deep Learning" algorithms) "heat up" their internal weights and then "cool". This cooling process minimizes internal energy. Paul Smolensky's "Harmonium" described this same process, but described the objective as "maximizing harmony". When Hinton and Smolensky collaborated, they compromised and called the metric "Goodness of Fit."
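As an illustration of the annealing idea mentioned above (the toy energy function and cooling schedule below are my own, not Ackley et al.'s network), here is a short Python sketch that Metropolis-samples two binary units under the Boltzmann distribution $p \propto e^{-E/T}$ while the temperature is lowered, so low-energy states dominate by the end of the run.

```python
import math
import random

random.seed(0)

# Toy energy for two binary units; the coupling favors the state s1 = s2 = 1
def energy(s1, s2):
    return -2.0 * s1 * s2 + 0.5 * (s1 + s2)

s = [0, 0]
for T in [2.0, 1.0, 0.5, 0.2, 0.05]:            # assumed (arbitrary) cooling schedule
    for _ in range(1000):
        i = random.randrange(2)                 # propose flipping one unit
        proposal = s.copy()
        proposal[i] = 1 - proposal[i]
        dE = energy(*proposal) - energy(*s)
        # Metropolis rule: always accept downhill moves, sometimes accept uphill ones
        if dE <= 0 or random.random() < math.exp(-dE / T):
            s = proposal

print(s, energy(*s))    # typically ends in the low-energy state [1, 1] with E = -1.0
```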

How does harmony relate to entropy? In music analysis, "harmonic entropy" is used to quantify dissonance. This is still informational. However, there is a strong relationship between the coherence/decoherence of harmonic oscillators (e.g., any particle) and thermodynamic entropy.

Tegmark, M., & Shapiro, H. S. (1994). Decoherence produces coherent states: An explicit proof for harmonic chains. Physical Review E, 50(4), 2538.

Jaynes, E. T. (1957). Information theory and statistical mechanics. Physical Review, 106(4), 620.

Rumelhart, D. E., Smolensky, P., McClelland, J. L., & Hinton, G. E. (1986). Sequential thought processes in PDP models. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 2, 3-57.

Ackley, D. H., Hinton, G. E., & Sejnowski, T. J. (1985). A learning algorithm for Boltzmann machines. Cognitive Science, 9(1), 147-169.

0

Hey, you have the thermodynamic definition, which is:

$$\Delta S = \int \frac{dQ}{T}$$

and the statistical definition:

$S=k\log N$

where $N$ is the number of possible states of the system (including degeneracy).

For almost everything they are equivalent!

Note that you cannot measure entropy itself, but you can measure $\Delta S$, the change in the entropy of the system.

0x90
  • 3,316