2

I am having trouble understanding the following concepts:

Page 231, Appendix B of the book Chaos and the Evolving Universe by Sally J. Goerner (link: http://books.google.ca/books?id=lEu7CTGjdDkC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q=entropy&f=false) states that the entropy $S$ is $$ S = \ln V $$ where $V$ is the phase space volume. According to the book, this equation comes from the concept of Boltzmann's entropy.

(Q1) Where does this equation come from? How can we say that entropy equals the log of the phase space volume? References and an explanation would be appreciated. According to the book, Boltzmann's constant $k_B$ is taken to be $1$, but the constant actually has a definite value. Can I take the Boltzmann constant to be equal to $1$?

Also, if the entropy increases, does this mean that the phase space volume decreases?

(Q2) Secondly, can the Kolmogorov entropy from information theory be stated as the logarithm of the phase space volume, i.e., is it equivalent to the entropy from statistical mechanics? I am unsure whether I can replace the Boltzmann entropy with the Kolmogorov–Sinai (KS) entropy.

(Q3) What is the difference between the Gibbs entropy and the Shannon entropy, given that the formula (http://en.wikipedia.org/wiki/Entropy_%28statistical_views%29) is the same?

Srishti M
  • Regarding your fourth question, I would refer here. Although the form of both equations defining entropy seems similar, the Shannon entropy has its motivation in information theory. Although contextually different, these two entropies describe physically similar situations; the Boltzmann factor, however, comes from the logarithmic base conversion. – user35952 Jan 21 '14 at 09:59
  • I am sorry, that should have said: regarding your third question! – user35952 Jan 22 '14 at 05:43
  • Thank you for your comments. Is it possible to provide a reference which explains in a simple manner that statistical entropy can be used in information theory/communications and that the two are linked? – Srishti M Jan 22 '14 at 18:05
  • Historically, entropy showed up in statistical mechanics first. As I have already pointed out in this, the Shannon entropy (which came later) is associated with information theory and can be used to derive the statistical (Boltzmann) entropy. Statistical Physics by Kerson Huang has a small section on this. – user35952 Jan 23 '14 at 03:28
  • Related: https://physics.stackexchange.com/a/739917/247642 – Roger V. Dec 07 '22 at 12:52

3 Answers

2

Since you sound like a self-learner, I'll recommend Chris Cramer's free MOOC Statistical Molecular Thermodynamics on Coursera. He's a great lecturer and will give you a very clear-cut explanation of the derivation in your first question within the first 3-4 lectures. Depending on the units one is using, it is common to choose them so that Boltzmann's constant equals one, to make the mathematics easier.
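
To make the unit convention concrete (this note is mine, not from the course): setting $k_B = 1$ simply means measuring temperature in units of energy, so that entropy becomes a pure number,

$$ S = k_B \ln W \;\;\xrightarrow{\;k_B\,=\,1\;}\;\; S = \ln W, \qquad \frac{1}{T} = \frac{\partial S}{\partial E} \quad (\text{$T$ now carries units of energy}). $$

The conventional value $k_B \approx 1.38\times 10^{-23}\,\mathrm{J/K}$ can always be restored at the end by multiplying the dimensionless entropy by $k_B$ (and dividing temperatures expressed in energy units by $k_B$ to get kelvins).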

Your remaining questions can be most easily answered by referencing Arieh Ben-Naim's text A Farewell to Entropy, which, building on E. T. Jaynes' work in statistical mechanics (there is a great foundational paper he wrote in the mid-20th century), creates an elegant tie between entropy in statistical mechanics and that of Shannon. You're not very likely to find this type of link in much of the literature, given its recent and growing acceptance; most statistical mechanics texts and journal articles will generally say the two concepts are completely unlinked aside from their names and the forms of their equations.

2

Q1: The important thing to know is that there are several distinct concepts of entropy in statistical physics and mathematics, and there is no single "the entropy" (life is hard). Only in thermodynamics does the word entropy have a clear meaning by itself, because there it is the Clausius entropy. To answer your question, in short: it can be shown that for macroscopic systems (large numbers of particles) undergoing quasi-static processes while isolated, the quantity $\ln V$ behaves as the Clausius entropy does in thermodynamics, that is, it does not change as the process proceeds. So it is sometimes called entropy too (I do not think "Boltzmann entropy" is a good name for it, since it is not clear whether Boltzmann thought of this as "the entropy"; it is said he never wrote this formula in his papers, and it was first written down by Max Planck). It would be better to call it, say, the phase-volume entropy :-).

The volume $V$ could be that of the region of phase space accessible to the system, or that of the region corresponding to energies lower than the system's energy. For ordinary macroscopic systems, these give the same value of entropy.
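
As a concrete illustration of that claim (a standard ideal-gas computation, added here for clarity and not part of the original answer): for $N$ classical free particles in a box of volume $V_{\mathrm{box}}$, the phase-space volume enclosed by the energy surface $E$ scales as

$$ V \;\propto\; V_{\mathrm{box}}^{\,N}\, E^{3N/2}, \qquad \ln V \;\approx\; N \ln V_{\mathrm{box}} + \tfrac{3N}{2}\ln E + \text{const}. $$

Identifying $S = \ln V$ and using the thermodynamic relations $\partial S/\partial E = 1/T$ and $\partial S/\partial V_{\mathrm{box}} = p/T$ (with $k_B = 1$) gives $E = \tfrac{3}{2}NT$ and $pV_{\mathrm{box}} = NT$, exactly the ideal-gas results, which is why $\ln V$ can play the role of the Clausius entropy.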

Q2: I do not know, but there seems to be a connection.

Q3: They are largely the same; the difference is that the Gibbs formula with probabilities is meant for states of a macroscopic physical system. For such a system in thermodynamic equilibrium, the Gibbs formula with the Boltzmann exponential probabilities gives a value that is practically equal to the value of the phase-volume entropy (for the phase space region consistent with the macroscopic variables of the equilibrium state).
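
To make that link explicit (a standard textbook step, spelled out here for convenience): inserting the canonical probabilities $p_i = e^{-E_i/T}/Z$ into the Gibbs formula gives

$$ S_G \;=\; -\sum_i p_i \ln p_i \;=\; \sum_i p_i\!\left(\frac{E_i}{T} + \ln Z\right) \;=\; \frac{\langle E\rangle}{T} + \ln Z \;=\; \frac{\langle E\rangle - F}{T}, $$

i.e. the familiar thermodynamic relation $F = \langle E\rangle - TS$ (with $k_B = 1$); for a macroscopic system this value practically coincides with the phase-volume entropy above.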

The Shannon expression describes something very different: a degree of uncertainty about the actual value of some variable (say, one character of a message). There is a connection, however: the maximum possible value of the Shannon expression, given a fixed average energy, is almost equal to the phase-volume entropy for the phase space region that would be assigned if the system were isolated and in a state with the same values of the macroscopic quantities (energy, volume, ...). This is the basis of the information-theoretic approach to statistical physics (see the works of Edwin T. Jaynes on statistical physics, http://bayes.wustl.edu/etj/node1.html).
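
A sketch of that maximization (the standard Lagrange-multiplier argument, added for completeness): maximizing $-\sum_i p_i \ln p_i$ subject to $\sum_i p_i = 1$ and $\sum_i p_i E_i = \langle E\rangle$ requires

$$ \frac{\partial}{\partial p_i}\!\left[-\sum_j p_j \ln p_j - \alpha\Big(\sum_j p_j - 1\Big) - \beta\Big(\sum_j p_j E_j - \langle E\rangle\Big)\right] = 0 \;\;\Longrightarrow\;\; p_i = \frac{e^{-\beta E_i}}{Z}, $$

so the distribution of maximum uncertainty is the canonical one. Identifying $\beta$ with $1/T$, the maximized Shannon expression reproduces the Gibbs value written out above, and hence (for macroscopic systems) the phase-volume entropy.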

  • In answer to Q1: I would appreciate a reference where it is specifically mentioned that $V$ is the phase space volume, i.e., that entropy is expressed through the phase space volume. What then is the Boltzmann entropy? – Srishti M Jan 22 '14 at 02:37
  • That is, how does the number of microstates become the phase space volume? – Srishti M Jan 22 '14 at 03:00
  • 1
    Take a look at the book Shang-Keng Ma, Statistical Mechanics, World Scientific, 1985, chapter 5, for a discussion of the Boltzmann phase-volume entropy. A region of phase space can be divided into small cells; the number of these cells is sometimes called "the number of microstates". See Landau & Lifshitz, Statistical Physics, Part 1 (3rd ed., Pergamon, 1980), sec. 7 on entropy. – Ján Lalinský Jan 22 '14 at 03:46
  • I've edited the first part of the answer to clarify. – Ján Lalinský Jan 22 '14 at 03:51
  • The thing is, I have used entropy as the phase space volume in communications, in the sense that entropy $S$ = Kolmogorov entropy = log of phase space volume. Is this relation correct? I need to justify it and I am unable to find such a link. Could you help? Thank you. – Srishti M Jan 22 '14 at 04:22
  • Thanks for the links, very valuable. I have a last question: if the volume is very small, then the entropy is negative. So, should entropy be a positive quantity? Should the value of the volume be a positive quantity? – Srishti M Jan 22 '14 at 19:14
  • @SrishtiM : I am at a loss to understand you here; how can the volume possibly be negative? Secondly, the smallest volume element would be some hypercube (depending on the dimensions). Thirdly, the entropy can never be negative, since if you look at the definition here, the probability is always such that $p_i \le 1 \;\; \forall\, i = 1,\dots,n$. – user35952 Jan 23 '14 at 03:36
  • Well, say the numeric value of the volume is 0.0001; then its logarithm, which is the entropy, will come out negative. That's why I asked that question. – Srishti M Jan 23 '14 at 17:55
  • In classical physics, entropy is defined only up to an additive constant; only differences of entropy matter. In quantum theory, entropy is defined as the log of a dimensionless number of states ($\geq 1$), so it always comes out non-negative (a short illustration follows this comment thread). – Ján Lalinský Jan 23 '14 at 18:05
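
A brief numerical illustration of the additive-constant point (my own numbers, not from the discussion above): the classical phase-space volume carries units, so the value of $\ln V$ depends on the unit choice, while entropy differences do not,

$$ S_2 - S_1 = \ln\frac{V_2}{V_1}, \qquad \ln\frac{0.0002}{0.0001} = \ln 2 \approx 0.69 \quad \text{whatever units are used for } V. $$

Dividing the classical phase volume by $h^{3N}$ (and by $N!$ for identical particles) turns the argument of the logarithm into a dimensionless count of states that is at least $1$, which is the quantum convention mentioned in the last comment and guarantees a non-negative entropy.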
1

Thought I would throw a bit of history and philosophy of science in here for your amusement, starring none other than Von Neumann and Shannon...

Shannon replied: My greatest concern was what to call it. I thought of calling it `information', but the word was overly used, so I decided to call it `uncertainty' . . . John von Neumann, he had a better idea. Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage."

(McIrvine and Tribus, 1971; see also Tribus, 1988)

Bruce Long