3

From what I gather (and please correct me if I'm wrong), Jaynes argues that thermodynamic and information entropy are the same because the assumption made in statistical thermodynamics, namely that the energy distribution actually attained is the one that maximizes the number of ways the energy can be distributed, is equivalent to assuming the maximum-ignorance distribution, which strikes me as a subjective concept, or at least one belonging to information theory.
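
For concreteness, the step I have in mind is (as far as I can tell) the standard maximum-entropy calculation: maximizing the Shannon entropy subject only to normalization and a fixed mean energy,
$$
\max_{\{p_i\}}\Big(-\sum_i p_i\ln p_i\Big)\quad\text{subject to}\quad\sum_i p_i=1,\qquad\sum_i p_i E_i=\langle E\rangle,
$$
which, via Lagrange multipliers, gives the familiar Boltzmann distribution
$$
p_i=\frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}},
$$
i.e. the same distribution that statistical thermodynamics arrives at by counting the ways the energy can be distributed.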

However, isn't this similarity just a coincidence? There are physical reasons (the second law) why energy maximizes the number of ways it can be distributed, and those reasons have nothing to do with the fact that the distribution it attains also happens to maximize our inability to describe the energy (i.e., maximizes our ignorance).

(Here I am talking about classical physics. I realize the uncertainty principle may unite information and thermodynamics in quantum mechanics.)

Update: The discussion in the link below includes the claim that information entropy and thermodynamic entropy are independent special cases of a more general concept, but are not equivalent to each other.

Is information entropy the same as thermodynamic entropy?

Qmechanic
  • 201,751
SuchDoge
  • 427
  • Related: https://physics.stackexchange.com/q/398883/109928 – Stéphane Rollandin Apr 10 '18 at 19:02
  • We often use anthropomorphic terminology, e.g. "ignorance", when discussing mathematical or physical concepts. But that does not mean that these concepts are subjective. It simply means that we are appealing to our intuition in order to understand an abstract concept. – Lee Mosher Apr 11 '18 at 20:17
  • What paper and section do you refer to? I do not think Jaynes meant, generally, that thermodynamic entropy is the same as information entropy. I think he meant that thermodynamic entropy can be calculated/understood as the information entropy of a probability distribution, if that distribution is assigned based on the macroscopic state of the system. – Ján Lalinský Apr 11 '18 at 20:47
  • @Jan Lalinsky, in his paper "Information Theory and Statistical Mechanics" (The Physical Review, Vol. 106, No. 4, 620-630), he states in the introduction: "The mere fact that $-\sum p_i \log p_i$ occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept. In this paper we suggest a reinterpretation of statistical mechanics which accomplishes this". – SuchDoge Apr 13 '18 at 19:10
  • He also says in that same article "...however, we can take entropy as our starting concept, and the fact that a probability distribution maximizes the entropy subject to certain constraints becomes the essential fact which justifies use of that distribution for inference". This is what I am arguing is not justified. Just because two ideas lead to the same result doesn't mean the ideas are the same thing. – SuchDoge Apr 13 '18 at 19:17
  • @SuchDoge, that is an unclear statement that can probably be read in more than one way, but further reading of Jaynes' later works makes it clear he was aware of the difference between the concept of thermodynamic entropy (a function of the macrostate) and the concept of information entropy (a functional of a probability distribution). Your question is very unclear - why do you think his viewpoint has to be justified just by that single argument? He provided a different, useful viewpoint on statistical physics - the justification is in the thinking aid and the results it gives. – Ján Lalinský Apr 13 '18 at 22:07
  • @JánLalinský I don't know how to interpret your question "why do you think his viewpoint has to be justified just by that single argument?" Can you please explain what you are referring to as "that single argument"? Also, I think there are people out there who conflate information and thermodynamic entropy, thus adding to the already monumental confusion surrounding thermodynamic entropy, e.g., entropy is disorder, entropy is information the universe lost track of, etc. Both statements are, I think, imprecise and thus likely wrong. Some of these people refer to Jaynes for support. – SuchDoge Apr 16 '18 at 19:32

2 Answers

1

We needn't go to QM for the moment. If you take a lecture course in classical thermodynamics, the second law is derived from informational considerations, although they typically discuss it in terms of maximising the number (typically denoted $W$ or $\Omega$) of microstates per macrostate. One can then show that the second law's entropy-increasing formulation is equivalent to heat moving from warmer sources to cooler ones, but again, the informational analysis is fundamental (it is even needed to define what temperature is).
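
A minimal numerical sketch of that counting, assuming a toy Einstein-solid model with made-up sizes (nothing here is specific to any particular course): two solids share a fixed number of energy quanta, the split that maximizes the total multiplicity $\Omega$ is the overwhelmingly most probable one, and at that split the quantity $\partial\ln\Omega/\partial E$, which is precisely how $1/k_BT$ is defined, comes out (nearly) equal on both sides:

```python
from math import comb, log

def multiplicity(q, N):
    """Ways to distribute q energy quanta among N oscillators
    (Einstein solid): Omega = C(q + N - 1, q)."""
    return comb(q + N - 1, q)

# Toy sizes, chosen only for illustration
N_A, N_B = 300, 200   # oscillators in each solid
q_total = 100         # total energy quanta shared between the two solids

# Total multiplicity for every possible split of the energy
splits = [(qA, multiplicity(qA, N_A) * multiplicity(q_total - qA, N_B))
          for qA in range(q_total + 1)]
qA_best, _ = max(splits, key=lambda s: s[1])

def dlnOmega_dq(q, N):
    """Finite-difference d(ln Omega)/dq, which plays the role of 1/kT."""
    return log(multiplicity(q + 1, N)) - log(multiplicity(q, N))

print("most probable split: qA =", qA_best, "out of", q_total)
print("1/T_A ~", round(dlnOmega_dq(qA_best, N_A), 3),
      " 1/T_B ~", round(dlnOmega_dq(q_total - qA_best, N_B), 3))
```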

J.G.
  • 24,837
  • You've confirmed how it is taught by saying "they typically discuss it in terms of maximising the number (typically denoted W or Ω) of microstates per macrostate", but I don't see your argument for how this is fundamentally an information phenomenon rather than a phenomenon that results from the fact that energy is far more likely to end up in the distribution that maximizes the number of arrangements than in any specific, non-maximal distribution. – SuchDoge Apr 10 '18 at 18:43
0

The similarity pointed out by Jaynes between thermodynamic entropy and information entropy could be coincidental, but its appeal is too powerful to resist. Just think about it:

  • Information entropy: Someone tells us they have a coin in their pocket and asks us to guess the probability of Heads and Tails. They won't let us see the coin, let alone experiment with it. Knowing nothing else and forced to give an answer, most people would say that the probability is 50-50. It turns out that this is the probability of heads and tails that maximizes the Shannon entropy among all possible distributions we could assign to the coin.
  • Thermodynamic entropy: A system at constant energy, volume and number of particles exists in an enormous number of possible microstates. We want to guess the probability distribution over those microstates, but we cannot see them, nor can we play with them to find out what their relative probabilities are. Knowing nothing else and forced to give an answer, we say that all microstates are equally probable. Again it turns out that this is the distribution that maximizes the Shannon entropy among all possible distributions we can assign to the microstates (a quick numerical check is sketched after this list).
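
Both bullet points are easy to check numerically. Here is a minimal brute-force sketch (the number of microstates `n` below is a toy value chosen purely for illustration):

```python
import random
from math import log

def shannon_entropy(probs):
    """H = -sum p_i ln p_i (terms with p_i = 0 contribute nothing)."""
    return -sum(p * log(p) for p in probs if p > 0)

# Coin: scan P(heads) on a grid; the entropy-maximizing value is 1/2
grid = [i / 1000 for i in range(1, 1000)]
p_best = max(grid, key=lambda p: shannon_entropy([p, 1 - p]))
print("P(heads) that maximizes H:", p_best)      # 0.5

# Microstates: the uniform distribution beats any randomly chosen one
n = 50                                           # toy number of microstates
H_uniform = shannon_entropy([1 / n] * n)
for _ in range(5):
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    trial = [w / total for w in weights]         # a random normalized distribution
    assert shannon_entropy(trial) <= H_uniform
print("H(uniform) =", round(H_uniform, 4), " ln(n) =", round(log(n), 4))
```

The second check only samples a handful of distributions, of course; the general statement is the usual inequality $-\sum_i p_i\ln p_i \le \ln n$, with equality only for the uniform distribution.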

The argument does not "prove" anything, but it is hard to ignore. To Jaynes it was so profound that the first version of his paper on this topic was titled How Does the Brain Do Plausible Reasoning?. The paper was rejected (the review and Jaynes's response can be found at the end of the manuscript in the previous link), but its subsequent reincarnation as Information Theory and Statistical Mechanics has some 15,000 citations as of today.

Themis
  • 5,843