
I am currently taking a class on Decision Theory, where I was introduced to the notion of "entropy" in an information-theoretic setting while we were studying decision trees. I was given the following example:

Imagine that you have three sacks: one with four red balls, one with three red and one blue ball, and one with two red and two blue balls. In information-theoretic terms, the first sack has minimum entropy because all the balls are alike, hence maximum information; the second one has higher entropy than the first sack, while the last one has maximum entropy because you are completely unsure about the information.
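Putting numbers to this (a quick sketch of the arithmetic, assuming a ball is drawn uniformly at random and using the Shannon formula $H = -\sum_i p_i \log_2 p_i$):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Color distributions (red, blue) for the three sacks in the example
sacks = {
    "4 red, 0 blue": [4/4, 0/4],
    "3 red, 1 blue": [3/4, 1/4],
    "2 red, 2 blue": [2/4, 2/4],
}

for label, probs in sacks.items():
    print(f"{label}: H = {shannon_entropy(probs):.3f} bits")

# Output:
# 4 red, 0 blue: H = 0.000 bits
# 3 red, 1 blue: H = 0.811 bits
# 2 red, 2 blue: H = 1.000 bits
```

So the uniform two-red, two-blue sack indeed attains the maximum of one bit per draw.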

To me, this goes against the intuition of entropy from a thermodynamics point of view. The second law of thermodynamics asserts that entropy is increasing, and this will result in the Heat Death of the Universe when all the molecules will have the same temperature and we will not be able to extract any new energy from them. Comparing this to the sack example, it seems that all the molecules at that point in the Universe will have the same temperature, so they have the lowest information entropy. However, according to the second law of thermodynamics, the entropy will be at its maximum during this stage.

I have always understood thermodynamic entropy as a way to measure how "similar" different microstates are: the more similar, the higher the entropy. But the information-theoretic definition seems to be the exact opposite. I would love any insight on this apparent contradiction.

  • You may find useful the answers to this previous question: https://physics.stackexchange.com/questions/375904/information-entropy-and-physics-correlation?rq=1 – GiorgioP-DoomsdayClockIsAt-90 Mar 06 '19 at 22:50
  • Atoms do not have a temperature. – my2cts Mar 06 '19 at 22:52
  • "this will result in the Heat Death of the Universe when all the molecules will have the same temperature and we will not be able to extract any new energy from them." Where did you read this? It is a simplistic extrapolation of kinetic theory of gases to large scales and distant times that is very hard to justify. On the scale of galaxies, gravity makes equal temperature very unlikely, such a state would have low entropy.

    – Ján Lalinský Mar 07 '19 at 01:17
  • I think that you should look at the statistical definition of thermodynamic entropy to resolve your confusion: https://en.wikipedia.org/wiki/Entropy#Statistical_mechanics – anna v Mar 07 '19 at 04:47

1 Answer


the last one has maximum entropy because you are completely unsure about the information.

This is a misunderstanding. The information "content" is quantified by the information entropy: if we know this entropy, we know the information "content". What we are unsure of is the color of a ball randomly drawn from the sack.

will have the same temperature, so they have the lowest information entropy. However, according to the second law of thermodynamics, the entropy will be at its maximum during this stage.

No, molecules having the same temperature does not mean the information entropy assigned to them is minimal. Temperature is not a property of a molecule; it does not play the same role as color does in the balls-in-a-sack example. Temperature is a macroscopic property of a piece of matter, or a parameter of the statistical distribution used to describe that piece of matter.
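For example, for an ideal gas the temperature $T$ enters only as a parameter of the Maxwell–Boltzmann speed distribution describing the whole collection of molecules, never as an attribute of any single molecule:

$$ f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2 \exp\!\left(-\frac{m v^2}{2 k_B T}\right). $$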

Molecules in a volume of gas where we know only the temperature, volume, and number of particles have unknown positions and momenta (or other microscopic states). This lack of knowledge is what makes the information entropy for given $T,V,N$ large. In physics, the maximum possible value of the information entropy consistent with $T,V,N$ gives the thermodynamic entropy of the system.
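To sketch this standard connection (stated here without derivation): with $p_i$ the probability of microstate $i$, the information entropy expressed in thermodynamic units is

$$ S = -k_B \sum_i p_i \ln p_i , $$

and maximizing it subject to $\sum_i p_i = 1$ and a fixed mean energy $\sum_i p_i E_i = \langle E \rangle$ yields the Boltzmann distribution $p_i \propto e^{-E_i/k_B T}$. The maximized value of $S$ is the thermodynamic entropy for the given macroscopic constraints.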