Lack of information is entropy. But entropy has a dimension, while information has none. How, then, do we relate these two quantities?
-
The Boltzmann constant $k_B$ in "natural units" is just 1. – Quillo Oct 06 '22 at 16:55
-
@Quillo Is information a mathematical or a physical quantity? – quanity Oct 06 '22 at 17:10
-
You can always normalize physical quantities (usually with meaningful quantities of the problem you're investigating) to get non-dimensional equations. And especially when you find logs and exponentials in physics, you should always remember that their arguments are numbers without physical units, to avoid questions like "what's the log of a meter, or the exponential of a kelvin?" – basics Oct 06 '22 at 18:53
-
Entropy has absolutely nothing to do with "information". Entropy is temperature dependent. Information is not. Shannon really messed up a lot of minds, I am afraid. – FlatterMann Oct 06 '22 at 23:01
-
@FlatterMann Sorry to be that blunt, but please stop passing off your opinions as facts. If you don't like the information theoretic approach, fine. But what you say is nonsense. – Tobias Fünke Oct 07 '22 at 06:58
-
@JasonFunderberker You can find the physical definition of entropy in every thermodynamics textbook: $dS = \delta Q_{\mathrm{rev}}/T$ or $dS \geq \delta Q/T$. You are welcome to accuse the authors of publishing only their "opinion", if you like. – FlatterMann Oct 07 '22 at 09:19
-
@FlatterMann I did not mean to say that the definition of entropy in thermodynamics is 'wrong'. I meant that the definition of entropy in terms of information theory makes sense, i.e. statistical mechanics (and thus thermodynamics) can be built on information theoretic principles, and there are a lot of books, articles etc. doing so (that doesn't mean there is no room for criticism, though). If you don't understand this approach, please don't say that "Entropy has absolutely nothing to do with information". – Tobias Fünke Oct 07 '22 at 09:23
-
@JasonFunderberker Statistical mechanics still has temperature. Absolutely everything in physics either has temperature (third law of thermodynamics) or is a system far from thermodynamic equilibrium (i.e. it has strong directional energy flow and then statistical mechanics doesn't apply). This is the physics stack exchange. If you want to talk about information and coding, they will be entertaining your thoughts in the computer science stack exchange just fine and there you won't have to think about temperature either. Over here, however, you will have to. – FlatterMann Oct 07 '22 at 09:31
-
@FlatterMann Sorry, it seems that you have absolutely no idea of the information theoretic approach to statistical mechanics, and I guess you've never looked into a single textbook/article dealing with this. I thus welcome you to write to the authors and publishers of all the books on this subject that they're completely wrong :). PS: Temperature has the very same definition in an information theoretic approach as in 'classical' statistical mechanics. I don't know what you want with temperature here. – Tobias Fünke Oct 07 '22 at 09:33
-
@JasonFunderberker Like I said, you are welcome to argue with the authors of thermodynamics textbooks. Set up a play date with pillow fight, if you like. "You are wrong" is not an argument and honestly, this is growing tedious and is not supposed to be in the comment section to begin with. Have a nice day. – FlatterMann Oct 07 '22 at 09:36
-
@quanity, to understand my comment on $k_B=1$, please take a look at this detailed answer: https://physics.stackexchange.com/a/231065/226902. In short: it is possible to regard the physical dimension of $k_B$ (and hence of the entropy) as a "historical accident". – Quillo Oct 07 '22 at 11:11
-
@FlatterMann see, e.g., these questions: https://physics.stackexchange.com/a/269504/226902 https://physics.stackexchange.com/q/263197/226902 (both closely related to this question). Moreover, Maxwell’s demon thought experiment showed how entropy is related to the lack of knowledge on the system. Szilárd’s engine (1929) showed how information on a system can be put to advantage to apparently “violate” the 2nd law... some links between thermodynamic entropy and "information" pre-date Shannon. – Quillo Oct 07 '22 at 14:01
-
@Quillo Maxwell's demon has a temperature. Seriously... the third law of thermodynamics doesn't go away. You can't magically remove it from physics. – FlatterMann Oct 07 '22 at 14:09
-
Maybe a quote from Jaynes might be in place here: "By far the most abused word in science is 'entropy'. Confusion over the different meanings of this word, already serious 35 years ago, reached disaster proportions with the 1948 advent of Shannon's information theory, which not only appropriated the same word for a new set of meanings; but even worse, proved to be highly relevant to statistical mechanics." (E. T. Jaynes, Ann. Rev. Phys. Chem. 31, 579 (1980)) – John Oct 07 '22 at 20:03
-
@John thank you, I liked the quote... but what should we learn from it? Maybe not the original Shannon entropy, but the "information" (or "inference", as advocated by Jaynes himself) interpretation adds meaning to what was a purely phenomenological quantity. That's all, as far as I understand. What's your understanding? – Quillo Oct 08 '22 at 01:53
-
See also: https://physics.stackexchange.com/questions/131170/what-is-entropy-really – hft Oct 12 '22 at 20:40
-
See also: https://physics.stackexchange.com/questions/263197/is-information-entropy-the-same-as-thermodynamic-entropy – hft Oct 12 '22 at 20:41
-
See also: https://physics.stackexchange.com/questions/606722/how-do-different-definitions-of-entropy-connect-with-each-other – hft Oct 12 '22 at 22:14
-
@FlatterMann The original Shannon entropy invented for communications has temperature in it, albeit indirectly. The temperature is buried in the conditional distribution $Pr[\mathbf Y|\mathbf X]$ upon which the channel capacity (mutual information and receiver entropy) depends. Here the $X_k$ and $Y_j$ are the input and output symbols, and of course $Pr[Y_j|X_k]$ depends on the receiver noise because $Y_j$ depends on the noise, hence its temperature. The source entropy, that is the entropy of the $X_k$, is independent of thermal noise. – hyportnex Oct 12 '22 at 23:17
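(A possible concrete illustration of that point, not taken from the comment itself: the hedged sketch below uses the Shannon–Hartley capacity $C = B\log_2(1+S/N)$ with Johnson–Nyquist thermal noise $N = k_B T B$; the numbers and function name are my own choices.)

```python
# Hedged sketch (my own example, not from the thread): one concrete way
# temperature enters Shannon's framework is through thermal noise in the
# channel, N = k_B * T * B, which sets the capacity C = B * log2(1 + S/N).
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def awgn_capacity_bits_per_s(signal_power_W, temperature_K, bandwidth_Hz):
    """Shannon-Hartley capacity of an AWGN channel with thermal noise."""
    noise_power_W = k_B * temperature_K * bandwidth_Hz
    return bandwidth_Hz * math.log2(1.0 + signal_power_W / noise_power_W)

# Same transmitter, hotter receiver -> noisier Pr[Y|X] -> lower capacity.
print(awgn_capacity_bits_per_s(1e-12, 100.0, 1e6))  # ~9.5e6 bit/s (cold receiver)
print(awgn_capacity_bits_per_s(1e-12, 600.0, 1e6))  # ~6.9e6 bit/s (hot receiver)
```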
-
@hyportnex That's interesting. I need to read the original article, then. It is not about "information", then, but about digital signal transmission in a noisy environment. That does, of course, have temperature and that concept is usually handled correctly in textbooks about digital information transmission. The way most people talk about "information", though, it is not a thermodynamic quantity. – FlatterMann Oct 13 '22 at 21:10
-
@FlatterMann if I may suggest so then read Viterbi & Omura: Principles of Digital Communications and Coding, 3.1 and 3.2, pp 128-143; (yes, that Viterbi!) and if something is not clear just read some of the prior sections. see here https://archive.org/details/ost-engineering-principledigcom00viterich/page/n61/mode/2up – hyportnex Oct 13 '22 at 21:24
-
@hyportnex I own several books on digital signal transmission and I know all those things from practical applications. That is why I am saying that "information" by itself has absolutely nothing to do with entropy. Entropy is a state function of physical systems. Information is just a complicated form of counting beans. Nature simply can't count beans without some uncertainty due to a finite temperature. – FlatterMann Oct 13 '22 at 21:27
3 Answers
Information has a pseudo-unit. Just as angles can be measured in "radians," "cycles," or "degrees," entropy can be measured in "bits," "digits," or "nats." The SI units for thermodynamic entropy are $\mathrm{J\,K^{-1}}$, but if you look at the Boltzmann or Gibbs formulas for entropy, those units are just bolted on by the Boltzmann constant out front. That may be intellectually dissatisfying to theorists who like to work with "natural units," but there's a good reason for it.
The universal gas constant in $PV=nRT$ is approximately $R = 8\,\mathrm{J\,mol^{-1}\,K^{-1}}$, a nice, order-one value. If we look at the Boltzmann-constant version, though, we get $PV=NkT$, giving $k$ a value like $10/(\text{Avogadro's number})$ (in fact, $R=kN_A$). In other words, because of the way combinatorics works, real thermodynamic entropies would be stupidly large numbers without a pseudo-unit to bring them down to size, even inside a logarithm. You can therefore think of $\mathrm{J\,K^{-1}}$ as measuring entropy in roughly tenths of Avogadro's number of nats.
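As a sanity check on that last claim, here is a minimal numerical sketch (the SI constants are standard values; the variable names are my own):

```python
# Minimal sketch: express 1 J/K of entropy in nats and bits, and compare
# the result to Avogadro's number. Constants are the 2019 SI exact values.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

S_SI = 1.0                     # one joule per kelvin of entropy
S_nats = S_SI / k_B            # divide out k_B -> dimensionless nats
S_bits = S_nats / math.log(2)  # 1 bit = ln(2) nats

print(f"1 J/K = {S_nats:.3e} nats = {S_bits:.3e} bits")
print(f"       = {S_nats / N_A:.2f} Avogadro's numbers of nats")  # ~0.12
```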
Now, there are many who object that thermodynamic entropy and information entropy are not the same thing. For starters, information theory entropy is observer dependent, while thermodynamic entropy is, to the best of our ability to determine it, not. I think, honestly, that thermodynamic entropy may end up being observer dependent but only if quantum entanglement can make the number of states available to the system observer dependent.
In a little more detail, the Boltzmann formula for entropy is $$S = k\ln W$$ where $W$ is the number of microstates consistent with the macro-state of the system. This is the form of entropy you use for an isolated system. If you compare it to the Gibbs entropy, $S = -k\sum_i p_i \ln p_i$, the Boltzmann form is what you get when all states with non-zero probability are equally probable.
Information theory borrowed the formula for the Gibbs entropy, which is why "information" came to be called "entropy." There, the probabilities aren't tied to states of a physical system but to some enumerated set of symbols that could be produced by a communication channel.
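To make the parallel concrete, here is a small hedged sketch (my own illustration; the helper function and the example distributions are made up) showing that the Gibbs/Shannon formula reduces to the Boltzmann form for a uniform distribution over $W$ microstates, and that the very same sum, without the $k_B$, gives the entropy of a symbol source in bits:

```python
# Sketch: the same -sum p ln p formula serves as Gibbs entropy (with k_B)
# and as Shannon entropy (without it); for a uniform distribution over W
# states it reduces to Boltzmann's ln W.
import math

k_B = 1.380649e-23  # J/K

def entropy_nats(probs):
    """-sum_i p_i ln p_i, skipping zero-probability outcomes."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 1000
uniform = [1.0 / W] * W
print(entropy_nats(uniform), math.log(W))   # both equal ln W
print(k_B * entropy_nats(uniform))          # Gibbs/Boltzmann entropy in J/K

# The same formula applied to a (made-up) channel symbol distribution:
symbols = [0.5, 0.25, 0.125, 0.125]
print(entropy_nats(symbols) / math.log(2))  # 1.75 bits, no k_B anywhere
```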
If you say that the output of a communication channel is analogous to the outcome of an experiment measuring the state of a system, it's tempting to say that these quantities are the same thing. They're not, though. Classically, you could invoke a Maxwell's demon type character that knows the micro-state of the system/the future outputs of the channel. For this demon, both systems would have zero information entropy. In information theory that isn't a problem, since the entropy there is just a measure of the surprise of the observer of the channel's output.
For thermodynamics, though, the entropy is tied to things like temperature, chemical reaction rates, internal energy, etc. All of those things have an observer independent status to them. Furthermore, an observer who is simply ignorant of some aspect of the system's macro-state (e.g. she doesn't know the number of moles, or he doesn't know the total internal energy) cannot affect these things, even though there are a whole lot more micro-states consistent with the ignorant observer's picture. Thus the information entropy and thermodynamic entropy cannot be the same things.
Where quantum mechanics comes in is in Schrödinger's-cat-like scenarios. Suppose some thermodynamic system $\sigma$ has two macro-states available to it, labeled $1$ and $2$. Suppose, also, that these states are somehow quantum mechanically coherent and the system has a 50-50 chance of being in either. Finally, suppose that the total entropy of $\sigma$ is $S_0$. If observer $A$ measures the system then, in multi-universe-type interpretations of quantum mechanics, they enter an entangled state in which there is a 50-50 chance for $(A,\sigma)$ to jointly be in state 1 or 2. For observer $A$, $\sigma$ is now in a state with an entropy of $S_0 - k_B\ln 2$. For observer $B$, who is not entangled, the combined $(A,\sigma)$ grouping will still have entropy $S_0$.
That's one way that quantum mechanics may introduce an observer dependence to entropy that is still physically sensible. There's an awful lot of "if" in that description, though, so I wouldn't bet on it ultimately being one way or the other.

-
"I think, honestly, that thermodynamic entropy may end up being observer dependent but only if quantum entanglement can make the number of states available to the system observer dependent." Please elaborate. – quanity Oct 06 '22 at 17:20
-
Regarding your last paragraph: The work of Jaynes, especially section VI of this paper, might be of interest. See also this PSE post. – Tobias Fünke Oct 06 '22 at 17:22
-
"All of those things have an observer independent status to them" - no! I mean yes, for a given set of macrostates, the thermodynamic entropy is fixed. But the set of macrostates obviously depends on the knowledge of the observer, cf. the references I mentioned above. And different knowledge gives e.g. a different amount of useful work that can be extracted, and thus we'd ascribe different changes in entropy! – Tobias Fünke Oct 06 '22 at 17:50
-
1"But the set of macrostates obviously depends on the knowledge of the observer, cf. the references I mentioned above. And different knowledge gives e.g. different amount of useful work that can be extracted and thus we'd ascribe different changes in entropy!" I'll grant you that the ability to extract useful work from a system is observer dependent, but even Maxwell's demon, for whom the system has no information entropy, will agree on the system's internal energy, temperature (as defined by a measurement process), reaction rates, etc. – Sean E. Lake Oct 06 '22 at 17:56
-
See Wiki https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory. Information entropy and thermodynamic entropy "can" be the same thing, in the sense that information entropy provides a way to interpret the thermo one. I use "can" because it must be applied as Jaynes and others described: https://physics.stackexchange.com/a/408512/226902 – Quillo Oct 07 '22 at 18:21
Entropy has pseudo-units. In statistical mechanics we encounter the quantity $\beta = 1/(k_B T)$ with units of inverse energy, where $T$ is the absolute temperature and $k_B$ is Boltzmann's constant. Temperature alone is never encountered; it always appears in the product $k_B T$. By a fluke of history and human inertia, temperature got the arbitrary unit of kelvin, which necessitates the introduction of Boltzmann's constant $k_B = 1.38\times 10^{-23}$ J/K as a unit conversion between kelvin and joule. But we could just as well define the unit of temperature to be the joule, which would make Boltzmann's constant dimensionless and equal to 1. So, when in thermodynamics we write the Gibbs entropy, $$S = -k_B \sum_i p_i \ln p_i,$$ we are writing the same equation as Shannon, only reported in arbitrary units.
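A small sketch of that unit-conversion point (the numbers and variable names are my own choices): measuring temperature directly in joules absorbs $k_B$ entirely and leaves $\beta$ unchanged.

```python
# Sketch: temperature only ever appears through the product k_B*T, so
# quoting T in joules instead of kelvin makes k_B dimensionless and 1.
k_B = 1.380649e-23        # J/K
T_kelvin = 300.0          # room temperature in conventional units
T_joule = k_B * T_kelvin  # the same temperature expressed in joules

beta_conventional = 1.0 / (k_B * T_kelvin)  # 1/J
beta_natural = 1.0 / T_joule                # 1/J, with "k_B = 1"

print(T_joule)                          # ~4.14e-21 J
print(beta_conventional, beta_natural)  # identical inverse energies
```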

The other two answers are correct, but here's my take on it: the thermal energy of a particle is $k_B T/2$ per degree of freedom. The units on the entropy come from the $k_B$, which is simply a conversion factor between joules and kelvins, as Themis clearly states. (The 1/2 comes, I think, because in a collision you have two particles hitting each other, each with thermal energy.)
If you refine entropy down to degrees of freedom, it ought to have the same unit basis as information. The $\ln$ comes in because the number of digits in a number grows much more slowly than its magnitude (which is why you can write 1000001 with far fewer than a million and one digits). The number of binary digits of the microstate count written in base 2 (which, times $k_B \ln 2$, is essentially the entropy) is the same as the number of bits of information you would need to obtain to nail that undetermined macrostate down to one particular microstate. (The same would be true for entropy written in base 3, or I suppose base e if you can manage that. :)
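A short hedged sketch of that counting argument (the toy microstate count is my own choice): if it takes $n = \log_2 W$ bits to single out one microstate among $W$, then $S = k_B\ln W = (k_B\ln 2)\,n$, i.e. about $k_B\ln 2$ joules per kelvin per bit of missing information.

```python
# Sketch: entropy per bit of missing information is k_B * ln 2.
import math

k_B = 1.380649e-23  # J/K

W = 2**80                    # toy number of microstates
bits = math.log2(W)          # bits needed to pin down one microstate: 80
S = k_B * math.log(W)        # Boltzmann entropy in J/K
print(bits, S, k_B * math.log(2) * bits)  # the last two agree
```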
