58

Entropy, as it is explained on this site, is a Lorentz invariant. But we can also define it as a measure of the information hidden from an observer in a physical system.

In that sense, is entropy a relative quantity, depending on the computational, measurement, and storage capacity of the observer?

Qmechanic
veronika
    Entropy measures the number of microstates that correspond to some set of (not necessarily macroscopic) order parameters. If two observers agree on what the order parameters are then they must also agree on the entropy. So the question is whether the relevant order parameters (e.g. $N$, $V$, $T$ for canonical thermodynamic entropy) are invariant. – lemon Feb 27 '17 at 09:31
  • @lemon Why not necessarily macroscopic? I thought that macroscopic order was necessary – veronika Feb 27 '17 at 10:15
  • @lemon If length is not invariant, does this mean volume isn't either? Is there such a thing as relativistic entropy? –  Feb 27 '17 at 10:24
  • Related: http://physics.stackexchange.com/q/193677/50583 and its linked questions. – ACuriousMind Feb 27 '17 at 10:37
  • @veronika One can select any set of parameters that represent a simplified description of the system (e.g. average bond length). An entropy may then be assigned. – lemon Feb 27 '17 at 11:15
  • The question "Proving the Lorentz invariance of the entropy and the covariance of thermodynamics" gives a very clear treatment of the foundations of relativistic thermodynamics and the invariance of entropy (for both classical and quantum systems). – Quillo Aug 01 '23 at 21:22

2 Answers

64

E.T. Jaynes agrees with you, and luckily he is a good guy to have on your side:

From this we see that entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.

This is a quote from his short article "Gibbs vs Boltzmann Entropies" (1965), which is a great article on the concept of entropy in general; for this discussion in particular, turn to section VI, "The 'Anthropomorphic' Nature of Entropy". I will not try to paraphrase him here, because I believe he already put it there as succinctly and clearly as possible. (Note that it is only one page.)

I was trying to find another article of his, but I could not trace it at the time. [EDIT: thanks to Nathaniel for finding it.] There he gives a nice example, which I will paraphrase here:

Imagine a box that is partitioned into two equally large sections. Suppose each half holds the same number of balls, and they all look a dull grey to you, all bouncing around at the same velocity. If you now remove the partition, not much seems to happen. Indeed, if you re-insert the partition, it looks pretty much like the same system you started with. You would say: there has been no entropy increase.

However, imagine it turns out you were color blind, and a friend of yours could actually see that in the original situation, the left half of the box had only blue balls, and the right half only red balls. Upon removing the partition, he would see the colors mix irreversibly. Upon re-inserting the partition, the system certainly is not back to its original configuration. He would say the entropy has increased. (Indeed, he would count a $\log 2$ for every ball.)
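To make the two bookkeepings explicit (a sketch of the standard mixing-entropy count, with $N$ the total number of balls and $k_B$ included for dimensions; these symbols are mine, not Jaynes'):

$$\Delta S_{\text{color-blind}} = 0, \qquad \Delta S_{\text{friend}} = 2\cdot\frac{N}{2}\,k_B\ln\frac{V}{V/2} = N k_B \ln 2,$$

since for the friend each ball's accessible volume doubles when the partition is removed, contributing $k_B\ln 2$ per ball, while for the color-blind observer the macrostate ($N$ grey balls in volume $V$) is unchanged.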

Who is right? Did the entropy increase or not? Both are right. As Jaynes nicely argues in the above reference, entropy is not a mechanical property; it is only a thermodynamic property, and a given mechanical system can have many different thermodynamic descriptions. These depend on what one can, or chooses to, measure. Indeed, if you live in a universe where no people and/or machines can distinguish red from blue, there would really be no sense in saying the entropy has increased in the above process.

Moreover, suppose you were color blind and concluded that the entropy did not increase, and then someone came along with a machine that could tell red and blue apart. That person could extract work from the initial configuration, which you thought had maximal entropy, and so you would conclude that the machine can extract work from a maximal-entropy system, violating the second law. The conclusion would simply be that your assumption was wrong: in your calculation of the entropy, you presumed that, whatever you did, you could not tell red and blue apart on a macroscopic level, and this machine violated that assumption. Hence using the 'correct' entropy is a matter of context, and it depends on what kind of operations you can perform. There is nothing problematic with this. In fact, it is the only consistent approach.
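For concreteness (an illustrative bound based on the mixing entropy above, not a calculation from Jaynes' article): the maximum work the color-sensitive machine could extract isothermally at temperature $T$ from the sorted configuration is

$$W_{\max} = T\,\Delta S_{\text{friend}} = N k_B T \ln 2,$$

which is precisely the work that is unavailable according to the color-blind observer's 'maximal entropy' description.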

  • Related http://physics.stackexchange.com/questions/218505/why-doesnt-the-entropy-increase-when-two-similar-gases-mix-with-each-other – Paul Feb 27 '17 at 11:44
  • "turns out you were color blind, and a friend of yours could actually see that in the original situation, the left half of the box had only blue balls" - I'd recommend changing that to red and green. Most color-blind people have trouble distinguishing red from green. None of the three types of dichromats have trouble distinguishing red from blue. A condition with no hue discrimination whatsoever is extremely rare (1 in 40000 vs 1 in 40). – John Dvorak Feb 27 '17 at 12:56
  • How exactly could work be extracted from the pools of pure red and blue balls? – Mike Wise Feb 27 '17 at 16:06
  • @MikeWise The part which matters is that there is a distinguishing trait which can be used to generate work. Red and blue are convenient visual images for people when they're thinking about a very abstract concept like "entropy." – Cort Ammon Feb 27 '17 at 23:12
  • Yes, I get that part, what I don't get is that fact that if there is a distinguishing trait, there is automatically a way to exploit that to generate work. Why does one follow from the other? – Mike Wise Feb 27 '17 at 23:17
  • @MikeWise If you have an empty memory space where all the bits are initialized to zero (or 1 or in some other state that is describable with just a few bits), then you can operate a Maxwell's demon in a gas where you have a separation between two parts that are in thermal equilibrium, e.g. by only letting fast molecules pass from one to the other side and slow molecules in the opposite direction. The fundamental problem with Maxwell's demon is that after every action it adds one bit of information to its memory, so it can only run for as long as its memory space lasts. – Count Iblis Feb 28 '17 at 04:40
  • So, if you have enough spare memory available, you can periodically dump the demon's memory there, allowing you to lower the entropy of the gas by an amount proportional to the available memory space (see the sketch after these comments). – Count Iblis Feb 28 '17 at 04:46
  • Won't the red and blue balls still mix, even if only observed by the color blind person? The logic here does not make sense. – yters Feb 28 '17 at 07:08
  • @MikeWise For a way to generate work, think osmosis (and osmotic pressure). The box has red balls in the left compartment and blue in the right, with equal volume, number, and average kinetic energy, and therefore equal pressure. Now let the membrane be permeable only to red balls. Red balls will go to the right until equilibrium is reached. Eventually the left compartment will have only red balls, while the right will have red and blue balls, and it will have higher pressure. Now allow your membrane to slide, turning it into a piston, and you have performed work! (A numerical sketch of this follows after these comments.) – Hennadii Madan Feb 28 '17 at 09:07
  • I posted this as a question, so feel free to respond in more detail :) [added link - ACM] – Mike Wise Feb 28 '17 at 09:27
  • If you have a thermometer which can't tell the difference between 0C and 100C, boiling a cup of water doesn't become free. This colourblind argument is sophistry. – J... Feb 28 '17 at 10:06
  • I don't really buy this. If I don't know about relativity, I might not realize that a massive object has $mc^2$ of additional energy. Does that make energy subjective too? – Rococo Mar 12 '17 at 19:18
  • To be clear, I agree with all the actual physical content of what you/ Jaynes are saying. But the obvious conclusion to me is not that entropy is subjective, but that one can sometimes get away with ignoring irrelevant contributions to it (as one can do with energy). – Rococo Mar 12 '17 at 19:21
  • @Rococo I'd say there is a difference. Energy is a function of the microstate, quite in contrast to entropy. So while there is a correct, objective energy, independent of our knowledge or experimental control (say, at least in classical physics), this does not hold for entropy. For example, a demon would assign zero entropy to every system (or it is undefined for it). So energy is not subjective, but your knowledge about it can be. In contrast, entropy cannot be defined objectively in the sense that it is not a function of a microstate, i.e. not a property of the physical system itself. – Tobias Fünke Oct 20 '22 at 15:00
  • @TobiasFünke my perspective is fully elaborated here: https://physics.stackexchange.com/questions/145795/what-is-the-entropy-of-a-pure-state/145851#145851 – Rococo Oct 20 '22 at 17:49
  • @Rococo Thanks for the link. I agree with you to some extent, but more with the comment of Mark Mitchison; regarding your comment below that: One does not need to speak of an observer, but entropy in thermodynamics becomes useful if we think about what experiments we can perform or what (macroscopic) parameters we can observe, manipulate and are aware of (and in that sense it is objective). So just replace observer with experimenter. In any case, we don't have to discuss this here. I just wanted to give my two cents here regarding the contrast of energy vs. entropy. – Tobias Fünke Oct 20 '22 at 22:09
  • The thing is, unless I'm wrong, blue particles absorb more energy than red particles if you shine light on them. So regardless of whether the observer is colorblind, they would be able to see that the setup "gray balls + light" has lower entropy than just "gray balls".

    Of course, since they would see that the left side interacts differently with light, then they would cease to be "colorblind". So it's not the observer that is "colorblind", it's the setup. Colorblind = no light interacting with the particles.

    (Probably quibbling about words here, but still)

    – Michael Mitsopoulos Oct 21 '22 at 14:25
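A minimal numerical sketch of the osmosis argument and the demon-memory bound from the comments above, assuming ideal-gas behaviour; the particle number, temperature, and memory size below are made-up illustrative values, not figures from the comments:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def max_mixing_work(n_total, temperature):
    """Maximum isothermal work obtainable by letting two distinguishable
    ideal-gas species, initially sorted into equal half-volumes, mix
    reversibly (e.g. via semipermeable membranes used as pistons):
    W = T * dS_mix = N * k_B * T * ln 2 for equal numbers in equal volumes."""
    return n_total * k_B * temperature * math.log(2)

def demon_entropy_bound(free_memory_bits):
    """Landauer-style bound: a Maxwell demon starting with a blank memory of
    `free_memory_bits` bits can lower the gas entropy by at most
    k_B * ln 2 per bit before its memory fills up."""
    return free_memory_bits * k_B * math.log(2)

# Illustrative numbers: 1e20 balls at room temperature,
# and a demon with 1 GiB of blank memory.
N, T = 1e20, 300.0
print(f"Max work from mixing the red and blue 'gases': {max_mixing_work(N, T):.3g} J")
print(f"Entropy a demon with 1 GiB of blank memory can remove: {demon_entropy_bound(8 * 2**30):.3g} J/K")
```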
0

I think that the Shannon–von Neumann definition of entropy avoids this anthropocentric paradox by establishing the minimum amount of information that cannot be reversibly exchanged between two states of the same system, regardless of any agreement between observers, or even of their presence. In this way entropy is indeed a physical characteristic and not an observer artifact, and it also establishes a unique direction for the flow of information, and hence causality, the flow of time, etc.
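For reference, the standard definition this answer appeals to (nothing specific to this post): the von Neumann entropy of a system with density matrix $\rho$ is

$$S(\rho) = -k_B\,\mathrm{Tr}\left(\rho\ln\rho\right),$$

which reduces to the Shannon form $-k_B\sum_i p_i\ln p_i$ when $\rho$ is diagonal with eigenvalues $p_i$.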

I know I am simply placing postulates against each other, and I am in no position to establish or even hint at the correctness of one or the other, but I prefer to keep my understanding of physics within the boundaries of experimental verification.

MdM