
Unitarity of quantum mechanics prohibits information destruction. On the other hand, the second law of thermodynamics claims entropy to be increasing. If entropy is to be thought of as a measure of information content, how can these two principles be compatible?

10 Answers

23

Unitarity of quantum mechanics prohibits information destruction. On the other hand, the second law of thermodynamics claims entropy to be increasing. If entropy is to be thought of as a measure of information content, how can these two principles be compatible?

I don't think there's anything inherently quantum-mechanical about this paradox. The same question could be posed in classical physics. For a Hamiltonian system, the dynamics are always reversible, so information is conserved. One could then ask how entropy can increase for a classical system, if entropy is a measure of information.

The resolution is that entropy isn't a measure of the total information content of a system, it's a measure of the amount of hidden information, i.e., information that is inaccessible to macroscopic measurements.

For example, say a book slides across a table until friction brings it to a halt. In theory, we can walk into the room, observe the stopped book, measure the positions and momenta of all the particles it's composed of, and then use Newton's laws to extrapolate back in time and see that the book must have been pushed in a certain direction, at a certain time, at a certain speed. But in reality this information is hidden from us because other histories of the book would have resulted in final states that are indistinguishable from this state by macroscopic measurements.

The total information has stayed the same, but the amount of hidden information has increased.
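The split between total and hidden information can be made quantitative with a toy model. The sketch below (a hypothetical illustration, not part of the original argument) treats N coin flips as the microstate and the number of heads as the only "macroscopically" observable quantity; whatever the macrostate fails to pin down counts as hidden:

```python
from math import comb, log2

# Toy model: a "system" of N coins.  The microstate is the exact
# head/tail sequence (N bits); the macrostate is only the number of
# heads, which is all a macroscopic measurement reveals.
N = 20
total_info = N  # bits needed to specify the exact microstate

# Distribution of the macrostate, and the Shannon information that a
# macroscopic measurement actually delivers.
probs = [comb(N, k) / 2**N for k in range(N + 1)]
macro_info = -sum(p * log2(p) for p in probs if p > 0)

# Whatever the macrostate does not determine stays hidden in the
# microscopic degrees of freedom.
hidden_info = total_info - macro_info
print(f"total: {total_info} bits, macroscopic: {macro_info:.2f} bits, "
      f"hidden: {hidden_info:.2f} bits")
```

Most of the 20 bits turn out to be hidden: counting heads reveals only a few bits, just as macroscopic measurements on the stopped book reveal almost nothing about its history.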

  • +1. Or in other words: it is the coarse-grained entropy that increases (because, as you mentioned, it measures hidden information). The fine-grained entropy that measures the total information remains constant. –  Aug 23 '15 at 19:40
6

Having written this down, it seems to me this is just an enlarged version of Trimok's answer, but I'm not quite sure whether I understood him correctly.

My view is that the thermodynamic entropy does not correspond one-to-one to the quantum mechanical von Neumann entropy. As you already pointed out, unitarity of quantum mechanics implies that the total entropy of the universe stays constant. However, in order to compute this, you need the state of the universe.

Now, the entropy in thermodynamics is an extensive property: the entropy of a system equals the sum of the entropies of its (noninteracting) subsystems. This means that, in order to calculate the thermodynamic entropy, you can subdivide your system into smaller, noninteracting pieces. Often you will do this without actually having noninteracting systems - you just neglect certain dissipative processes, subdivide the system, and add the entropies. One example you will have heard of in statistical mechanics is the two identical boxes of particles that you put together, giving twice the entropy.

This is not true for the quantum mechanical entropy. The von Neumann entropy is not additive but subadditive: the sum of the entropies of a system's parts is greater than or equal to the entropy of the whole, with equality exactly for product states. It is in this sense that the second law makes sense to me: an interacting system will distribute entanglement through interaction - this entanglement destroys the additivity of the entropy and ensures that, in the end, the thermodynamic entropy will actually be larger than the von Neumann entropy of the system (which, under unitary time evolution, does not grow).

Put differently: the von Neumann entropy of the system is the total information in the system and is conserved. If the thermodynamic entropy equals it (the system is in a big product state), then all of this information can in a sense be accessed locally. The growth of the thermodynamic entropy then tells us how much of this information has become globally distributed, so that it is inaccessible locally (if the entropy starts at 0, this amounts to saying that the entropy measures the amount of information that lies in global entanglement, i.e. "uniform information"?).

In conclusion: Since our thermodynamic entropy will usually be computed using local entropies of subsystems, the second law states that with time, the system will become more and more entangled in a global way. The true von Neumann entropy of the whole system, however, will always be the same.
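The subadditivity claim above can be checked numerically. The following sketch (assuming NumPy; the state and the angle are arbitrary choices for illustration) builds a partially entangled two-qubit pure state: the global von Neumann entropy is zero, yet both local entropies are strictly positive, so the sum over the parts exceeds the entropy of the whole:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Partially entangled two-qubit pure state  cos(t)|00> + sin(t)|11>.
t = 0.4
psi = np.zeros(4)
psi[0], psi[3] = np.cos(t), np.sin(t)
rho_AB = np.outer(psi, psi)               # global (pure) state

# Reduced states of the two qubits via partial trace.
rho4 = rho_AB.reshape(2, 2, 2, 2)         # indices (a, b, a', b')
rho_A = np.einsum('ikjk->ij', rho4)       # trace out qubit B
rho_B = np.einsum('kikj->ij', rho4)       # trace out qubit A

S_AB = von_neumann_entropy(rho_AB)        # 0: the whole is pure
S_A = von_neumann_entropy(rho_A)          # > 0: the part is mixed
S_B = von_neumann_entropy(rho_B)
print(S_AB, S_A + S_B)   # subadditivity: S_AB <= S_A + S_B
```

At t = 0 (a product state) the inequality becomes an equality; the entanglement is exactly what opens the gap.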

Martin
  • This article supports your answer:

    'Entanglement builds up between the state of the coffee cup and the state of the room. ... In their view, information becomes increasingly diffuse, but it never disappears completely. So, they assert, although entropy increases locally, the overall entropy of the universe stays constant at zero.

    The universe as a whole is in a pure state...but individual pieces of it, because they are entangled with the rest of the universe, are in mixtures.'

    https://www.quantamagazine.org/quantum-entanglement-drives-the-arrow-of-time-scientists-say-20140416

    – Nick Gall Oct 29 '18 at 14:48
2

Also keep in mind that some of the familiar notions of thermodynamics (e.g. temperature) deal with cases where there are degrees of freedom hidden from us (those of a thermal reservoir, for example). The way to formulate this in quantum mechanics is with composite systems, the density matrix formalism, and partial traces.

If we have complete knowledge of a system, quantum mechanically speaking, then that system is in one state, and only one state (Heisenberg picture, if you wish), and stays in that state forever, so there is no entropy or entropy change at all. But by admitting ignorance of parts of the system we recover the classical notions of statistical mechanics. You may want to read about von Neumann entropy.
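As a rough illustration of how ignorance of parts produces entropy, the sketch below (a hypothetical example using NumPy; the dimensions are arbitrary) prepares a single random pure state of a qubit plus a 64-dimensional "reservoir" and traces the reservoir out - the qubit alone comes out very close to maximally mixed even though the global state carries no entropy at all:

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit "system" coupled to a 64-dimensional "reservoir", the pair
# in a single random pure global state (entropy of the whole: zero).
d_sys, d_env = 2, 64
psi = rng.normal(size=d_sys * d_env) + 1j * rng.normal(size=d_sys * d_env)
psi /= np.linalg.norm(psi)

# Trace out the reservoir: the qubit alone must be described by a
# density matrix, obtained here from the reshaped state vector.
psi = psi.reshape(d_sys, d_env)
rho_sys = psi @ psi.conj().T

# Entropy of the reduced state (the global state is pure).
evals = np.linalg.eigvalsh(rho_sys)
S_sys = float(-np.sum(evals * np.log(evals)))
print(evals, S_sys)
```

For a large reservoir the reduced state typically lands near the maximally mixed state, which is the quantum version of "the hidden reservoir degrees of freedom carry the thermodynamic uncertainty".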

lionelbrits
1

The information interpretation of entropy, based on Shannon entropy, reads the second law as saying that the amount of information needed to describe a closed system increases over time. This interpretation is appealing to information theorists, including many computer programmers, but on the surface it appears to contradict the conservation of information implied by unitary quantum mechanics.

The resolution of this apparent contradiction, as others have noted, is that Shannon entropy and quantum mechanics use slightly different definitions of "information". Loosely: In quantum mechanics, it is the total uncompressed information, while in Shannon entropy, it is the total compressed information. To illustrate, consider the following bitmaps:

[Image: Bitmaps]

In words: the 1st bitmap can be described as "a 6x6 grid with 6 pixels across the top row" (nice and short), while the 2nd bitmap must be described as "a 6x6 grid with 6 pixels at positions (4,1), (5,3), (6,3), (2,4), (4,5)" (much longer).

In computers: A compressed file (.gif) of the 1st bitmap is much smaller than that of the 2nd bitmap.

Both bitmaps are the same size, and contain the same number of pixels, so the total amount of uncompressed information in them is the same, as per conservation of information. However, the information can be losslessly compressed to a greater degree in the 1st bitmap, so the 2nd bitmap contains more compressed information. Correspondingly, with no (uncompressed) information loss, the 2nd bitmap's entropy is higher.

Notes:

  • This example is a vast oversimplification, but good enough to get the point across. For one thing, "information" in both settings refers to dynamic systems, while this example is based on a static one. Anyone who understands MPEG compression can extend the example to a dynamic system and see that the same basic principles apply. For another, Shannon entropy deals with probabilities rather than a specific compression algorithm, but the analogy works well enough.
  • Instead of "compression", some like to use "order": the 1st bitmap is more "ordered" than the 2nd bitmap. A similar interpretation is "uniformity": the 1st bitmap is more "uniform" than the 2nd bitmap. The difficulty with these choices of words is that in plain English, a fluid in a closed thermodynamic system at maximum entropy appears "uniform" or "ordered" - temperature, pressure, and molecules are evenly distributed across the system - so they can easily lead to confusion. Compression is more difficult to understand without a computer science background, but is less likely to be misinterpreted.
  • For those without a computer science background, hopefully the explanation in words helps. Alternatively, the use of the term "hidden" information (as per another answer) may or may not help.
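The compression analogy can be acted out with an off-the-shelf compressor. This sketch (a toy demonstration with Python's zlib; the image size and pixel count are arbitrary choices) builds two byte "bitmaps" of identical uncompressed size, one ordered and one scattered, and compares their compressed sizes:

```python
import random
import zlib

side = 64
n_pixels = side  # 64 "on" pixels in each image

# Ordered image: all the on-pixels packed into the top row.
ordered = bytearray(side * side)
for x in range(n_pixels):
    ordered[x] = 1

# Scattered image: the same number of on-pixels at pseudo-random
# positions (seeded for reproducibility).
rng = random.Random(0)
scattered = bytearray(side * side)
for pos in rng.sample(range(side * side), n_pixels):
    scattered[pos] = 1

# Same uncompressed size, very different compressed size.
c_ordered = zlib.compress(bytes(ordered), 9)
c_scattered = zlib.compress(bytes(scattered), 9)
print(len(ordered), len(c_ordered), len(c_scattered))
```

Both images occupy 4096 bytes uncompressed, but the ordered one compresses far more: in the answer's language, the scattered image carries more "compressed information", i.e. higher entropy.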
0

Some comments on the premises may be useful and answer the question:

[1] Unitarity of quantum mechanics prohibits information destruction.

What this really means is that the von Neumann evolution equation for the density matrix $D(t)$ of an isolated system prohibits any change in the von Neumann entropy:

$$ N[D] = -\operatorname{Tr}(D \ln D) = -\sum_{k,l} D_{kl}\,(\ln D)_{lk}. $$

Very similarly, in classical mechanics, Hamiltonian evolution prohibits change of the information entropy functional ("Gibbs entropy"):

$$ I[\rho]=\int - \rho \ln \rho\,\mathrm dq\mathrm dp $$

Neither $N$ nor $I$ is the same concept as the thermodynamic entropy $S$. It is perfectly possible for the thermodynamic entropy $S$ to increase while these remain constant (for example, when irreversible work is done on the system with no heat exchange). This is explained by Jaynes in his papers.
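The constancy of $N[D]$ under unitary evolution is easy to verify numerically, since the entropy depends only on the eigenvalues of $D$, which unitary conjugation leaves untouched. A minimal sketch (assuming NumPy; the state and the unitary are random choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8

def entropy(rho):
    """Von Neumann entropy -Tr(rho ln rho) from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# A random mixed state: a normalized positive matrix A A†.
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# A random unitary from the QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))

# Unitary ("isolated system") evolution: rho -> U rho U†.
rho_t = Q @ rho @ Q.conj().T

print(entropy(rho), entropy(rho_t))  # equal up to round-off
```

The same eigenvalue argument is the discrete analogue of Liouville's theorem preserving the Gibbs functional $I[\rho]$.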

[2] On the other hand, the second law of thermodynamics claims entropy to be increasing.

No, for a thermally isolated system (which is what the above refers to), the second law of thermodynamics claims:

when a thermally isolated system has passed from one thermodynamic equilibrium state $A$ to another thermodynamic equilibrium state $B$ (the process is thus adiabatic), the change of its thermodynamic entropy $S$ satisfies $\Delta S \geq 0$.

[3] If entropy is to be thought of as a measure of information content, how can these two principles be compatible?

Thermodynamic entropy $S(U,V,N)$ may be regarded as the maximum value of the von Neumann/Gibbs entropy over all $D$ or $\rho$ compatible with the constraints $U,V,N$, and thus can indeed be regarded as a measure of the information that is lacking to determine the microstate - as "information content", although that is a very misleading name. A better term is simply "information entropy of the macrostate", since it depends on the latter.
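The "maximum over all compatible states" characterization can be illustrated on a two-level system. In this sketch (a hypothetical example; the energies and the target mean energy are arbitrary choices), three states share the same mean energy, and the diagonal, Gibbs-like one has the largest von Neumann entropy:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy -Tr(rho ln rho) from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Two-level system with energies 0 and 1; mean energy fixed at u.
H = np.diag([0.0, 1.0])
u = 0.3

# Candidate 1: the Gibbs-like diagonal state with <H> = u.
rho_max = np.diag([1 - u, u])

# Candidate 2: a pure superposition with the same mean energy.
psi = np.array([np.sqrt(1 - u), np.sqrt(u)])
rho_pure = np.outer(psi, psi)

# Candidate 3: the same populations, but with partial coherence.
rho_coh = np.array([[1 - u, 0.2], [0.2, u]])

for rho in (rho_max, rho_pure, rho_coh):
    print(float(np.trace(rho @ H).real), entropy(rho))
```

All three candidates satisfy the energy constraint, but only the diagonal state attains the maximum entropy - the value that plays the role of $S(U)$ here.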

These statements do not contradict each other. On the contrary, it can be shown that in a process where state $A$ changes into state $B$ through an irreversible adiabatic process, the constancy of $I$, together with the reproducibility of the resulting state of the fixed adiabatic process, implies $\Delta S \geq 0$.

Jaynes, E. T. (1965), "Gibbs vs Boltzmann Entropies", Am. J. Phys. 33, 391, secs. 4-5. http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf

0

Wave function collapse (or a measurement on the QM system) is not unitary since it is a projection of the state vector.

0

In my humble opinion, the answer to the question is related to the fact that for projective measurements the entropy of the measured state always increases or stays the same. However, one should also take into account the entropy of the measurement system. Intuitively, the entropy of the measurement system must decrease or stay the same, while the entropy of the measured system increases or stays the same. One would have to come up with a new formalism to describe such measurements; it would look something like this:

  1. The measurement system must be described by a state as well as the measured system.
  2. The state of the measurement system as well as the state of the measured system change after the measurement.

If you can come up with a formalism that describes this, then hopefully it should follow as a corollary that the entropy of the measurement system decreases or stays the same while the entropy of the measured system increases or stays the same.

onephys
0

I wrote an article exactly on that question! https://aurelien-pelissier.medium.com/on-the-conservation-of-information-and-the-second-law-of-thermodynamics-f22c0645d8ec

To make it short: the entropy (or information) constrained by Liouville's theorem is not the same as the entropy the second law talks about. The latter is the amount of hidden information (information inaccessible to macroscopic measurements), while the former is the total information content of a system. In other words, the irreversibility of thermodynamics is a statistical effect and does not conflict with the reversibility of classical/quantum mechanics.

To go a bit further, this has to do with the difference between the fine-grained and the coarse-grained entropy of the system. The von Neumann entropy is typically fine-grained, so it stays constant in a closed system. The second law, on the other hand, states that the coarse-grained entropy of a closed system increases.
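The fine-grained/coarse-grained split can be mimicked classically. In the sketch below (a toy model; the state-space size, binning, and permutation dynamics are illustrative choices), a reversible permutation of microstates leaves the fine-grained Shannon entropy exactly unchanged, while the entropy of the coarse-grained (binned) distribution grows:

```python
import numpy as np

rng = np.random.default_rng(2)

def shannon(p):
    """Shannon entropy -sum p ln p of a probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# 1024 microstates; probability initially concentrated on the first 32.
N, bins = 1024, 16
p = np.zeros(N)
p[:32] = 1.0 / 32

# Reversible "dynamics": a fixed random permutation of the microstates.
perm = rng.permutation(N)

def coarse(p):
    # Coarse-graining: only the total weight of each block of 64
    # microstates is macroscopically observable.
    return p.reshape(bins, -1).sum(axis=1)

S_fine_0, S_coarse_0 = shannon(p), shannon(coarse(p))
for _ in range(10):          # evolve
    p = p[perm]
S_fine_t, S_coarse_t = shannon(p), shannon(coarse(p))

print(S_fine_0, S_fine_t)       # fine-grained: unchanged
print(S_coarse_0, S_coarse_t)   # coarse-grained: grew
```

The permutation is perfectly reversible (information is conserved), yet the macroscopically visible, binned distribution spreads out - the classical cartoon of the point being made above.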

0

Entropy may be seen as "uniform" information. For instance, in the canonical formalism (constant temperature), we may write $\beta F = \beta U - \frac{S}{k}$, where $F$ is the Helmholtz free energy and $U$ the internal energy. $\beta U$ represents the total information, $\beta F$ the non-uniform information, and $\frac{S}{k}$ the uniform information.

Trimok
-2

Prohibition of information destruction due to unitarity is a hypocrisy (sorry, I am going to reiterate some material from the "Where does deleted information go?" post), and the concept of entropy is foggy, especially in the quantum context: in spite of the definitions mentioned in previous answers, there is no possibility of ever knowing the quantum state of a realistically complex material system.

You ask how to reconcile thermodynamics with quantum mechanics. The short answer is: there are two different evolutions. Unitary evolution describes the view of quantum mechanics, but we, conscious beings, see a completely different picture because we consume negentropy (or produce entropy, if you prefer). Whereas quantum mechanics sees the entire "quantum state" (from outside the universe) while it undergoes a unitary evolution, we cannot see it.

Why is it so? The answer rests on three points:

  1. we, conscious beings, belong to this world too;
  2. quantum superposition; and
  3. open quantum systems.

From point 1 it follows that a conscious being should be described as a quantum system just like other objects, i.e. that s/he should have quantum states. Although there are substantiated doubts that a quantum state can be well-defined without an external observer, this is not crucial, since we can assume a very remote (other) observer. Imagine two "versions" of a conscious being, one with a thought A and corresponding state ΨA, and another with a thought B and corresponding state ΨB. We assume that each of these states admits an evolution into the future. Now apply point 2 and consider a state in superposition, such as: $$\frac 1{\sqrt 2} (Ψ_A + Ψ_B).$$ Due to the linearity of quantum evolution (whether unitary or not), the versions A and B of the conscious being will coexist in the future without any mutual influence. That is why each "version" sees only his/her part of the great quantum state. This is the idea behind the famous many-worlds interpretation of quantum mechanics, but it does not explain the direction of time: a unitary transformation can make a superposed future from a "non-superposed" past (although, technically, such a term is nonsense), but it could make the reverse transition too.

We feel the arrow of time. In other words, we, in future states, retain memories of the past. This is where point 3 becomes necessary. An open quantum system is one that interacts with its environment (thus, potentially, with the whole universe). Processes specific to this theory are sometimes called decoherence, sometimes superselection, but the essence is the same: a one-to-one correspondence of state vectors is not necessarily a one-to-one correspondence of our conscious experiences. Unfortunately, I am not really an expert in this domain, and nobody understands well how this happens; please do not ask me about it. Nobody knows for certain whether our mind could run in a perfectly closed system. Personally, I think that cosmological factors play some rôle in the openness of quantum evolution (although many physicists would qualify this as heresy). But independently of the mechanisms, the so-named "entropy" increases as we progress into the future (you may ask how the increase in entropy provides our arrow of time, but that is purely thermodynamics and information theory, with no quantum physics). That is why what we experience are directed transitions, and these transitions from one state to another, with possibly alternative versions of the future, are our evolution as thermodynamic beings. It is absolutely dissimilar to unitary evolution, and it manifests at the quantum level as von Neumann's state vector reduction, a.k.a. wave function collapse.

Why do these strange processes actually occur and, moreover, only in one direction of time? Why, theoretically, can one past of ours transform into multiple futures, while multiple versions of the past cannot join into one future? There are some suggestions, but in general it is an open problem. Roger Penrose thinks it is due to the low entropy of the cosmological singularity; I think it follows from the "true structure of space-time", and possibly the different views are not mutually exclusive. But the increase of entropy during thermodynamic processes is an experimental fact that we, thermodynamic beings, can easily notice.