3

If information and entropy are equivalent, and information is a conserved quantity because of unitarity, what does it mean to say that entropy has been growing since the big bang? (I know why it grows; I'm not getting how to reconcile the two statements.)

I understand the process by which entropy grows, and I know why information (quantum states) is conserved. What I am not getting is how these two facts can be reconciled with each other, since information is equivalent to entropy.

donut
  • 383
  • I don't understand how information is a conserved quantity - is there an associated symmetry? – Mozibur Ullah Sep 10 '18 at 08:30
  • An observation: I accept that the duplicate does address the same issue, but this is not so obvious since different language is used. – isometry Sep 11 '18 at 07:35

3 Answers

2

I will be using this answer by Bob Bee to address this part of the question:

If information is conserved, why did the early universe have lower entropy than now?

It seems that the concept of conservation of information arises within a complete quantum mechanical solution for the early universe. This presupposes that quantization of gravity has been achieved. The quantum level does not have a definition of entropy in the sense of thermodynamics, which is a classical theory, emergent from the underlying classical statistical mechanics level.

From Bob:

information in its basic simplest form, in quantum theory, is the state of the system (which could be composed of many subsystems). A physical system is defined by a state vector. It could be, and often is, infinite dimensional, but could also have finite dimensional Hilbert subspaces (like the spin). The evolution of a system, considered a pure state, is given by a unitary operator which preserves causality (at the Hilbert space level, not in the probabilistic interpretation of collapse and measurements). You can always go back by applying the inverse operator. When the state becomes mixed, information can be considered to be lost, and entropy increases.

italics mine

The way I understand it, during the time of the universe when conservation of information holds due to a pure quantum mechanical solution for the universe, entropy is constant. Once decoherence sets in and classical statistical mechanics takes over, entropy increases. In this view there is no conflict between an entropy that is low, and fixed at a given value, at the beginning of the universe, and an entropy that increases after the quantum mechanical solutions decohere. The entropy law has a greater-than-or-equal sign in front, after all.
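
As a minimal numerical sketch of the pure-versus-mixed point in the quote above (my own illustration in Python/NumPy; the Bell state is an arbitrary example): the full pure state has zero von Neumann entropy, while the reduced state obtained by tracing out one subsystem, which is what decoherence effectively leaves you with, has positive entropy.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Pure, entangled two-qubit state (|00> + |11>)/sqrt(2).
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())      # density matrix of the pure state

# Reduced density matrix of qubit 1: trace out the second qubit.
rho_reduced = rho_pure.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(von_neumann_entropy(rho_pure))      # ~0.0   (pure state, no entropy)
print(von_neumann_entropy(rho_reduced))   # ~0.693 (= ln 2, maximally mixed)
```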

"Since information is equal to entropy" is not true at the quantum level. Look at the article on information entropy, which uses classical probabilities, not quantum mechanical ones. Also, this article on thermodynamics and information theory does not involve the unitarity argument. It seems that the unitarity argument is important for the conservation of information in quantum systems, but not for defining information entropy.

One should also keep in mind that quantization of gravity is still an open research field.

anna v
  • 233,453
  • 1
    downvoter why don't you teach me something? – anna v Sep 09 '18 at 14:00
  • 3
    This is just totally wrong. "During the time of the universe when conservation of information holds due to a pure quantum mechanical solution for the universe, entropy is constant": this makes no sense. There was not a time when the universe was described by quantum mechanics, followed by ... a time when it wasn't? There was not a time when the universe's entropy was constant. "Once decoherence sets in, the classical statistical mechanics, entropy increases": you seem to be imagining decoherence as a process that occurred at some point in the history of the universe. That's not correct. – Sep 09 '18 at 14:02
  • 1
    @BenCrowell If you look at the big bang model, there is the inflation age, then the quark age, and these are quantum mechanical descriptions, as quarks are not free then. When protons form, the quantum system has decohered. Of course there are no instant cuts in time in this type of model. Decoherence happens continuously after the quark age. Entropy is constant if it does not increase, by definition. – anna v Sep 09 '18 at 14:19
  • 2
    Decoherence is a non-unitary process, normally due to coupling to an external environment. What did the universe couple to? The whole point of the question is to reconcile thermodynamics with unitary evolution, this answer essentially says "at some point the universe evolved non-unitarily and became classical". – fqq Sep 09 '18 at 17:28
  • 1
    @fqq decoherence can happen in many ways. When quarks bind into protons and photons balance the energy differences, there is already decoherence from the universal wavefunction which described the quark-antiquark-gluon "soup"; thermodynamic scatterings and microstates can be defined once protons and neutrons form. – anna v Sep 09 '18 at 17:39
  • I down-voted by mistake - I skim read it rather than reading it through properly. Apologies. I've tried to rescind the vote, but it appears to be locked now. – Mozibur Ullah Sep 10 '18 at 08:46
  • @MoziburUllah it is OK. I will do an edit and then for a while votes can be rescinded. In any case I have asked Bob Bee to look into it, because if I am wrong, I want to know it and correct my brain's data bank – anna v Sep 10 '18 at 10:11
  • I've rescinded my vote. I don't understand why it is still greyed out though. – Mozibur Ullah Sep 10 '18 at 10:19
  • @MoziburUllah it is ok at the moment, until a next negative ;) – anna v Sep 10 '18 at 10:36
  • +1 to correct M.U.'s mistake. – safesphere Sep 10 '18 at 18:40
1

As far as I know, there's no mainstream consensus regarding this. Reconciling irreversibility with quantum mechanics (much like quantum mechanics itself) doesn't have a well-accepted interpretation.

But that doesn't mean we don't have a formalism for it! We do, and it involves a form of generalized quantum state: the density matrix formalism. Within it, it is much more natural to include and derive irreversible (non-unitary) terms through couplings to systems that aren't of interest to our model (say, a thermalized electromagnetic field).

The irreversibility then stems from coupling to an external system about which we have no information. Quantum theory in this density matrix formalism is extremely well founded in terms of information theory.

With this in mind, for the entropy of the Universe to increase, it would have to couple to something which is extremely complicated to describe in detail. We can then argue (this is just my intuition, and could be horrendously wrong) that, since the Universe is infinite, if I describe any local region of it, that region is coupled to the endlessness that surrounds it, which certainly is complicated to describe. The information is lost to the rest of the Universe.
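
As a toy sketch of that mechanism (my own illustration; the initial state and the dephasing rate are arbitrary choices, not anything specific to cosmology): a single qubit whose coherences decay because of an environment we do not track. The evolution of its density matrix is no longer unitary, and its von Neumann entropy grows from 0 toward ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Start in the pure superposition (|0> + |1>)/sqrt(2).
rho0 = 0.5 * np.array([[1.0, 1.0],
                       [1.0, 1.0]])

gamma = 1.0                                  # assumed dephasing rate
for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    rho_t = rho0.copy()
    rho_t[0, 1] *= np.exp(-gamma * t)        # coherences decay,
    rho_t[1, 0] *= np.exp(-gamma * t)        # populations stay fixed
    print(t, von_neumann_entropy(rho_t))     # rises from 0 toward ln 2
```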

0

Entropy and information are not identical. The entropy of a system is a measure of what you do not know about the microscopic details of the system given some macroscopic observation. Boltzmann's entropy equation, $S = k_B \log W$, means that the entropy $S$ increases when the number of microstates corresponding to a given macrostate, $W$, increases.

The second law means that as the universe evolves it becomes describable by more and more microstates, i.e. entropy increases. The information is still 'in' the microstate, but it is not information available to macroscopic observation.

Perhaps an example will help. Imagine a box with a gas contained in one half of the box. The gas is made up of $N$ particles, each with a position (3 components: $x$, $y$, $z$) and a velocity (3 components as well). Therefore we can describe the gas with $6N$ variables. As the gas spreads to the rest of the box this number doesn't change; this is the type of 'information' your question wonders about. However, there are fewer ways to arrange the particles such that they are all in one half of the box than there are ways to arrange them spread over the whole box, so the entropy increases as the gas spreads.
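
Here is that counting made numerical (a rough sketch; the factor of $2^N$ assumes distinguishable, non-interacting particles, as in the idealized picture above):

```python
import numpy as np

k_B = 1.380649e-23    # Boltzmann constant, J/K
N = 6.022e23          # roughly one mole of particles

# Doubling the accessible volume doubles the positional configurations
# available to each particle, so W_full / W_half = 2**N and, from
# S = k_B * log(W), the entropy change of the free expansion is:
delta_S = N * k_B * np.log(2)
print(delta_S)        # ~5.76 J/K for one mole; the 6N coordinates still exist
```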

Stuart
  • 163
  • I wonder if information about the system should also include the boundary conditions, and thus its amount is more than $6N$ in both cases. – safesphere Sep 10 '18 at 18:47
  • The amount of information about the system is not a well-defined idea if you want to include everything it is possible to know about the system. For an ideal gas, the microstate is fully distinguished by the $6N$ coordinates. But really this is just a toy example to motivate the relationship between information and entropy, and the exact specifics can be changed without breaking the argument. – Stuart Sep 11 '18 at 09:08