That information is preserved is often stated as a fundamental principle of physics.

Not being a physicist, I do not understand how information enters physics at all. What is it an attribute of? How is it defined? What laws govern it?

I hope it is not only the notion of entropy that introduces information into physics. In mathematics it is very clear-cut: entropy is an attribute of probability distributions. You can make a convincing case that the information gained from observing an event of probability $p$ is $-\log(p)$. With this, the entropy of a discrete probability distribution is the average gain of information from observing an atomic event.
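To make the convention concrete, here is a minimal sketch (natural logarithms, so information is measured in nats; the function names are mine, not standard):

```python
import math

def self_information(p):
    """Information gained from observing an event of probability p (in nats)."""
    return -math.log(p)

def entropy(dist):
    """Shannon entropy of a discrete distribution: the average information
    gained from observing one atomic event."""
    return sum(p * self_information(p) for p in dist if p > 0)

# A fair coin yields more information per observation than a biased one.
print(entropy([0.5, 0.5]))   # ~0.693 nats (= 1 bit)
print(entropy([0.9, 0.1]))   # ~0.325 nats
```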

A more general distribution $P$ needs to be related via a density to the most random distribution $U$, and you can then interpret $\operatorname{entropy}(U) - \operatorname{entropy}(P)$ as a measure of the information gained when you know that outcomes are governed by $P$ rather than $U$.
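Spelling out the simplest case (a finite sample space with $n$ outcomes, $U$ the uniform distribution, and writing $H(\cdot)$ for entropy), this difference is exactly the relative entropy of $P$ with respect to $U$, and hence non-negative:

$$H(U) - H(P) \;=\; \log n + \sum_i p_i \log p_i \;=\; \sum_i p_i \log\frac{p_i}{1/n} \;=\; D_{\mathrm{KL}}(P \,\|\, U) \;\ge\; 0.$$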

If you accept this, the Second Law of Thermodynamics says that the information gained from observing the macroscopic state of a system steadily decreases as the configuration becomes more and more random.

So it is unlikely that entropy is the way in which information enters physics. I also do not believe that the information associated with a system is the information needed to completely specify its state, since that is infinite in almost every case (e.g. whenever any initial condition is a random real).

So what is it?

1 Answer

By "conservation of information", physicists mean that you can reconstruct the past if you're given the present state. That is, all the information about the past is never destroyed. You can, in principle, extract it all from the present state by re-winding the laws. Conservation of information may be false in Quantum Mechanics depending on the interpretation, because the wavefunction collapse is probabilistic. But an interpretation like Many-Worlds still allows for the laws to be re-wound.

Entropy does not come into this "information conservation" concept. A different notion of information can be defined using entropy, but that is not what the law of conservation of information refers to. Entropy-based information is less fundamental: it is merely a measure of our ignorance about the exact state of the system, arising from our choice of "macro-variables" to describe it. The macro-variables contain only compressed information about the physical system.
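A toy sketch of that coarse-graining (illustrative only; the choice of "number of heads" as the macro-variable is arbitrary): many micro-states are compatible with one macro-state, and the entropy of a macro-state measures how much micro-level detail the macro-description has thrown away.

```python
from itertools import product
from math import log

N = 4
micro_states = list(product("HT", repeat=N))   # all sequences of N coin faces

# Coarse-grain: the macro-variable is just the number of heads.
multiplicity = {}
for s in micro_states:
    heads = s.count("H")
    multiplicity[heads] = multiplicity.get(heads, 0) + 1

# Boltzmann-style entropy of a macro-state: log of the number of micro-states
# compatible with it, i.e. our ignorance of the exact micro-state.
for heads, count in sorted(multiplicity.items()):
    print(heads, count, log(count))
```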

Ryder Rude
  • Thanks, that clarifies a lot, but yields no definition of information. The question the moderator linked to does not answer it either and my question is stated precisely to point this out. Do you know a definitive reference on the subject? – gcc Dec 03 '22 at 17:02
  • @gcc Honestly, I don't think "conservation of information" is that deep a topic. I haven't seen any book stress it, or even mention it as much as the other conservation principles, which are more mathematical. It basically just says that the laws are deterministic backwards. The entropic definition of information which you gave is much more mathematical, but that information is not conserved. – Ryder Rude Dec 03 '22 at 17:22
  • @gcc Just take "conservation of information" to mean "laws are deterministic backwards". This statement does not even need to mathematically define information. It's just a statement about the nature of the differential equations that show up in physics theories. – Ryder Rude Dec 03 '22 at 17:29