The preservation of information is often stated as a fundamental principle in physics.
Not being a physicist, I do not understand how information enters physics at all. What is it an attribute of? How is it defined? What laws govern it?
I hope it is not only the notion of entropy that introduces information into physics. In mathematics it is very clear cut: entropy is an attribute of probability distributions. You can make a convincing case that the information gained from observing an event of probability $p$ is $-\log(p)$. With this, the entropy of a discrete probability distribution is the expected information gain from observing an atomic event.
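To make the definition above concrete, here is a minimal sketch (the function names `surprisal` and `entropy` are my own illustrative choices, not standard terminology from any library):

```python
import math

def surprisal(p):
    """Information gained from observing an event of probability p: -log2(p) bits."""
    return -math.log2(p)

def entropy(dist):
    """Entropy of a discrete distribution: the expected surprisal over its atoms."""
    return sum(p * surprisal(p) for p in dist if p > 0)

fair_coin = [0.5, 0.5]
biased_coin = [0.9, 0.1]
print(entropy(fair_coin))    # 1.0 bit: each toss is maximally surprising
print(entropy(biased_coin))  # less than 1 bit: outcomes are more predictable
```

The base of the logarithm is a convention; base 2 measures information in bits, the natural logarithm in nats.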
A more general distribution $P$ needs to be related via a density to the most random distribution $U$, and you can then interpret $\operatorname{entropy}(U) - \operatorname{entropy}(P)$ as a measure of the information gained when you know that outcomes are governed by $P$ rather than $U$.
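In the finite case this is easy to check numerically: taking $U$ to be the uniform distribution on $n$ outcomes, $\operatorname{entropy}(U) = \log_2 n$ is the maximum possible, and the difference below is nonnegative. A sketch (the helper name `info_gain` is my own, chosen for illustration):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete distribution."""
    return sum(-p * math.log2(p) for p in dist if p > 0)

def info_gain(dist):
    """Gain relative to the uniform distribution on the same set of outcomes."""
    n = len(dist)
    uniform = [1.0 / n] * n
    return entropy(uniform) - entropy(dist)

print(info_gain([0.25, 0.25, 0.25, 0.25]))  # 0.0: no gain over uniform
print(info_gain([0.97, 0.01, 0.01, 0.01]))  # close to the full log2(4) = 2 bits
```

For finite alphabets this difference coincides with the Kullback-Leibler divergence $D(P \,\|\, U)$, which is one standard way to make "gain relative to the most random distribution" precise.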
If you accept this, the Second Law of Thermodynamics says that the information gained from observing the macroscopic state of a system steadily decreases as its configuration becomes more and more random.
Since this information decreases rather than being preserved, it is unlikely that entropy is the way in which information enters physics. I also do not believe that the information associated with a system is the information needed to completely specify its state, since this is infinite in almost every case (e.g. whenever an initial condition is a random real number).
So what is it?