From @Humble's answer to "What is information?":
Information contained in a physical system = the number of yes/no questions you need to get answered to fully specify the system.
That count is, however, relative to a given model: either a countably infinite or a continuous model would have infinite (potential) information content.
The notion of information is widely used in modern statistical mechanics, quantum computing, and other fields. So how do physicists formulate a sound and unambiguous definition of information, given that, naïvely, it appears to be model-dependent?
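To make the counting in the quoted definition explicit (a standard argument, not part of the original answer): each yes/no question at best halves the set of remaining possibilities, so k questions distinguish at most 2^k configurations, and a continuous configuration space makes the count diverge.

```latex
% k yes/no questions distinguish at most 2^k configurations, so a system
% with N equally likely configurations carries
I(N) = \log_2 N \ \text{bits}, \qquad \text{e.g. } I(8) = 3.

% Discretising a continuous variable x \in [0,1) at resolution \epsilon
% gives N = 1/\epsilon cells, and the information content diverges:
I(\epsilon) = \log_2(1/\epsilon) \longrightarrow \infty \quad (\epsilon \to 0).
```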
Leaving the finite case aside: even in the countably infinite case there is a natural starting point near zero from which to develop the answers. But if you take the analog (continuous) case as given, there is no reason not to start at π rather than at 3, or anywhere in between.
– Chuquicamata Jul 25 '16 at 01:29
http://www.ihes.fr/~gromov/PDF/probability-Symmetry-Linearity-Paris-Lecture-Oct-2014.pdf
I don't think it's correct.
– Chuquicamata Jul 31 '16 at 16:20
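The model-dependence raised in the comments above can be made concrete with a minimal sketch (my illustration, not from the thread; the function name `questions_needed` is hypothetical): it counts the yes/no questions, i.e. bisection steps, needed to locate a point in [0, 1) at a chosen resolution, and the count grows without bound as the discretisation is refined toward the continuum.

```python
import math

def questions_needed(resolution: float) -> int:
    """Yes/no questions (bisection steps) needed to locate a point in
    [0, 1) to within `resolution`: each question halves the interval."""
    return math.ceil(math.log2(1.0 / resolution))

# The information content is finite for any fixed discretisation...
for eps in (0.5, 0.1, 1e-3, 1e-6, 1e-12):
    print(f"resolution {eps:.0e}: {questions_needed(eps)} questions")

# ...but diverges as eps -> 0, which is exactly the model-dependence
# (countable vs. continuous) that the question is asking about.
```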