7

From @Humble's answer to "What is information?":

Information contained in a physical system = the number of yes/no questions you need to get answered to fully specify the system.

That is, however, relative to a given model. So either a "countably infinite" or a "continuous" model would have infinite (possible) information content.
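
To make the model dependence concrete, here is a minimal sketch in Python; the state counts are invented for illustration, not physical data:

    import math

    # "Information = number of yes/no questions" for a finite model is just
    # log2 of the number of states the model allows.
    def bits_to_specify(num_states: int) -> float:
        return math.log2(num_states)

    coarse_model_states = 6         # e.g. only "which face of a cube is up"
    finer_model_states = 6 * 10**6  # a hypothetical refinement with many more states

    print(bits_to_specify(coarse_model_states))  # ~2.58 yes/no questions
    print(bits_to_specify(finer_model_states))   # ~22.5 -- same object, different model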

The notion of information seems to find many uses in modern statistical mechanics, quantum computing and other fields, so how do physicists formulate a sound and unambiguous definition of information, given that, naïvely, it would seem to be model-dependent?

    ...and the question is? – ACuriousMind Jul 24 '16 at 00:07
  • any statement you can make about any present or future formal system has a countable cardinality. –  Jul 24 '16 at 00:07
  • Information is not something that physicists are really concerned about. That's the computer science department. The closest you can get is signal to noise ratio. – CuriousOne Jul 24 '16 at 00:14
  • Dear Chuquicamata, I added the last paragraph to your question; is this an accurate statement of your question? If so, I think it is a good question and therefore am voting to reopen it. – Selene Routley Jul 24 '16 at 05:47
  • Well, yes. There is, for example, the "no information loss theorem" which would be no loss of the information represented in a specific model, the possible parameters available to particles (entities) in that model. – Chuquicamata Jul 25 '16 at 01:12
  • What Wolphram jonny seems to suggest is that all theories have only an at most countably infinite set of possible answers to questions posable to the theory.

    Leaving the finite case aside, even in the countably infinite case you would have a starting point close to zero from which you develop the answers. But if you take the analog case as given, there is no reason not to start at $\pi$ instead of "3" or anywhere in between.

    – Chuquicamata Jul 25 '16 at 01:29
  • It looks to me as though even the state of a continuous system actually has only countably-many bits: if a system's state is defined by a vector in a countably-dimensioned space, then, in some basis, each component is a real (or complex) number, which has a countable number of digits, and there are countably-many components. Of course there are an uncountable number of states, but each state is countably-describable. –  Jul 30 '16 at 14:48
  • Information is already a model-dependent concept on its own, though. It is a theory whose objects are probability distributions, so it is implicitly dependent on the domains of those probability distributions. So Information Theory itself satisfies your question as far as I can tell. – Daniel Kerr Jul 30 '16 at 23:38
  • I'm starting to think that you need a finite state space for a definition of information content. Countable state spaces can have infinite entropies (see the sketch after this comment thread), so any "no information loss" theorem would need a finite state space. – Chuquicamata Jul 31 '16 at 15:10
  • I can't erase the above comment, but after skimming this

    http://www.ihes.fr/~gromov/PDF/probability-Symmetry-Linearity-Paris-Lecture-Oct-2014.pdf

    I think it's not correct.

    – Chuquicamata Jul 31 '16 at 16:20
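
A minimal sketch, in Python, of the point raised in the comments that a countable state space can carry infinite entropy (the block construction is an invented textbook-style example, not taken from the linked notes): give block $k$ total probability $2^{-k}$, spread uniformly over $2^{2^k}$ states. The probabilities sum to 1, yet every block contributes a bit more than 1 bit of entropy, so the total entropy diverges.

    # Block k contributes 2**(2**k) states of probability 2**-k / 2**(2**k) each;
    # its entropy contribution works out to k * 2**-k + 1 bits, so the total
    # entropy grows without bound as more blocks are included.
    def entropy_bits(num_blocks: int) -> float:
        return sum(k * 2.0 ** -k + 1.0 for k in range(1, num_blocks + 1))

    for K in (5, 10, 20):
        print(K, entropy_bits(K))  # ~6.78, ~11.99, ~22.0 bits -- roughly linear in K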

2 Answers

1

how do physicists formulate a sound and unambiguous definition of information, given that, naïvely, it would seem to be model-dependent?

The practice is: take the model you are interested in, with its set of possible states, work with probability distributions on those states, and optionally discuss the information entropy or information content of those probability distributions. This is useful even if another, more refined or simply different, model of the same physical thing uses different states and thus assigns a different information entropy.
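
A minimal sketch of that practice in Python; the states and probabilities below are invented examples, not tied to any particular physical system:

    import math

    # Shannon entropy (in nats) of a probability distribution over a model's states.
    def shannon_entropy(probs):
        return -sum(p * math.log(p) for p in probs if p > 0)

    # Model A of some object: two states, slightly biased.
    print(shannon_entropy([0.6, 0.4]))            # ~0.67 nats
    # Model B of the same object: four finer-grained states.
    print(shannon_entropy([0.3, 0.3, 0.2, 0.2]))  # ~1.37 nats -- different model, different entropy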

In short, it is model dependent. There is no universal answer to the question: what is the information entropy of an ice cube with 1 cm sides? It depends on the model of the cube. If the model cares only about which face is up, there are only 6 possible states, so the maximum entropy is $\ln 6$. If the model is a molecular simulation whose state involves the positions and orientations of all the water molecules in the cube, at 1 atm and zero Celsius, the state space is immensely bigger and the information entropy is a much higher number.
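
In numbers (a sketch; the fine-grained state count below is a placeholder, not an actual count for a molecular model of the cube): the maximum entropy of a model is $\ln$ of its number of states, attained by the uniform distribution.

    import math

    def max_entropy_nats(num_states: float) -> float:
        # Uniform distribution over num_states equally likely states.
        return math.log(num_states)

    print(max_entropy_nats(6))      # which-face-is-up model: ln 6 ≈ 1.79 nats
    print(max_entropy_nats(1e23))   # hypothetical fine-grained model: ≈ 53 nats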

0

Algorithmic information theory defines the complexity of a given string of bits (which could represent either the number $\pi$ or a full axiomatic theory) as the length of the shortest program that can produce that string as output. Some irrational numbers, like $\pi$, are short on information: you can compress them into a short algorithm that outputs one digit after another with no end in sight. But most irrational numbers are not compressible in that way: for most reals there is no program that can compute all of their digits.
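
As an illustration of the compressibility claim for $\pi$: Gibbons' unbounded spigot algorithm emits the decimal digits one after another, so the entire infinite digit string is specified by the few lines of Python below.

    # Gibbons' unbounded spigot algorithm: a short program whose output is the
    # decimal digit sequence of pi, digit by digit, with no end in sight.
    def pi_digits():
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < n * t:
                yield n
                q, r, t, k, n, l = (10 * q, 10 * (r - n * t), t, k,
                                    (10 * (3 * q + r)) // t - 10 * n, l)
            else:
                q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                    (q * (7 * k + 2) + r * l) // (t * l), l + 2)

    gen = pi_digits()
    print([next(gen) for _ in range(10)])  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]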

Any formal system, or theory, in physics is a finite set of strings. Thus you can map it to the natural numbers and will always find that its complexity is low, at least compared with the complexity of random strings of the same size.
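
Kolmogorov complexity itself is uncomputable, but as a rough, computable proxy for this point an off-the-shelf compressor gives an upper bound on description length; the "axiom-like" string below is an invented stand-in for a theory's finite set of strings.

    import os
    import zlib

    structured = b"for all x, y: x + y = y + x; for all x: x + 0 = x; " * 200
    random_junk = os.urandom(len(structured))   # random string of the same size

    print(len(structured))                      # original size in bytes
    print(len(zlib.compress(structured, 9)))    # compresses to a small fraction
    print(len(zlib.compress(random_junk, 9)))   # stays close to the original size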

But to specifically answer your question: it is a theorem of model theory (the upward Löwenheim–Skolem theorem) that if a countable first-order theory has an infinite model, then for every infinite cardinal number $\kappa$ it has a model of size $\kappa$.

To make it short: for most theories you might come up with, there are infinitely many models that satisfy them, with any infinite cardinality of your choice.

  • This is called the Kolmogorov complexity. It too, however, is a model-dependent notion, because it depends on the description language chosen, which is an arbitrary parameter. – The_Sympathizer Aug 20 '19 at 23:35