4

In "On independence and large cardinal strength of physical statements" we see that there are physical statements which are independent of ZFC, and even of strong cardinal axioms. There were many answers, but unfortunately they do not lead to finite experiments whose outcomes are independent of ZFC.

We can get close by running a machine that searches for a contradiction in ZFC. If ZFC is inconsistent in our universe, then after a finite (possibly non-standard) amount of time, the machine will output a contradiction in ZFC. If ZFC is consistent in our universe, though, this machine will never halt, and running the machine for a finite amount of time is not guaranteed to answer the question (indeed, assuming ZFC is consistent in our universe, it won't).
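
For concreteness, here is a minimal sketch of such a machine in Python. The proof checker is a placeholder of my own naming, not a real library; only its existence as a finite mechanical procedure matters for the argument.

```python
from itertools import count

def derives_contradiction(candidate: bytes) -> bool:
    # Placeholder (assumption, not a real library): a real implementation
    # would mechanically check whether `candidate` encodes a valid ZFC
    # derivation of "0 = 1"; each such check is a finite syntactic procedure.
    return False

def search_for_inconsistency():
    # Enumerate all finite byte strings. The loop halts if and only if some
    # finite string encodes a ZFC proof of a contradiction.
    for n in count(1):
        candidate = n.to_bytes((n.bit_length() + 7) // 8, "big")
        if derives_contradiction(candidate):
            return candidate  # ZFC is inconsistent
```

With the placeholder always answering "no", the loop never halts, exactly as it would if ZFC is consistent.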

If we had a computer that could perform its first step in $1$ second, its second step in $\frac12$ seconds, its third step in $\frac14$ seconds, etc., we could perform the experiment in finite time. If ZFC is inconsistent, there will be some finite non-standard real number $r=\frac{2^{n}-1}{2^{n-1}}$ (where $n$ is the number of steps the machine runs for) such that the machine halts after $r$ seconds. If ZFC is consistent, it will not have halted after $r$ seconds for any $r \in [0,2)$. Either way, after $2$ seconds we will know the result.
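
As a quick sanity check on the arithmetic (a small Python computation, not part of the thought experiment itself), the total time elapsed after $n$ steps is the geometric partial sum $\sum_{k=0}^{n-1} 2^{-k} = \frac{2^{n}-1}{2^{n-1}}$, which is always strictly less than $2$:

```python
from fractions import Fraction

def elapsed_after(n_steps: int) -> Fraction:
    # Total time after n steps when step k takes 1/2**(k-1) seconds.
    return sum(Fraction(1, 2 ** k) for k in range(n_steps))

for n in (1, 2, 3, 10, 50):
    t = elapsed_after(n)
    assert t == Fraction(2 ** n - 1, 2 ** (n - 1))  # matches the closed form
    print(n, t, float(t))                           # 1, 3/2, 7/4, ... -> 2
```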

The problem is that, as far as we know, no such machine exists. Is there some other way we could devise an experiment whose result is independent of ZFC? I think it will have to involve the real numbers in some way, but I'm not sure.

Gerry Myerson
  • 39,024
  • 3
    If we believe a certain form of the Church-Turing thesis, any "experiment" of this kind amounts to performing a computation that could run on a Turing machine; if the machine stops in finite time, then this (and the value it computes) is a fact provable in a trivial version of arithmetic, so the only possibility is that the machine does not stop and ZFC cannot prove it, which is essentially what you consider. You can, of course, sugar-coat it in various ways (like hydra games for Peano arithmetic) but the gist will always be the same. – Gro-Tsen Nov 17 '17 at 23:42
  • @Gro-Tsen Wouldn't that form of the Church-Turing thesis imply that all measurable physical constants (like the Fine-structure constant) are computable? – Christopher King Nov 17 '17 at 23:49
  • What is non-standard about the real number $(2^n-1)/2^{n-1}$? – Gerry Myerson Nov 18 '17 at 01:56
  • @GerryMyerson Because $n$ is nonstandard (because it takes a nonstandard number of steps $n$ to find a contradiction in $ZFC$). – Christopher King Nov 18 '17 at 01:57
  • 2
    What is this use of non-standard numbers in your question? You should explain quite carefully how this is supposed to work. Until then I am voting to close because the question just invites mathematically imprecise speculation. – Andrej Bauer Nov 18 '17 at 10:38
  • @AndrejBauer I'm not "using" them. It's simply that there is no proof of standard length that ZFC is inconsistent, so any such proof would need to be of nonstandard length. See non-standard model. – Christopher King Nov 18 '17 at 18:27
  • That's an invalid inference, things don't work that way. By your reasoning: there is no standard number satisfying $x = x + 1$, therefore there is a non-standard number $n$ satisfying $n = n + 1$. – Andrej Bauer Nov 18 '17 at 18:50
  • 1
    @AndrejBauer If we are working in ZFC (or even PA), there is no non-standard $n$ satisfying $n = n+1$, because $n \neq n + 1$ is a theorem of ZFC. On the other hand, there is no proof of Con(ZFC) from ZFC, so there is a model of set theory that satisfies ZFC+$\lnot$Con(ZFC) (see the model existence theorem). In this model, there will be an object representing a proof of $0=1$ from the axioms of ZFC. Since presumably ZFC is consistent, this proof will be a non-standard object (i.e. it does not correspond to an actual proof). – Christopher King Nov 18 '17 at 19:21

2 Answers

3

Based on our current partial understanding of the laws of physics, it does appear that, to the extent that the results of physical experiments are governed by mathematical theories that we understand, they are computable, and in fact computable on a quantum computer manipulating at most $10^{122}$ bits. This means that the occurrences of real numbers in these theories do not actually introduce uncomputability.

Let me sketch how this is supposed to go. An amplitude of a particular outcome in a particular experiment might be expressed as a path integral, i.e. an integral over the infinite-dimensional space of paths a particle can take. To compute this, we could try to argue that this integral is well-approximated by an integral over a finite-dimensional space of piecewise linear paths, as long as the steps are sufficiently small. We would then argue that the integral over a finite-dimensional space could be approximated by the sum over a large number of random points in the space. By approximating the function we're integrating, we approximate the amplitude. (Of course it must be possible for physicists to computably approximate the predictions of their theories, so that they can test them in experiments. Methods like Feynman diagrams and lattice QFT enable them to do this in practice.)
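
To make this concrete, here is a toy Python illustration of the two approximation steps just described: restrict to piecewise-linear paths on finitely many time slices, then estimate the resulting finite-dimensional integral by averaging the integrand over random sample points. All the parameters (the harmonic potential, the box size, the slice count) are my own illustrative choices, not a real lattice computation.

```python
import math
import random

N = 8          # number of time slices
DT = 0.25      # width of each slice
BOX = 3.0      # integrate each intermediate coordinate over [-BOX, BOX]
SAMPLES = 100_000

def action(xs):
    # Discretized Euclidean action of a piecewise-linear path pinned at
    # x(0) = x(T) = 0: kinetic term plus a harmonic potential (midpoint rule).
    path = [0.0] + list(xs) + [0.0]
    s = 0.0
    for a, b in zip(path, path[1:]):
        s += 0.5 * (b - a) ** 2 / DT
        s += 0.5 * DT * (0.5 * (a + b)) ** 2
    return s

# Plain Monte Carlo: Z ~ volume * (average of exp(-S) over uniform samples).
volume = (2 * BOX) ** (N - 1)
mean = sum(
    math.exp(-action([random.uniform(-BOX, BOX) for _ in range(N - 1)]))
    for _ in range(SAMPLES)
) / SAMPLES
print("Monte Carlo estimate of the toy path integral:", volume * mean)
```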

Now the key point is that whatever experiment we perform only allows us to approximate the amplitude. If we perform an experiment $n$ times, we can estimate the probabilities, and thus the amplitudes, to within an error of $\sim 1/\sqrt{n}$. So by performing the experiment $2^k$ times we only learn about $k$ bits of the amplitude. Moreover, because the cosmological constant is positive, only a bounded amount of matter can fit in our observable universe, containing a bounded amount of entropy, with which we can only perform the experiment a bounded number of times before the heat death of the universe.
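
Here is a quick numerical illustration of that $\sim 1/\sqrt{n}$ scaling (a Python toy with an arbitrary, made-up probability, assumed purely for illustration):

```python
import math
import random

p = 0.3735  # made-up "true" probability of the outcome being estimated
for n in (100, 1_000, 10_000):
    runs = 400
    # Average absolute estimation error over repeated runs of n trials each.
    errs = [abs(sum(random.random() < p for _ in range(n)) / n - p)
            for _ in range(runs)]
    print(n, sum(errs) / runs, 1 / math.sqrt(n))  # error tracks 1/sqrt(n)
```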

This phenomenon is also what shields the exact values of physical constants from our knowledge. Only a bounded number of bits of the fine structure constant need to be known to predict every experiment that occurs in our universe.

Beyond the difficulties presented by the actual rules of the actual universe, there are a couple of philosophical difficulties with your idea. The first is that, if such a theory were true, we would have difficulty testing it, as its predictions would be independent of ZFC. So what you would require is a physical phenomenon whose behavior is described by an elegant mathematical model, which for some experiments produces predictions that can be calculated within ZFC, and for other experiments produces predictions that are independent of ZFC. Furthermore, we would have to be highly confident that no other, more mathematically sedate theory predicts the same behavior on the first set of experiments via a different mathematical formalism, e.g. by using whatever mathematical tools we were using to guess the answer.

I think sufficiently strong evidence for a theory of this type would indeed convince people of what you want, although there would still be dissenting views - for instance that the universe is a simulation and that the answers reflect only what the creators of the simulation think the answers to these ZFC-independent problems are, rather than the true answers. There is just no evidence I am aware of for any such theories.

Will Sawin
  • 135,926
  • Can you offer some evidence for the first sentence of your answer? For instance, a list of "laws of physics"? – Andrej Bauer Nov 18 '17 at 10:04
  • @AndrejBauer I don't know what you mean by a list of laws of physics. The evidence is provided by (1) the existence of computable approximations to many physical theories, for instance by summing Feynman diagrams or by lattice QFT, which when computed agree well with experiments, (2) the lack of any fundamental difference between these theories and theories of, e.g., quantum gravity, that would make one computable and the other not, (3) the Bekenstein bound, which suggests that the observable universe can be described by a quantum Hilbert space of dimension $\approx 2^{10^{122}}$. – Will Sawin Nov 18 '17 at 10:27
  • My point is that I don't think there is such a thing as a "complete list of the laws of physics", at least not a coherent one. You can look at this or that aspect of mathematical models used by physicists and notice that they are computable (in a suitable sense). I agree that we get some amount of evidence for computability this way, but we should discuss how reliable it is. – Andrej Bauer Nov 18 '17 at 10:33
  • But more importantly, what does computability have to do with independence? Turing machines are computable, yet there are questions about them that are independent of ZFC. – Andrej Bauer Nov 18 '17 at 10:36
  • @AndrejBauer Turing machines use an unbounded number of bits, whereas the universe uses only a bounded number. – Will Sawin Nov 18 '17 at 10:40
  • @AndrejBauer: Do you doubt that there is such a list or that we know the list? The former seems sufficiently strange to me that I'd be tempted to turn it around and ask you what evidence you can offer for that claim. The latter is unimportant to Will Sawin's Argument if I understand it correctly. – Johannes Hahn Nov 18 '17 at 10:58
  • @JohannesHahn: I think it's meaningless to discuss "is there a list of laws governing the universe". It's only meaningful to discuss "are we able to formulate our explanation and understanding of the universe using a list of laws expressed in a certain well-delineated form". But I think it's even worse than that: the physicists are not able to agree on and compile a precise list of all laws of physics and proclaim it "official". And they are quite right in doing so. But we're drifting away from the question. – Andrej Bauer Nov 18 '17 at 11:39
  • One thing to consider is that Chaitin's constant might be measurable, potentially. Then we would need only a finite number of bits to measure the consistency of ZFC, for example. – Christopher King Nov 20 '17 at 18:43
  • @PyRulez Chaitin's constant could only be physically measurable if (1) we discover shocking new laws of physics or (2) our understanding of the currently known laws of physics is badly wrong. – Will Sawin Nov 20 '17 at 18:55
1

There's an example due to Leonid Levin https://www.math.ias.edu/csdm/02-03/abstracts:

As is well known, the absence of algorithmic solutions is no obstacle when the requirements do not make a solution unique. A notable example is generating strings of linear Kolmogorov complexity, e.g., those that cannot be compressed to half their length. Algorithms fail, but a set of dice does a perfect job!

The assertion that such a string has Kolmogorov complexity greater than half its length is almost certainly independent of any reasonable axiom system.
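
Here is a minimal Python sketch of the dice argument (the size parameter is illustrative; the counting bound in the comments is the standard one):

```python
import secrets

# "A set of dice does a perfect job": draw 2n random bits to get a string x.
# Fewer than 2**n programs have length below n, so fewer than 2**n strings of
# length 2n can be compressed below n bits, and therefore
#     Pr[K(x) < n] < 2**n / 2**(2 * n) = 2**(-n).
# With overwhelming probability x cannot be compressed to half its length,
# even though no algorithm can certify this for the particular x obtained.
n = 64                               # illustrative size, in bits
x = secrets.token_bytes(2 * n // 8)  # 2n = 128 random bits
print(x.hex())
```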

user117444
  • 11
  • 1