
Note: the title was modified. The previous title was "Every theorem $t$ has a proof no more complex than $|t|$. Is this right?"

The question ("Is Kolmogorov complexity (KC) relevant for proof theory?") arises because "every theorem has a proof with low Kolmogorov complexity". To be more specific:

Every theorem $t$ has a proof $p$ with $K(p)\leq K(t)+c\leq|t|+c'$, where $c$ and $c'$ are constants.
Proof: Assume the formal system has a proof-checking algorithm $C(x)$ that outputs TRUE if $x$ is a correct proof and FALSE otherwise. Then there is a fixed Turing machine $M$ (depending only on the formal system) that takes as input a string $t$ (a candidate theorem) and works as follows:
- $M$ enumerates all strings in order and runs $C$ on each one, until (and unless) a proof of $t$ is found.
- When a proof $p$ of $t$ is found, $M$ prints $p$ and halts.
Thus, if $t$ is a theorem, $M(t)$ prints a proof of $t$.
Since $p = M(t)$, any description of $t$ together with the (fixed-size) machine $M$ is a description of $p$; using the universal TM, we get $K(p) \leq K(t) + c \leq |t| + c'$, where $c$ and $c'$ are constants.
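The machine $M$ can be sketched in a few lines. The checker below is a deliberately toy "formal system" (a hypothetical stand-in, not a real proof calculus): a theorem is a string like "2+3=5" and its "proof" is a tally of the right number of strokes. The enumeration strategy, however, is exactly the one in the argument above: try all candidate strings in length-lexicographic order until the checker accepts.

```python
from itertools import count, product

# Toy "proof checker" (an illustrative assumption, not a real formal system):
# a theorem "m+n=k" (single digits) is proved by a tally of exactly k strokes.
def check(theorem: str, proof: str) -> bool:
    m, rest = theorem.split("+")
    n, k = rest.split("=")
    return int(m) + int(n) == int(k) == len(proof) and set(proof) <= {"|"}

# The machine M from the argument: brute-force search over all candidate
# proofs. It halts iff `theorem` actually has a proof; otherwise it loops.
def M(theorem: str) -> str:
    alphabet = "|"  # the toy system's proof alphabet
    for length in count(0):
        for chars in product(alphabet, repeat=length):
            p = "".join(chars)
            if check(theorem, p):
                return p
```

The point of the inequality is visible here: `M` itself has constant size, so describing the proof `M("2+3=5")` costs no more than describing the theorem string plus that constant.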


Thus, up to an additive constant, no theorem needs a proof with Kolmogorov complexity greater than $|t|$. To me this seems a bit strange.

In other words, the number of bits of "inspiration" (or "non-deterministic bits", or "oracle bits") needed to prove any theorem $t$ is at most $|t|+c$.

If this is true, the very lengthy proofs that are sometimes needed for simply stated theorems must be very regular / structured / compressible (I use these as synonyms).

  • Very interesting observation! – Joel David Hamkins Jan 24 '14 at 13:22
  • 3
    The technical part is correct, but I disagree with the interpretation. The fact that a proof can be (in principle) found by exhaustive search does not make it in any way regular or structured. – Emil Jeřábek Jan 24 '14 at 13:24
  • 13
    This is why people don't use Kolmogorov complexity to measure proof length: it's not very meaningful to compress a proof of the classification of finite simple groups all the way down to essentially saying "it's the shortest proof of the classification of finite simple groups". I wouldn't interpret Kolmogorov complexity as "bits of inspiration", though, since almost all these bits are just stating the theorem and describing the formal proof system. All you need is one bit of inspiration: if you know there is a proof, then you can definitely find it by brute force if you search long enough. – Henry Cohn Jan 24 '14 at 13:55
  • 3
    What exactly is the question here? (Also, Henry's comment is spot-on.) – Noah Schweber Jan 24 '14 at 18:37
  • 1
    Would you say that the digits of $\pi$ are "very regular, structured, and compressible"? After all, the first $n$ digits of $\pi$ have Kolmogorov complexity at most $\log(n) + c$. I guess they compress pretty well, but they don't look regular to me... Perhaps what makes your thesis unsatisfying is that you don't measure the other resources required to run the program that finds the proof. – Sam Nead Jan 24 '14 at 19:31
  • Why are you sure that such a TM exists? – Sh.M1972 Jan 25 '14 at 10:48
  • 1
    I think a better measure for proof theory would actually be time and space complexity, in the computational sense. In the KC sense, a description of just $k$ bits can produce a $BB(k)$-length string (busy beaver function). So perhaps there is "order", but it is very hard to work with. On the other hand, if you can make a time-complexity statement, this may be a much more reasonable measure of regularity or simplicity. – usul Jan 25 '14 at 13:14
  • E.g. consider a theorem of the form "For all $x$ there exists $y$". A constructive proof actually produces a $y$ from any given $x$, i.e. is an algorithm. If this algorithm runs in polynomial time, then surely the theorem is very simple. If it can produce a certificate that it has produced a valid $y$, and this certificate may be checked in polynomial time, then surely the theorem is not too complicated. (For a concrete example, a proof of Brouwer might construct a fixpoint. I can probably not do so in polynomial time, but the fact that my fixpoint is correct can be checked in polytime.) – usul Jan 25 '14 at 13:31
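Sam Nead's point about $\pi$ can be made concrete: a fixed, short program plus the number $n$ (about $\log n$ bits) regenerates the first $n$ digits, which is exactly why their Kolmogorov complexity is at most $\log(n) + c$ even though the digit stream looks irregular. A minimal sketch, using Gibbons' unbounded spigot algorithm for $\pi$:

```python
def pi_digits(num):
    """First `num` decimal digits of pi, via Gibbons' unbounded
    spigot algorithm (streams digits using only integer arithmetic)."""
    out = []
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while len(out) < num:
        if 4*q + r - t < n*t:
            # The next digit n is now certain; emit it and rescale.
            out.append(n)
            q, r, n = 10*q, 10*(r - n*t), (10*(3*q + r))//t - 10*n
        else:
            # Consume the next term of the series to narrow the interval.
            q, r, t, k, n, l = (q*k, (2*q + r)*l, t*l, k + 1,
                                (q*(7*k + 2) + r*l)//(t*l), l + 2)
    return out

# pi_digits(10) → [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

The program's size is a fixed constant $c$; only the argument `num` grows with the output, mirroring the $K(p) \leq K(t) + c$ bound in the question: compressible is not the same as regular-looking.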

1 Answer


After making the above comment, I did a quick Google search and found a paper entitled "Kolmogorov Complexity and Computational Complexity" by Lance Fortnow. The paper is about resource-bounded variants of Kolmogorov complexity. After giving the standard definition and some consequences, he writes the following.

Of course this definition of Kolmogorov complexity makes any complexity theorist cringe: what good is a small program [for] $x$ if it takes the life of the universe to produce $x$?

Perhaps he could give you more precise pointers to the literature.

[Edit] - Your main observation has been made before, by Henry Cohn, on MathOverflow. See the last paragraph of his answer here.

Sam Nead