7

I tried to read Prof. 't Hooft's new paper The Cellular Automaton Interpretation of Quantum Mechanics: A View on the Quantum Nature of our Universe, Compulsory or Impossible? and had difficulty understanding the motivation for introducing ontological states and cogwheel models.

If quantum mechanics is deterministic, the probabilistic nature of Born's rule must be an artifact. That is, Born's rule would effectively amount to throwing a classical die: the probability comes from our incomplete knowledge and limited computational power.

In another paper, How a wave function can collapse without violating Schroedinger's equation, and how to understand Born's rule, it is stated that

According to our ontological theory of quantum mechanics, the probabilities generated by Born’s rule, are to be interpreted exactly in the same terms. If we do not know the initial state with infinite accuracy then we won’t be able to predict the final state any better than that.

I am fine with all that. However, in "The Cellular Automaton Interpretation of Quantum Mechanics", if I understood correctly, Prof. 't Hooft constructed a series of cogwheel models to show that these deterministic models obey a Schrödinger equation.

My question is: what is the motivation for introducing ontological states and cogwheel models? Would the Schrödinger equation itself be sufficient, since it is already deterministic anyway? If one wants to get rid of Bell's inequality, the Schrödinger equation seems to be sufficient (related post: Why was quantum mechanics regarded as a non-deterministic theory?). Could one even try to derive Born's rule from it?

If one feels the Schrödinger equation is insufficient, i.e. that there is something behind it, why is the object behind the Schrödinger equation so essential? I think I missed some important aspect of his paper... (presumably I did not read it carefully enough.)

user26143
  • 6,331
  • 1
    I gave an answer to a similar question a while ago (since migrated to Philosophy.SE) which you might find helpful. http://philosophy.stackexchange.com/q/6670/ – N. Virgo Jun 12 '14 at 07:28
  • Thank you very much, though I am not sure whether it is helpful for understanding 't Hooft's approach... – user26143 Jun 12 '14 at 07:30
  • From reading a few of 't Hooft's papers a while ago, I got the impression his motivation was very similar to what I wrote. I guess the part I didn't answer was "would the Schrodinger equation itself to be sufficient, since it is already deterministic anyway?". If I get time I will try to formulate a good answer to that and post it. – N. Virgo Jun 12 '14 at 07:39
  • Related https://physics.stackexchange.com/q/748384/226902 https://physics.stackexchange.com/q/34217/226902 – Quillo Feb 02 '23 at 23:49

1 Answer

3

Thanks for the links, I will read through them.

Ontological states are somehow the "real" states of the underlying deterministic system, to be distinguished from quantum states, which form a superset: an ontological state can be a quantum state, but the other way round is not always true. There is still a bit of unclarity for me here. I would start from the perspective that the underlying theory may have states in a totally different domain than the quantum theory; but you can argue that experimental results should match, and that this is easier if some states match. The most simple case is therefore the one where they do.

The standard approach was to do this starting from some microscopic hypothesis (e.g. Bohm) and build it up from there. This was rejected as politically incorrect because it introduces hidden variables and ad hoc hypotheses. Ideally this approach should be able to derive the Schrödinger equation, Bell's inequalities, and all the rest.

In the approach presented here he is saying, more or less: whatever theory/system lies below this system and theory, it will have states and transitions between states. That is why the cogwheel models.
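The simplest such model can be sketched numerically. This is a toy illustration, assuming the basic N-state cogwheel from 't Hooft's paper (a system that deterministically steps through N states in a cycle); the variable names and the choice N = 3 are mine:

```python
import numpy as np

# Toy cogwheel: a deterministic system cycling through N ontological
# states, 0 -> 1 -> ... -> N-1 -> 0. Illustrative sketch only.
N = 3

# One discrete time step is the cyclic permutation matrix U acting on
# the N basis (ontological) states: U maps state k to state k+1 mod N.
U = np.roll(np.eye(N), 1, axis=0)

# U is unitary, so it can be read as U = exp(-i H dt) for some
# Hermitian H, i.e. a Schroedinger-type evolution law.
assert np.allclose(U @ U.conj().T, np.eye(N))

# Diagonalising U gives eigenvalues exp(-2*pi*i*k/N): the deterministic
# cogwheel has a discrete, equally spaced "energy" spectrum, exactly
# like a quantum system evolving under a Schroedinger equation.
eigvals = np.linalg.eigvals(U)
phases = np.sort(np.mod(-np.angle(eigvals) / (2 * np.pi), 1))
print(phases)  # -> [0, 1/3, 2/3], i.e. k/N for k = 0, ..., N-1
```

The point of the exercise is that superpositions of ontological states (the eigenvectors of U) behave exactly like quantum energy eigenstates, even though the underlying dynamics is a classical permutation.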

If the theory below is deterministic, fine: then there will be fewer states available to choose from.

Therefore, approaching it top-down from the quantum perspective going down, the quantum world is somehow a continuous set of states plus a discrete set of them. It is then politically correct to say that maybe we need to throw some of them away: for example, two atoms at opposite ends of the solar system cannot really do strange swapping or tunnelling effects between the two.

Therefore a reasonably generic theory of the below is in any case some shape of cogwheel theory, and insofar as we can reasonably derive something for all these theories, we can further investigate the mapping between the above and the below, and somehow derive what a correct hypothesis about the below is.

There is a legitimate separate discussion about the entanglement/Bell part, which needs to be justified and expanded.

One example of the below is the hypothesis of the discretization of time: we do not really know experimentally whether time is discrete or not beyond a certain scale, and he glances over that.

Once you have a brand new fundamental hypothesis about the below (the time one is just one of many, and has a couple of big names on it, but there is one per armchair physicist on this planet), you can try to achieve the next goal, which is a non-Lagrangian/non-perturbative theory of the below, from which ideally you want to derive the rest.

What "the rest" is, for 't Hooft, is substantially what has not been achieved with strings, and is a kind of back-to-basics for everybody.

Only a few people can take these directions of research, because of the "heretics" issue; Susskind, in the last-but-one issue of Nature, is another example.

  • user @george.di commented : " very good answer flyredeagle!! " as an answer that was deleted because answers are not for comments. – anna v May 16 '19 at 06:00