12

To my understanding, there is currently no scientific consensus on which interpretation of quantum physics is the correct one, if any. The most famous, perhaps for historical reasons, is the Copenhagen interpretation, though I am never sure how widely it is actually accepted.

Recently, I've come across a paper titled Probability in Quantum Theory by E.T. Jaynes, in which Jaynes argues that the Copenhagen interpretation is a prime example of a scrambling of epistemological (to do with inference) and ontological (to do with physical reality) views.

To supply some context: Jaynes had much success solving long-standing problems in a variety of fields using his interpretation of probability as an extension of logic (it basically comes down to objective Bayesian inference). To Jaynes, probabilities represent only one's incomplete information about a system, a view that obviously encounters problems in quantum mechanics, where probabilities seem fundamental.
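To make the epistemic reading concrete, here is a minimal sketch (my own toy example, not from the paper) of probability as bookkeeping for incomplete information: a discrete Bayesian update of a coin's unknown bias, starting from an indifference prior. Each observation changes the probabilities, but only because our information changes, not the coin.

```python
def bayes_update(prior, likelihood):
    """Return the normalized posterior over the same hypothesis grid."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

# Hypothesis space: possible biases of a coin.
biases = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = [1 / len(biases)] * len(biases)  # principle-of-indifference prior

# Observe heads three times; each observation sharpens the posterior.
posterior = prior
for _ in range(3):
    posterior = bayes_update(posterior, biases)  # P(heads | bias) = bias
```

After three heads, the posterior concentrates on the high-bias hypotheses; nothing about the coin itself has changed.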

Now, with regard to the paper: I want to focus attention particularly on pages 4-11, where the main argument is presented. If I'm understanding correctly, Jaynes' argument is not specific to the Copenhagen interpretation per se. Rather, he is using it as a convenient 'target', if you will, to make his point that probabilities in quantum mechanics are fundamentally different from classical probabilities (as is well known), and that the obvious next step is then to look for a new hypothesis space in which probabilities actually represent inferences.

The paper contains some mentions of the historical Bohr-Einstein (EPR) debate, and claims that much of the confusion surrounding it is the result of this scrambling of epistemological and ontological statements.

On to my specific issue, then: I haven't been able to find any criticism of Jaynes' arguments, nor have I found many references that continue in the (to me rather promising) direction suggested on page 10: 'to exhibit the variables of the deeper hypothesis space explicitly'. If Jaynes is in any way correct, then this seems like a promising direction for future research, and so I am surprised to see so few papers published on the subject.

Am I missing any obvious criticism and/or publications? Or has this point of view simply become a victim of its controversial stature and the fact that Jaynes himself died a short time after its publication?

P.S. For extra context, this question may be seen as a follow-up to 'Has Jaynes' argument against Bell's theorem been debunked?'. One of the comments there linked to this blog post, contending that Jaynes was indeed wrong in his assertion that a hidden variable theory could explain quantum mechanics. I agree with this criticism, but the paper that is the subject of my current question seems to take a whole new angle of attack. I'm not certain that this is the case (it might be that the papers linked in the blog post also debunk this new argument by Jaynes), so I would also appreciate any response that clears this up.

Timsey
  • 1,007

2 Answers

4

I think the most interesting approach in this direction is Caticha's entropic dynamics, for example in his "Entropic Dynamics, Time and Quantum Theory", arXiv:1005.2357. "Quantum mechanics is derived as an application of the method of maximum entropy. No appeal is made to any underlying classical action principle whether deterministic or stochastic. ... Both the magnitude and the phase of the wave function are given statistical interpretations: the magnitude gives the distribution of x in agreement with the usual Born rule and the phase carries information about the entropy S(x) of the extra variables."
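To make the maximum-entropy method concrete, here is a minimal sketch (a standard textbook example, not taken from Caticha's paper): Jaynes' dice problem, where entropy is maximized subject to a mean constraint. The solution has the exponential (Gibbs-type) form p_i ∝ exp(-λ x_i), and the Lagrange multiplier λ is found here by simple bisection.

```python
import math

def maxent_dist(values, mean_target, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over `values` with a fixed mean.
    The solution has the exponential form p_i ∝ exp(-lam * x_i);
    the Lagrange multiplier `lam` is located by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    # mean_for is strictly decreasing in lam, so bisect on lam.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' classic example: a six-sided die constrained to mean 4.5.
p = maxent_dist([1, 2, 3, 4, 5, 6], 4.5)
```

Because the constrained mean (4.5) exceeds the uniform mean (3.5), the resulting distribution tilts monotonically toward the high faces, exactly as Jaynes' analysis predicts.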

So, this looks like a successful implementation of Jaynes' hopes. I think some other modern approaches from the quantum information community, like G. Chiribella, G.M. D'Ariano, P. Perinotti, Quantum from principles, arXiv:1506.00398, are also not that far away from Jaynes' ideas.

Schmelzer
  • 403
  • it does not seem to have ever been published in a peer reviewed journal and the name "Bell" does not appear in the text – anna v Nov 12 '17 at 06:27
  • Caticha is published, http://iopscience.iop.org/article/10.1088/1751-8113/44/22/225303/meta Chiribella et al https://link.springer.com/book/10.1007%2F978-94-017-7303-4 and given the formulation of the question I don't understand why mentioning Bell would be obligatory to the development of Jaynes approach toward an interpretation of QT as a theory of inference. – Schmelzer Nov 13 '17 at 13:21
  • Bell is mentioned in the latest edit of the question, and it is Bells theorem that limits deterministic models to non local models afaik. – anna v Nov 13 '17 at 15:19
  • Fine, but as I see it, "Bell" is only in a PS in reference to another question where Bell is considered in more detail. Of course, reasonable proposals for an interpretation of QT as inference has to be realistic anyway (last but not least, it makes no sense to make inferences about something which does not really exist), and therefore has to be non-Einstein-local (to name that nonlocal is completely misleading, a local theory can have 100000 c as a maximal speed of information transfer but would be nonetheless local). – Schmelzer Nov 13 '17 at 15:39
  • @anna I'd like to note that Bell's theorem, or more so the Copenhagen interpretation, also limits non-deterministic models to non-local models due to the global, instantaneous collapse of the wavefunction. – Mike Flynn Jan 06 '19 at 05:50
  • @MikeFlynn it is all mathematical formulas. The wave function is not a balloon. When the boundary conditions change, a different wave function has to be calculated with the new boundary conditions. It is as instantaneous as the change of the Feynman diagrams entering the calculations. – anna v Jan 06 '19 at 06:51
  • @annav I disagree - it's not all mathematical formulas. After all, we are talking specifically about the interpretation of those mathematical formulas. And if you subscribe to the Copenhagen interpretation, you believe that upon measurement there is a non-local discontinuity in the wavefunction at the time of measurement that leads to the new wavefunction. – Mike Flynn Jan 06 '19 at 07:51
0

I have spent many years trying to formulate a realist account of (non-relativistic) quantum mechanics using a rational-Bayesian approach to probability much like that of Jaynes. I was also motivated by Feynman's idea that probabilities might be better represented by complex numbers rather than by non-negative real numbers, as is traditionally done.
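A minimal sketch of why complex-valued amplitudes behave unlike ordinary probabilities (this is the standard two-path interference calculation, not my theory itself): when two alternatives are combined, the amplitudes add before the modulus is squared, so the "probabilities" can interfere and even cancel entirely.

```python
import cmath
import math

def classical_sum(a1, a2):
    """Classical rule: probabilities of exclusive alternatives add."""
    return abs(a1) ** 2 + abs(a2) ** 2

def quantum_sum(a1, a2):
    """Quantum rule: amplitudes add first, then square the modulus."""
    return abs(a1 + a2) ** 2

# Two equal-weight paths; the second carries a relative phase phi.
a1 = 1 / math.sqrt(2)
for phi in (0.0, math.pi / 2, math.pi):
    a2 = cmath.exp(1j * phi) / math.sqrt(2)
    p_quantum = quantum_sum(a1, a2)  # equals 1 + cos(phi)
    # classical_sum(a1, a2) is always 1, independent of phi
```

At phi = pi the two paths cancel completely, which no assignment of non-negative real probabilities to the individual paths can reproduce.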

I take all the usual properties (particle positions, spins, momenta etc.) to be actually possessed by a system, not just a product of measurements. I have sought the simplest complex-valued probability theory, and the simplest possible physical laws (mostly of a general character), that will reproduce the usual quantum mechanical formalism. The 'new' theory of probability includes extensions of the methods employed by Jaynes to find prior probabilities (the principle of indifference and the method of transformation groups). The principle of maximum entropy is also employed.

In this way, I think light is cast on the physical nature of particles, allowing a more precise and realist physical picture of systems to be formed in the mind. For example, an electron is always at a definite point (not somehow at two or more points at once) and moves continuously through space, though not, it seems, on a smooth path even at the smallest scales. Momentum seems to be an internal property of an electron, like its spin angular momentum, and so on.

The physical laws assumed are nothing like sufficient to predict particle motions fully, but given knowledge of certain particle properties, we can calculate the probabilities of others by Bayesian methods. Measurements can be modelled and the collapse of the wave function follows naturally when we acquire new knowledge based on observing (or failing to observe) particles moving in the classical limit.

A new aspect of the probability theory relates to the meaning of a (complex-valued) probability whose squared modulus equals 1. Only if the phase of the probability is determinate is the event in question certain to occur. Otherwise we may only expect it to occur. It might be that the acquisition of the knowledge we hold has caused it to occur, or that we should just logically expect it to occur (with the caveat that we might be wrong). Using this (very Bayesian!) rule, it seems possible to resolve the arguments (Bell inequalities, Kochen-Specker paradox) that claim to show real possession of properties can only lead to contradictions.

If anyone is interested in looking at this work, I can send them a pdf containing it. See the abstract to the publication in my LinkedIn profile.

  • +1 for the courage :) There are too many John Hemps on LinkedIn to find you. Could you upload the work e.g. to arXiv and link here, or at least provide the link to your profile? – Hennadii Madan Jan 09 '19 at 05:17
  • Thanks for your interest Hennadii. You can find the pdf free at http://vixra.org/abs/1802.0030 – John Hemp Jan 10 '19 at 17:22