I am trying to understand what decides the outcome of an experiment, and whether there is any theory (e.g. a non-local hidden-variable theory) that is able to predict the outcome.
3 Answers
According to the Everett interpretation, unitary evolution of the wavefunction predicts that in the outcome state of an experiment there will be multiple versions of you in superposition, each of you seeing one outcome, none of you able to see any of the others. (See here for an outline of how.)
This interpretation is local, deterministic, and assumes only the standard unitary wavefunction evolution that other interpretations agree applies between observations.
But because none of the different versions of you can see any of the others, each one can assert that all the others vanish, leaving them and their observation as the lone outcome of the experiment. From each observer-instance's point of view, the surviving outcome appears to be random. Since nobody can say how or why the other alternatives disappear (or, given that they are mutually unobservable, even tell whether or not they have), there is no way to explain why this particular outcome remains and not any other.
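Schematically (a standard textbook sketch, not something specific to this answer): if the measured system starts in a superposition and the measurement interaction is purely unitary, the observer simply becomes correlated with each term, and nothing in the evolution singles one term out:

$$(\alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle)\,|O_{\text{ready}}\rangle \;\longrightarrow\; \alpha\,|{\uparrow}\rangle\,|O_{\text{sees}\,\uparrow}\rangle \;+\; \beta\,|{\downarrow}\rangle\,|O_{\text{sees}\,\downarrow}\rangle$$

Each term on the right is one "version of you" seeing one outcome.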
Non-local theories (hidden variable or otherwise) posit that effects propagate non-locally, faster than light and (in some reference frames) backwards in time. This obviously raises problems with the word "predict", as the time-order of events is not defined. In some sense, the outcome of the experiment predicts the experiment that will be done.
Hidden variable theories hypothesize that there is some underlying mechanism that deterministically decides the outcome based on the input conditions; some of that input information must propagate faster than light, and some of it cannot currently (or perhaps ever) be observed (i.e. it is hidden). I haven't seen any proposals for what this mechanism might be, and given that some parts of it have to be superluminal, it would not be part of current mainstream physics. It seems their existence is inferred only from the assumptions that physics should be deterministic and that experiments should yield only one outcome.
Given that no specific mechanism has yet been identified, we can't actually make any predictions right now - but if the hypothesised theory/mechanism actually exists, and we assume a God's-eye view of the hidden variables, then such a theory would by definition predict the outcomes of experiments.
But until we know what the theory actually is, we can't say how.
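As an illustration of why any such mechanism would have to be superluminal (my own toy sketch, not part of any specific proposal): Bell's theorem says no local deterministic assignment of outcomes can reproduce the quantum correlations. The snippet below compares the CHSH quantity for an arbitrarily chosen local hidden-variable model against the quantum-mechanical prediction for the singlet state; the local model cannot exceed 2, while quantum mechanics predicts $2\sqrt{2}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# CHSH measurement settings: Alice uses angles a, a_p; Bob uses b, b_p.
a, a_p = 0.0, np.pi / 2
b, b_p = np.pi / 4, -np.pi / 4

def correlation_local(x, y, n=200_000):
    """Correlation E(x, y) for a toy *local* deterministic model: each side's
    +/-1 outcome depends only on its own setting and a shared hidden angle."""
    lam = rng.uniform(0.0, 2.0 * np.pi, n)       # the hidden variable
    A = np.where(np.cos(x - lam) >= 0.0, 1, -1)  # Alice's local outcome
    B = np.where(np.cos(y - lam) >= 0.0, 1, -1)  # Bob's local outcome
    return np.mean(A * B)

def correlation_qm(x, y):
    """Quantum prediction for the singlet state: E(x, y) = -cos(x - y)."""
    return -np.cos(x - y)

def chsh(E):
    """CHSH combination: at most 2 for any local model, 2*sqrt(2) for QM."""
    return abs(E(a, b) + E(a, b_p) + E(a_p, b) - E(a_p, b_p))

print("toy local hidden-variable model:", chsh(correlation_local))  # about 2.0
print("quantum mechanics:              ", chsh(correlation_qm))     # about 2.83
```

However the local response functions are chosen, the first number stays at or below 2; reproducing the second requires the non-local (or otherwise non-classical) ingredient discussed above.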
-
Correction, hidden variable theories do not posit backwards-in-time causation, which is a simple contradiction in terms. If they posit faster-than-light signals, the very existence of such signals would completely upset the basis on which a backwards-in-time result is implied, and so render the implication moot. Also, with regard to Everett, the main objection there is that it may be local and deterministic, but it isn't real. – Steve Jun 03 '23 at 13:07
-
I didn't say hidden-variable theories posited backwards-in-time causation, I said non-local theories did. Non-local hidden-variable theories posit backwards-in-time causation. Local hidden-variable theories don't, but are experimentally disproved by Aspect's confirmation of the violation of Bell's inequalities. I agree backwards-in-time causality is paradoxical, but it seems physicists still prefer to propose it (i.e. non-local theories) rather than accept Everett's deterministic, local, reversible, realist interpretation. (Which has its own problems, but surely none as bad as FTL.) – Nullius in Verba Jun 03 '23 at 13:28
-
Well all hidden variable theories must be non-local (per JS Bell), so it amounts to all the same thing. Nobody posits backwards-in-time causation - it's an implication of faster-than-light within the terms of how time and causation are defined in certain interpretations. If faster-than-light were actually shown to exist (or became a necessary implication of another theory), it would simply knock out the Einsteinian (block-universe) interpretation of relativity, and leave only the Lorentzian (wave-medium) interpretation remaining. – Steve Jun 03 '23 at 14:07
-
MWI is trivially wrong. All you have to do is to read Everett's thesis and you will find his mistake in the second sentence. He simply mistook the ensemble for the individual system. That is what leads to an infinite multiplication of worlds. – FlatterMann Jun 03 '23 at 15:01
-
@FlatterMann MWI maybe trivially wrong, I for one would not know, but here is sentence #2 in Everett's thesis: "A physical system is described completely by a state function $\psi$, which is an element of a Hilbert space, and which furthermore gives information only concerning the probabilities of the results of various observations which can be made on the system." What is trivially wrong here? – hyportnex Jun 03 '23 at 19:47
-
@hyportnex It is trivially wrong that the wave function describes the state of a single system. It does not even describe an unmeasured ensemble of systems as the example of the density matrix shows. I suspect that Everett copied this language unquestioned from someone who took von Neumann a little bit too literally. He might have gotten it from von Neumann's book itself. I don't have a copy and the full text doesn't seem to be online (if it is, I would love to have a link). Wherever this mistake originated, it's high time to eliminate it from the public discourse. – FlatterMann Jun 03 '23 at 19:52
-
@FlatterMann The language is probably Wigner's, his teacher, but I think if it was that trivially wrong both Wigner and Neumann would have seen and accepted its wrongness, just me thinking. Naah, it is not that trivial. Anyhow, the smartest people, since the time of Fermat and Laplace, such as Mises, Kolmogorov, Jaynes have been arguing over the meaning of probability and it has not settled. See https://www.degruyter.com/document/doi/10.1515/9781400868056/html – hyportnex Jun 03 '23 at 20:15
-
@hyportnex As an experimentalist I don't ever have to deal with probabilities. I only have to deal with frequencies and histograms. This is most obvious if we are looking at experimental high energy physics data, which very often comes in form of a histogram (at least before we make a theoretical fit to it), but I have done plenty of histograms in spectroscopy as well. Basically everything at the single quantum measurement level reduces to finite counts. I think that clarifies the ontology of QM quite a bit. Nature doesn't do probability. Only the mathematicians like von Neumann do. – FlatterMann Jun 03 '23 at 20:18
-
@FlatterMann I do not want to stretch the patience of the readers of this post, I just want to make one more comment regarding that you are an experimentalist. I used to be engineer, radar/comms, and I had to deal with both frequencies (objective) AND probabilities (subjective). The bit error rate of a comm system is objective, repeatable as many times as you wish, the probability of radar detection and/or false alarm set to be say, $10^{-9}$, of an incoming nuclear tipped ballistic missile is obviously more subjective, fortunately. – hyportnex Jun 03 '23 at 20:28
-
Let us continue this discussion in chat. – FlatterMann Jun 03 '23 at 22:48
-
As multi-verse is not measurable, I am not able to subscribe to it. I would like to focus on theories that can predict. If we bounce a ball in a 3D room, it will fall at different points on the 2D floor. For a physicist living only in 2D, the cause of the appearance of the ball at different locations would be hidden but possible to discover based on the 2D projection. I would like to know if any attempt has been made to describe the space in which the particles actually reside. The wave function in Hilbert space is like a portal to the space where particles reside. – Rajaram Venkataramani Jun 04 '23 at 08:21
So-called hidden variable theories posit the existence of variables that account for the outcome of an experiment.
They lack predictive power in the sense that the state of the variables cannot (by any currently-known means) be determined independently of, and before, the materialisation of the effect they cause.
In other scientific theories, the variables proposed to exist can usually be measured by some means independent of the main effect they are said to control: machines can be devised in which the state of the variables can be measured without the machine evolving further towards its main effect, and without the act of measurement disturbing the existing state of those controlling variables.
The ability to measure in this way is closely linked with the "settability" of the machine - the ability to bring it into a specific initial state.
The problem QM poses is that the physical variables being measured appear to be so fundamental that there are no means available to examine their states other than by operating the mechanism itself towards its main effect, or by subjecting the mechanism to tests that either leave the variables in an altered state after the measurement (so that the previously measured states no longer correspond with the current states that will ultimately determine the main effect of the machine), or destroy the mechanism completely (so that the main effect can no longer occur, and cannot be compared with a prediction).
The absence of any ability to measure the machine without disturbing its state also means it cannot be set into any particular initial state.
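A minimal numerical illustration of that disturbance problem (my own toy example, using a single qubit): measuring one variable leaves a complementary variable in an altered state, so a prior reading can no longer be used to predict the final outcome.

```python
import numpy as np

rng = np.random.default_rng(1)

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # |+>: a definite value of X

def measure(state, basis):
    """Projective measurement in an orthonormal basis.
    Returns (outcome index, post-measurement state)."""
    probs = [abs(np.vdot(b, state)) ** 2 for b in basis]
    k = rng.choice(len(basis), p=probs)
    return k, basis[k]

z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

# Measuring X on |+> directly is perfectly predictable ...
direct = [measure(plus, x_basis)[0] for _ in range(1000)]
print("X on |+>:        ", np.bincount(direct, minlength=2))   # all outcome 0

# ... but trying to "read off" Z first disturbs the state, and the
# subsequent X measurement becomes a coin toss.
def z_then_x(state):
    _, disturbed = measure(state, z_basis)
    return measure(disturbed, x_basis)[0]

indirect = [z_then_x(plus) for _ in range(1000)]
print("Z then X on |+>: ", np.bincount(indirect, minlength=2))  # roughly 50/50
```

The intermediate measurement is exactly the kind of test described above: it returns information, but leaves the controlling variable in an altered state, so the earlier reading no longer corresponds to what determines the main effect.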
There is, of course, much physics which explains the nature and difficulty of this measurement conundrum. There isn't dispute about its characterisation or implications, so far as I'm aware.
The main point of dispute about "hidden variable" theories appears to come from those who are primarily battling over the definition of science and who vary in their axiomatic preferences.
The main alternative to hidden variable theories is what I could dub "non-variable theories": theories that essentially posit that the effects we see may arise from something other than physical causes - something other than "hidden" controlling variables.
They differ from hidden variable theories in not accounting for effects by reference to anything occurring beforehand - many posit the existence of some fount of fundamental randomness or indeterminism - and therefore in leaving no further avenue for useful scientific investigation or explanation.

-
I am afraid non-variable is a way of saying non-measurable or unpredictable hidden variable. – Rajaram Venkataramani Jun 04 '23 at 07:56
-
@RajaramVenkataramani, for the proponents of those theories, it isn't a way of saying a non-measurable variable, it's a way of denying the existence of such variables. Hidden variable theories - having been broached as early as the 1930s - were out of favour amongst physicists for a long time because they posit the existence of variables without positing any means by which their states can be measured independently of the outcomes they are said to cause. – Steve Jun 04 '23 at 08:22
Experiments dealing with ever smaller currents or voltages, mechanical forces, electric charges and magnetic dipoles are subject to ever greater percentage error. This is obvious because subatomic particles are the smallest units we can use as measuring devices. A voltmeter or ammeter also influences every measurement result, but requires no interpretation because the error is so small.
The most glaring counter-example is the diffraction of electrons or photons at edges. Any attempt to observe what happens at the edges with light (i.e. photons) or currents/voltages fails. The intensity distribution on the observation screen is completely subject to the interpretation of how the diffraction occurs. The advantage of the wave function is that it does not contain any hidden variables.
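For the wave-function side of that claim, the intensity distribution behind a single straight edge follows from Fresnel diffraction alone, with no extra variables. A rough numerical sketch (illustrative wavelength and geometry, using SciPy's Fresnel integrals):

```python
import numpy as np
from scipy.special import fresnel

# Fresnel diffraction of a plane wave at a straight (knife) edge,
# with the screen a distance L behind the edge.
wavelength = 500e-9                      # illustrative: 500 nm light
L = 1.0                                  # edge-to-screen distance in metres
x = np.linspace(-0.5e-3, 2e-3, 1000)     # screen position; x < 0 is the geometric shadow

v = x * np.sqrt(2.0 / (wavelength * L))  # dimensionless Fresnel variable
S, C = fresnel(v)

# Relative intensity behind a straight edge:
#   I/I0 = 0.5 * [(C(v) + 1/2)^2 + (S(v) + 1/2)^2]
intensity = 0.5 * ((C + 0.5) ** 2 + (S + 0.5) ** 2)

# Deep in the shadow the intensity tends to 0, far into the lit region to 1,
# with the characteristic fringes just outside the geometric shadow edge.
print(intensity[0], intensity[-1], intensity.max())
```

The wave description reproduces the observed fringes, but it says nothing about which path any individual photon or electron took past the edge.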
The advantage of a theory that includes the interaction of the particles with the electrons of the obstacle would be the controllability of the diffraction, which would be of immense advantage for chip production (and an immense time saving, since one would no longer have to explain to students again and again that the phenomena of quantum mechanics can be described, but cannot be mentally grasped :-).

-
Diffraction at the edges is constant and evenly spread across the whole screen. A photon's trajectory is altered in proportion to its proximity to the edge as it passes. – Bill Alsept Jun 04 '23 at 09:00
-
@BillAlsept Do you have a reference for your statement that light trajectory is altered proportional to edge proximity as its passing? I don’t believe the edge itself has any effect on a photon, although the shape itself does. For example, a slit shaped like a diamond will lead to diamond shapes when there is 2 slit interference. I don’t believe that would happen if your rule were correct. – DrChinese Jun 04 '23 at 15:21
-
Light does diffract at all edges, and it also scatters from edges. You don't need a slit to see this because any single edge will create a pattern on a detection screen. Google images of razor straight edge. In the near field you will see single edge patterns on all edges. A diamond shaped object would form single edge patterns on all four edges and would project a diamond shape (Single Edge Pattern) on the screen. – Bill Alsept Jun 04 '23 at 17:43