Well, your question really gets at the measurement problem. It's actually not well established that the outcome of a measurement is truly probabilistic; the answer to that question cuts to the basic interpretation of QM. Different schools of thought resolve measurement differently, and currently there is no universally accepted answer. You may also be interested in learning about decoherence. Even in the Copenhagen interpretation, many physicists have abandoned a truly probabilistic wave function collapse in favor of an apparently probabilistic but fundamentally deterministic evolution of the system. In this case, we have to consider the wave function of the entire observer-observed system.
Here is an article I wrote about the unitarity of wave function collapse during measurement, if you are interested.
Understanding Uncertainty
Today, the interpretation of quantum mechanics remains perhaps the greatest open question in all of science, and the resolution of this problem has profound implications for our concepts of determinism, philosophical realism, and the limits of knowledge. At the heart of this debate lie the fundamental uncertainty relationships between classically conjugate variables, as well as questions such as "what constitutes the measurement of a quantum system?" We will briefly explore how uncertainty arises and the ways in which different interpretative frameworks attempt to reconcile the problem of measurement.
Heisenberg's Microscope
The famous Heisenberg Uncertainty Principle states that it is impossible, even in principle, to know exactly the values of certain pairs of related measurable quantities, known as complementary variables. The most common examples of these inequalities include:
\begin{equation}
\Delta x \Delta p \ge \frac{\hbar}{2}
\end{equation}
\begin{equation}
\Delta E \Delta t \ge \frac{\hbar}{2}
\end{equation}
\begin{equation}
\Delta J_i \Delta J_j \ge \frac{\hbar}{2} |\langle J_k\rangle |
\end{equation}
expressing the relationships between position and linear momentum, energy and time, and the components of angular momentum along three orthonormal spatial axes, respectively.
In the context of introductory quantum mechanics, these inequalities are nearly always presented either as a priori axioms of the theory or derived from the commutation relations of the corresponding operators. However, Heisenberg himself arrived at these elegant relationships through a simple thought experiment, known as Heisenberg's microscope, which provides a powerful intuition about the source of these enigmatic statements.
Consider a classical electron (i.e., a point particle) being investigated by a scientist using a microscope, as portrayed in fig. 1. In order to determine the position of the particle, an incident photon must scatter off the electron at some angle up to $\epsilon$, the half-angle subtended by the lens, before being focused by the series of optical lenses and arriving at the observer. This step represents a critical loss of information about the system. In particular, the scattering process perturbs the momentum of the electron according to the classical relativistic equations for conservation of momentum. However, because the lens accepts scattered photons over the entire angle $\epsilon$ and focuses them to the same point, information about the angle of reflection (and therefore the resultant momentum) of both particles is lost. We are therefore left with an inherent uncertainty in momentum, $\Delta p_x \sim \frac{h}{\lambda} \sin(\epsilon)$, where $\lambda$ is simply the classical wavelength of the light. Additionally, due to the wave nature of light, the microscope can only resolve the position of the electron to a range $\Delta x \sim \frac{\lambda}{\sin(\epsilon)}$. Multiplying these terms together, we arrive at an approximate form of the uncertainty principle purely through a classical thought experiment! [2] This simple argument demonstrates that in order to measure the value associated with one quantity, we must always perturb the value of another quantity. In fact, this statement essentially contains the same information as the traditional approach based on commutation relations, which fundamentally express that measuring two conjugate quantities such as position and momentum sequentially will yield different results depending on which is determined first, because measuring either affects the other.
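As a quick numerical sanity check of these estimates (a minimal sketch; the wavelength and aperture half-angle below are arbitrary illustrative values, not drawn from any particular experiment), note that the product of the two uncertainties is independent of both parameters and is of order Planck's constant:

```python
import numpy as np

h = 6.626e-34           # Planck's constant (J*s)
lam = 500e-9            # illustrative photon wavelength (m), arbitrary choice
eps = np.deg2rad(30)    # illustrative lens half-angle, arbitrary choice

dp = (h / lam) * np.sin(eps)   # momentum kick left unresolved by the lens
dx = lam / np.sin(eps)         # limiting resolution of the microscope

print(f"dp ~ {dp:.2e} kg m/s, dx ~ {dx:.2e} m, dp*dx ~ {dp*dx:.2e} J s (= h)")
```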
Fourier Analysis of Uncertainty
In fact, this analysis can be extended to any pair of quantities that are Fourier transforms of each other. More rigorously, the derivative of the action with respect to a variable is conjugate to that variable.
Generally, a Fourier transform relates the time-like description of a system to its frequency-like description. This relationship is best illustrated with a simple example. Consider a transverse wave. For such a system, we take the time-like component to be the period, the amount of time necessary for the wave to complete one full cycle from peak to peak, and the frequency to be the inverse of the period. For an electromagnetic wave, the period (or time-like component) is proportional to the wavelength, while the energy is proportional to the frequency. Thus energy and time (the duration elapsed between two events) form a Fourier pair.
We now turn our attention back to the relationship between position and momentum. Here we identify position as the time-like quantity and momentum as the frequency-like quantity. We map between these bases using the Fourier transform:
\begin{equation}
\Psi(x) = \frac{1}{\sqrt[]{2\pi\hbar}}\int_{-\infty}^{\infty}\psi(p)e^{ipx/\hbar}\,\mathrm dp
\end{equation}
where $\Psi(x)$ is the position space wave function and $\psi(p)$ is its representation in momentum space.
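Before proceeding to the formal proof, it may help to see this Fourier relationship numerically. The sketch below (a toy illustration assuming natural units $\hbar = 1$; the Gaussian width and grid parameters are arbitrary choices) samples a Gaussian position-space wave function, obtains its momentum-space representation with a discrete Fourier transform, and computes both spreads; their product comes out, up to discretization error, at the minimum value $\hbar/2$ allowed by eqn. 1.

```python
import numpy as np

hbar = 1.0                      # natural units, an assumption for this illustration
sigma = 0.7                     # arbitrary width of the Gaussian packet
x = np.linspace(-30, 30, 4096)
dx = x[1] - x[0]

psi_x = np.exp(-x**2 / (4 * sigma**2))
psi_x /= np.sqrt(np.trapz(np.abs(psi_x)**2, x))          # normalize in position space

# Momentum-space representation via a discrete Fourier transform
p = 2 * np.pi * hbar * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
psi_p = np.fft.fftshift(np.fft.fft(psi_x))
psi_p /= np.sqrt(np.trapz(np.abs(psi_p)**2, p))          # normalize in momentum space

def spread(q, f):
    """Standard deviation of |f|^2 regarded as a distribution over q."""
    prob = np.abs(f)**2
    mean = np.trapz(q * prob, q)
    return np.sqrt(np.trapz((q - mean)**2 * prob, q))

print(spread(x, psi_x) * spread(p, psi_p), "vs hbar/2 =", hbar / 2)
```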
We are now ready to derive the relation given in eqn. 1. What follows is a skeletal outline of a proof of the uncertainty principle [3]. By the definition of uncertainty, we recall:
\begin{equation}
(\Delta x)^2 = \langle x^2\rangle - \langle x\rangle ^2
\end{equation}
and similarly for momentum. Since the origins of position and momentum can be chosen freely, we may set the expectation values of $x$ and $p$ to zero without loss of generality, leaving us with:
\begin{equation}
(\Delta x)^2 = \langle x^2\rangle =\int_{-\infty}^{\infty} x^2 |\Psi(x)|^2 \,\mathrm dx
\end{equation}
\begin{equation}
(\Delta p)^2 = \langle p^2\rangle .
\end{equation}
Now let us define $f(x) = x\Psi(x)$. Therefore, we have
\begin{equation}
(\Delta x)^2 = \int_{-\infty}^{\infty} |f(x)|^2 \,\mathrm dx = \langle f|f\rangle .
\end{equation}
We define an equivalent function for $p$, $\tilde{g}(p) = p\psi(p)$. We can find the $x$-domain representation of $\tilde{g}(p)$ using an inverse Fourier transform. We find:
\begin{equation}
g(x) = \left(-i\hbar\frac{\mathrm d}{\mathrm dx}\right)\Psi(x).
\end{equation}
We can now write the variance in momentum as
\begin{equation}
(\Delta p)^2 = \int_{-\infty}^{\infty} |g(x)|^2 \,\mathrm dx = \langle g|g\rangle .
\end{equation}
By the Cauchy-Schwarz inequality, we know
\begin{equation}
(\Delta x)^2 (\Delta p)^2 = \langle f|f\rangle \langle g|g\rangle \geq |\langle f|g\rangle |^2
\end{equation}
\begin{equation}
|\langle f|g\rangle |^2 \geq \left(\frac{\langle f|g\rangle - \langle g|f\rangle }{2i}\right)^2.
\end{equation}
Finally, by explicit evaluation, it can be shown that
\begin{equation}
\langle f|g\rangle - \langle g|f\rangle = i\hbar.
\end{equation}
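One way to carry out this evaluation explicitly: writing out the inner products with $f = x\Psi$ and $g = -i\hbar\,\mathrm d\Psi/\mathrm dx$, integrating by parts, and using the fact that $\Psi$ is normalized and vanishes at infinity,
\begin{align}
\langle f|g\rangle - \langle g|f\rangle &= -i\hbar\int_{-\infty}^{\infty} x\left(\Psi^*\frac{\mathrm d\Psi}{\mathrm dx} + \Psi\frac{\mathrm d\Psi^*}{\mathrm dx}\right)\mathrm dx = -i\hbar\int_{-\infty}^{\infty} x\,\frac{\mathrm d}{\mathrm dx}|\Psi|^2\,\mathrm dx\\
&= -i\hbar\Big[x|\Psi|^2\Big]_{-\infty}^{\infty} + i\hbar\int_{-\infty}^{\infty}|\Psi|^2\,\mathrm dx = i\hbar.
\end{align}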
At last we have arrived at the uncertainty relation as promised. Substituting this value into the inequalities above and taking the square root, we find
\begin{equation}
\Delta x \Delta p \geq \frac{\hbar}{2}.
\end{equation}
Uncertainty: Ontological or Epistemological
We have so far established inherent limits on our ability to specify the state of a quantum object in bases such as position and momentum. It is therefore clear that we must describe these attributes as distributions over some range of values. At best, we can determine the probability that an observed attribute of some particle will fall within a range of possible values. In position space, this is accomplished simply by taking the squared modulus of the particle's wave function and integrating over some interval.
However, we have not yet established whether this uncertainty in measurement arises from the ontological nature of the system, that is to say that particles truly exist as probability distributions over space rather than at fixed points, or is instead simply an unavoidable epistemological artifact of our inability to fully determine the particle's position. Whatever the case, we only ever observe a system to be in a single eigenstate or a set of eigenstates bounded by the minimum uncertainty. For example, if we measure the path a particle takes in the two-slit experiment, we of course cannot observe the particle taking both paths simultaneously. This act alters the system's wave function and destroys the characteristic interference pattern. Even if the observation of the path taken is made after the wave has exited the two-slit apparatus, the wave function of the particle collapses, leading to the two bands characteristic of a particle. Known as a delayed-choice experiment, this startling result indicates that measurement can retroactively collapse the quantum state of a system: information about the future arrangement of the quantum state propagates backwards in time. This is a clear violation of local realism, the classical belief that an action at one point in spacetime cannot instantaneously affect the state of the system at another point.
The most orthodox interpretation of quantum mechanics, the Copenhagen school of thought, points to the interference pattern of the double-slit experiment as incontrovertible proof that the true state of a quantum system prior to observation must be treated as a superposition of eigenstates. According to many of the early proponents of quantum mechanics, including Heisenberg and Bohr, a quantum system has no definite properties before observation [4]. In this framework, an electron initially in a superposition of spin states is neither spin up nor spin down until observed. In the minds of many early quantum physicists, measurement alone forces particles to adopt real properties. This approach to quantum mechanics is profoundly at odds with our classical intuition that objects truly have values associated with observables such as position and angular momentum. However, this non-realist theory of existence has come under fire in recent decades from physicists seeking to reconcile the problem of measurement with a more traditional view of reality. Before we can understand these competing theories, we must first establish a firm grasp of what is meant by measurement, the mathematics of observation, and who or what constitutes a valid observer in the quantum world.
Evolution of the Wave Function
The Mathematics
First, let us recall the equation governing the time evolution of an arbitrary wave function, $\Psi$. Given some initial state, we can describe the wave function at any future point in time using the Schrodinger equation,
\begin{equation}
i\hbar\frac{\partial \Psi(x,t)}{\partial t} = -\frac{\hbar ^2}{2m} \frac{\partial ^2 \Psi (x,t)}{\partial x^2} + V(x)\Psi (x,t).
\end{equation}
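As a concrete (and entirely optional) illustration of this deterministic evolution, the following sketch propagates a wave packet numerically with a split-step Fourier method. It assumes natural units $\hbar = m = 1$, an arbitrary Gaussian initial state, and an arbitrary harmonic potential; none of these choices come from the discussion above.

```python
import numpy as np

hbar = m = 1.0                        # natural units, an illustrative assumption
x = np.linspace(-20, 20, 1024)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)

V = 0.5 * x**2                        # example potential: a harmonic well (arbitrary)
psi = np.exp(-(x - 3.0)**2)           # arbitrary Gaussian initial state
psi = psi / np.sqrt(np.trapz(np.abs(psi)**2, x))

dt, steps = 1e-3, 5000
half_V = np.exp(-0.5j * V * dt / hbar)            # half-step phase from the potential
kinetic = np.exp(-0.5j * hbar * k**2 * dt / m)    # full kinetic step applied in k-space

for _ in range(steps):                # split-step Fourier integration of the Schrodinger equation
    psi = half_V * psi
    psi = np.fft.ifft(kinetic * np.fft.fft(psi))
    psi = half_V * psi

print("norm after evolution:", np.trapz(np.abs(psi)**2, x))   # stays ~1: the evolution is unitary
```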
We are interested in determining the value of some observable with a continuous spectrum; for clarity, we will consider the particle's position. Initially, we describe the state of our system as an integral over these position eigenstates,
\begin{equation}
|\Psi\rangle = \int_{a}^{b}c(x)|x\rangle \,\mathrm dx,
\end{equation}
where $c(x)$ parameterizes the amplitude of the wave function over all space. We can express the probability of finding our particle between any two points, $a$ and $a + \Delta x$, as
\begin{align}
Pr(a < x < a +\Delta x) &= \int_{a}^{a + \Delta x}|\langle x|\Psi\rangle |^2 \,\mathrm dx \\
&= \int_{a}^{a + \Delta x}|c(x)|^2 \,\mathrm dx.
\end{align}
Now suppose that we perform a measurement of position and determine that the particle sits at some point, $x'$. The wave function has spontaneously collapsed such that $|\Psi'\rangle = |x'\rangle$. A repeated measurement immediately following this collapse would yield the same eigenstate. In the absence of repeated measurement, however, the system simply continues to evolve according to the standard Schrodinger equation following the observation.
This highlights the fundamental aspects of the process of measurement: wave function collapse and wave function recovery. John von Neumann first formalized this description of measurement with the Postulates of Reduction [2]. Specifically, these postulates state that:
Measurement of some observable, $q$, on a system initially in a superposition of eigenstates, $q_n$, will yield some eigenvalue such that $|\Psi'\rangle = |q_i\rangle$ with probability equal to the squared modulus of the coefficient of $q_i$.
Measurements of $q$ which are repeated immediately, one after the other, will always yield the same eigenstate.
The second postulate results from the fact that the commutator of any operator with itself is zero. However, we can show that for repeated measurements of two different non-commuting observables, the second postulate will not be satisfied.
Let us consider the simple example of a particle in a box. We initially determine the particle to be in its ground state, $|\psi_1\rangle$, with associated energy $E_1 = \frac{\pi ^2 \hbar ^2}{2mL^2}$, where $L$ represents the length of the box. The spatial wave function is then
\begin{equation}
\langle x|\psi_1\rangle = \sqrt{\frac{2}{L}}\sin(\frac{\pi x}{L}).
\end{equation}
Now suppose we make a measurement of the particle's position. We know the probability of finding the particle between two points, $a$ and $a + \Delta x$ (for small $\Delta x$), will be approximately $\frac{2}{L}\sin^2(\frac{\pi a}{L})\,\Delta x$.
If we find the particle at $x = a$, we can describe the resulting wave function as the sum over energy eigenstates,
\begin{equation}
|\Psi '\rangle = \sum_{n} |\psi_n\rangle \langle \psi_n|a\rangle
\end{equation}
with $\langle \psi_n|a\rangle = \sqrt{\frac{2}{L}}\sin(\frac{n\pi a}{L}).$
We immediately see that the probability of observing the system in some energy eigenstate with $n \neq 1$ is greater than zero. Thus, by two non-commuting measurements, we can change the energy state of the system.
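A short numerical sketch of this energy redistribution is given below. One caveat: an idealized position eigenstate $|a\rangle$ is not normalizable, so the post-measurement state here is modeled, as an assumption of the illustration, by a narrow normalized Gaussian centered at $x = a$; the box length, measurement point, and packet width are arbitrary choices.

```python
import numpy as np

L, a, width = 1.0, 0.3, 0.01     # box length, measured position, packet width (all illustrative)
x = np.linspace(0, L, 20001)

# Narrow normalized packet standing in for the collapsed state |x' = a>
phi = np.exp(-(x - a)**2 / (2 * width**2))
phi /= np.sqrt(np.trapz(np.abs(phi)**2, x))

def psi_n(n):
    """Energy eigenstates of the infinite square well."""
    return np.sqrt(2 / L) * np.sin(n * np.pi * x / L)

# Probability of finding each energy E_n after the position measurement
for n in range(1, 9):
    c_n = np.trapz(psi_n(n) * phi, x)
    print(f"P(E_{n}) = {c_n**2:.3f}")   # weight in n > 1 is nonzero: the energy has been disturbed
```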
Questions Affecting Measurement
These two simple examples have allowed us to explore the basic process of measurement. However, this description raises two important questions, together known as the measurement problem. We have seen that the wave function of a quantum system evolves deterministically according to the Schrodinger equation. Upon observation, however, the wave function apparently collapses probabilistically to a single eigenstate [5]. The questions of interest then become:
"How do we reconcile the transition between deterministic and probabilistic evolution of a quantum system, especially in the light of the apparent role of an observer?"
"What constitutes a measurement, or what is an observe?"
Both of these quandaries ultimately tie back to our discussion of uncertainty. If uncertainty is fundamentally an epistemological limit, measurement simply reveals the true state of the system, previously shrouded by our inability to adequately specify it. In this case, both questions are easily reconciled: a quantum system evolves deterministically in all settings, with real values associated with its observables at all times. In such an interpretation, measurement presents no more of a problem than any other quantum interaction, and wave function collapse amounts simply to a refining of our knowledge about the system. Alternatively, if uncertainty proves to be an ontological property of the system, the questions become significantly more intractable and demand a careful definition of terms.
Measurement in Different Interpretations
The measurement problem, perhaps more than any other question in quantum mechanics, serves as the basis for competing interpretations of quantum theory. This section aims merely to serve as a brief introduction to a subset of the most prominent interpretations.
The Copenhagen Interpretation and Decoherence
Thus far, our discussion of measurement and uncertainty has largely been framed, implicitly, in terms of the Copenhagen school of thought, first advanced by Heisenberg and Bohr. For this reason, we will provide only a cursory, qualitative overview of the theory. Specifically, Heisenberg and Bohr believed that uncertainty represented the ontological state of the system and that the act of measurement behaves as an irreversible thermodynamic process which alters the state of the system. In this sense, wave function collapse occurs any time a quantum system interacts with a classical environment [4].
Importantly, the probabilistic wave function collapse and the associated question of what constitutes an observation can be addressed by treating the interaction between the measuring device and the quantum particle as a process of entanglement. Thus measurement occurs constantly in any macroscopic system. In this manner, the apparent significance of the observer vanishes, as the collapse represents the resulting superposition of the entire system. The interaction with the classical ensemble leads to a loss of information about the original quantum system, as the wave function becomes diluted by many repeated scattering events between the measuring device and the quantum particle and between the measuring device and itself. Because the classical device contains a far larger number of particles, the original quantum system becomes dominated by its classical environment; the phase relationships among the many constituent particles are effectively randomized, suppressing interference between the possible eigenstates of the system as a whole and leaving a single apparent eigenstate for any observable, in a process known as decoherence [6]. In short, this decoherence-based reading of the Copenhagen interpretation deals with the question of measurement by eliminating a separate collapse process entirely.
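The flavor of this argument can be captured with a deliberately crude toy model (a sketch under strong simplifying assumptions, not a realistic treatment of a measuring device): a single two-state system becomes entangled with $N$ environment qubits, each of which is imprinted slightly differently by the two system states, and tracing out the environment suppresses the off-diagonal (interference) terms of the system's reduced density matrix as $N$ grows.

```python
import numpy as np

alpha = beta = 1 / np.sqrt(2)   # the system starts in an equal superposition of its two states
theta = 0.1                     # per-qubit "which-state" imprint angle (arbitrary choice)

def reduced_density_matrix(N):
    # Each environment qubit ends in |e0> or |e1> with overlap <e1|e0> = cos(theta);
    # for N independent qubits the off-diagonal terms pick up a factor cos(theta)**N.
    overlap = np.cos(theta) ** N
    return np.array([[abs(alpha)**2,                   alpha * np.conj(beta) * overlap],
                     [np.conj(alpha) * beta * overlap, abs(beta)**2]])

for N in (0, 10, 100, 1000):
    rho = reduced_density_matrix(N)
    print(f"N = {N:4d}  off-diagonal magnitude = {abs(rho[0, 1]):.4f}")
```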
The Many Worlds Hypothesis
Today, one of the leading alternatives to the Copenhagen interpretation focuses on the objective reality of the wave function. First developed by Hugh Everett in 1957, the Many Worlds Interpretation begins by positing a universal wave function. In other words, we treat all of existence as a single quantum system obeying the conventional Schrodinger equation. Everett resolved the apparent probabilistic nature of measurement by expanding the superposition of the system to include the measuring apparatus or classical environment [7].
In this formulation, the position of a particle described by a continuous distribution of eigenstates will be found at every possible eigenstate if observed. In this sense, uncertainty represents all possible universes. In fact, not only is uncertainty an ontological aspect of the quantum system, it represents the complete state of the system, with each possible eigenstate corresponding to an equally real and valid outcome. The apparent observation of a single eigenstate results from a sort of universal splitting. If we observe the spin state of an electron initially in a superposition of up and down, the entire universe continues to obey the Schrodinger equation, and the apparent collapse occurs because our conscious experience is confined to a single eigenstate of the universal superposition, with many other versions of us existing in tandem. In Everett's interpretation, the universe is deterministic, and measurement (or any interaction) simply leads to a branching of possible futures.
Proponents of the Many Worlds interpretation point to its mathematical simplicity and lack of assumptions. Rather than relying on the multiple equations other interpretations need to describe the evolution of a system, the Schrodinger equation stands alone. Unlike traditional Copenhagen theory, no nebulous definitions of irreversibility or measurement need to be advanced. Prominent physicists, including Sean Carroll, have claimed that the Many Worlds interpretation therefore satisfies Occam's razor and represents the most mathematically pure solution to the problem of measurement.
However, Everett's approach has not been without detractors, including Bohr himself. Specifically, objections hold that while the number of equations and assumptions may be minimized, this simplification does not outweigh the infinite number of parallel realities, which seem to make the theory more complex, not less.
De Broglie-Bohm Pilot Wave Theory
At odds with the other major interpretations of quantum theory outlined above, Bohmian mechanics treats fundamental uncertainty as an epistemological property of a quantum system rather than an ontological one. Popularly known as Pilot Wave theory, this interpretation sets itself apart from both Copenhagen and Many Worlds by being both realistic and deterministic [8]. In this sense, Bohmian mechanics maps naturally onto classical mechanics. However, the interpretation abandons locality, the principle that an action at one point cannot instantaneously affect a system at a spatially separated point.
Bohm posits that real, localized particles with definite properties exist at all times; any uncertainty in the eigenstates of a system thus comes simply from our inability to adequately probe the system. If we say that an electron is in a superposition of spin states, we are merely stating that we have insufficient information to properly specify the alignment of its angular momentum along a particular axis. However, while the metaphysical interpretation of Bohmian mechanics differs explicitly from the Copenhagen school, the theory makes predictions identical to those of the more orthodox interpretation.
The basic premise of the theory includes both a point-like particle and a wave which guides the evolution of this particle and which itself obeys the conventional Schrodinger equation. Bohmian mechanics starts by positing that a real configuration of the universe, $q$, exists, described by the collection of the individual configurations of each constituent particle, $q_i$, in a configuration space $Q$ with coordinates $q^k$. Typically, configuration space takes the form of the spatial positions and orientations of each particle.
The dynamics of this system are governed by the traditional wave function in configuration space, $\Psi(q,t)$. A particle embedded in this space follows a definite trajectory determined by the wave function, which itself evolves according to the Schrodinger equation [7]. Specifically, the configuration of the system evolves according to a guiding equation of the form:
\begin{align}
m_k\frac{\mathrm dq^k}{\mathrm dt}(t) &= \hbar \nabla_k \,\mathrm{Im}\ln\psi(q,t)\\
&= \hbar\, \mathrm{Im}\left(\frac{\nabla _k \psi}{\psi}\right)(q,t)\\
&= \frac{m_k j_k}{\psi^* \psi} = \mathrm{Re}\left(\frac{\hat{P}_k\psi}{\psi}\right),
\end{align}
where $j$ is the probability current and $\hat{P}$ is the momentum operator. In correspondence with the Copenhagen interpretation, the configuration of the system is distributed according to $|\psi(q,t)|^2$.
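To make the guiding equation concrete, here is a minimal sketch (natural units $\hbar = m = 1$; the packet width, starting position, and step sizes are arbitrary choices) that integrates a single Bohmian trajectory for a freely spreading Gaussian packet in one dimension, evaluating the velocity field $\frac{\hbar}{m}\,\mathrm{Im}(\partial_x\psi/\psi)$ by finite differences. For this particular packet the trajectory scales with the packet width, $x(t) = x(0)\sqrt{1+\tau^2}$ with $\tau = \hbar t/(2m\sigma_0^2)$, which the printout can be checked against.

```python
import numpy as np

hbar = m = 1.0          # natural units, an illustrative assumption
sigma0 = 1.0            # initial packet width (arbitrary)

def psi(x, t):
    """Freely spreading Gaussian packet: analytic solution of the free Schrodinger equation."""
    s = 1 + 1j * hbar * t / (2 * m * sigma0**2)
    return (2 * np.pi * sigma0**2) ** -0.25 * s ** -0.5 * np.exp(-x**2 / (4 * sigma0**2 * s))

def velocity(x, t, h=1e-5):
    """Guiding equation v = (hbar/m) Im(d psi/dx / psi), gradient by central finite difference."""
    dpsi = (psi(x + h, t) - psi(x - h, t)) / (2 * h)
    return (hbar / m) * np.imag(dpsi / psi(x, t))

x0, dt, steps = 1.5, 1e-3, 5000   # initial position and integration parameters (arbitrary)
x = x0
for i in range(steps):            # simple Euler integration of the trajectory
    x += velocity(x, i * dt) * dt

tau = hbar * (steps * dt) / (2 * m * sigma0**2)
print("numerical x(t):", x, "  analytic x0*sqrt(1+tau^2):", x0 * np.sqrt(1 + tau**2))
```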
In this manner, all the predictions of quantum mechanics are preserved and only the metaphysics is changed. For example, the two-slit interference pattern becomes the result of a distribution of initial positions of the incident particles; any individual particle, however, follows a definite trajectory. Fundamental uncertainty, per the Heisenberg relations, prevents us from specifying these initial positions. Measurement then merely becomes a minimization of uncertainty in the measured quantity, accompanied by a maximization of the uncertainty in its conjugate variable. Measurement does not lead to a probabilistic outcome, merely a refinement of our knowledge.
While Bohmian mechanics successfully restores a realistic theory of the world, the interpretation remains a more niche approach to quantum theory. Critics often point to its reliance on two equations (the Schrodinger equation and the guiding equation) rather than one as a sign that it is axiomatically weaker than its competitors. Additionally, no fully relativistic formulation of Bohmian mechanics has so far been successful, although current attempts in this direction appear promising.
Conclusion
Modern quantum theory has firmly established our inability to precisely specify the values of certain conjugate variables. However, the interpretation of this uncertainty has not been decisively determined. While the argument between epistemological and ontological approaches has thus far produced no differing experimental predictions, the unresolved questions surrounding uncertainty and measurement have profound implications for our metaphysical concept of reality.
- G. Vandergrift, Creative Commons, 2015.
- V. B. Braginsky, F. Ya. Khalili, "Quantum Measurement", Cambridge University Press, 1995.
- L. D. Landau, E. M. Lifshitz, "Quantum Mechanics: Non-Relativistic Theory", Pergamon Press, 1977.
- W. Heisenberg, "Physics and Philosophy: The Revolution in Modern Science", Harper Perennial, 1957.
- S. Weinberg, "Einstein's Mistakes", Physics Today, 2005.
- H. Price, "Time's Arrow and Archimedes' Point", 1996.
- B. DeWitt, "Quantum Mechanics and Reality", Physics Today, 1970.
- J. A. Wheeler, W. H. Zurek (eds.), "Quantum Theory and Measurement", Princeton Legacy Library, 2014.