
The wavefunction evolves unitarily: we apply a unitary evolution operator generated by the Hamiltonian (which encodes the relation between time and energy). This evolution is smooth and continuous. And then comes an interaction or measurement, and the evolution breaks down abruptly. Some say this is a real, physical collapse; others say it is only a collapse in our knowledge.
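For concreteness, these are the two kinds of change being contrasted, in their standard textbook form (with $H$ time-independent and $P_k$ the projector onto the eigenspace of the measured outcome $a_k$):

$$ i\hbar\,\frac{\partial}{\partial t}\,|\psi(t)\rangle = H\,|\psi(t)\rangle \;\;\Longrightarrow\;\; |\psi(t)\rangle = e^{-iHt/\hbar}\,|\psi(0)\rangle \qquad \text{(smooth, unitary)} $$

$$ |\psi\rangle \;\longrightarrow\; \frac{P_k\,|\psi\rangle}{\sqrt{\langle\psi|P_k|\psi\rangle}} \quad \text{with probability } \langle\psi|P_k|\psi\rangle \qquad \text{(abrupt, on measurement)} $$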

Is there a physical principle that forbids such a singular, non-local event as wavefunction collapse? I mean, probability is conserved, and so, for that matter, is any other quantity. What are the arguments that an actual collapse doesn't happen? Is the culprit the very assumption of taking the wavefunction to be real? Can the mathematics not describe it properly? Any idea is welcome!

Edit: As wisely pointed out in a comment below, experiments suggest that collapse is not an instantaneous event. When an atomic electron drops to a lower energy level, there seem to be intermediate states along the way. We can then infer that the same might hold for a free particle, which should make a mathematical description easier. It could be, though, that my arrow lands far from the bull's eye.

So maybe the question should be: Can collapse be described mathematically?

  • I do not want to be repeating myself; here are some answers on "collapse" questions: https://physics.stackexchange.com/q/622155/, https://physics.stackexchange.com/q/67756/, https://physics.stackexchange.com/q/33513/ – anna v Jun 07 '22 at 07:23
  • What is your definition of the word real? – Qmechanic Jun 07 '22 at 07:39
  • @Qmechanic That the wavefunction is not mathematical but made of something. Hidden variables or space densities or what do I know? – MatterGauge Jun 07 '22 at 08:14
  • @Qmechanic Why did you tag decoherence? Collapse has to do with a decohering state? How can one particle decohere? – MatterGauge Jun 07 '22 at 08:22
  • I don't think a physicist who has studied quantum measurement seriously would ever say that it happens instantaneously. There are even experiments showing the dynamics of "wavefunction collapse" these days. – DanielSank Jun 07 '22 at 08:31
  • @DanielSank You mean the collapse to a lower energy orbital of an atomic electron? – MatterGauge Jun 07 '22 at 08:53
  • @DanielSank Wouldn't that be a good argument for objective collapse? – MatterGauge Jun 07 '22 at 11:02
  • @DanielSank I understand collapse is instantaneous, that's the whole point of the concept. You can believe collapse is only an approximation of another process that isn't instantaneous, like decoherence, but that's a different concept altogether. – Juan Perez Jun 07 '22 at 11:46

3 Answers


Collapse in knowledge seems like a better description. Indeed, once one accepts quantum mechanics, no collapse is possible or necessary: we can describe any macroscopic measurement device quantum mechanically, write down the interaction Hamiltonian, the bath, etc., and follow how the wave function loses its coherence and collapses to an eigenstate of the measurement Hamiltonian.
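As a minimal numerical sketch of that mechanism (a toy model, not the full interaction-Hamiltonian-plus-bath calculation; `n_env` and `theta` below are arbitrary illustrative parameters): suppose a qubit prepared in $(|0\rangle+|1\rangle)/\sqrt{2}$ interacts with $n$ environment qubits, each of which is left in one of two conditional states with overlap $\cos\theta$ depending on the system state. Tracing out the environment multiplies the system's off-diagonal coherence by $\cos^n\theta$:

```python
import numpy as np

# Toy decoherence model: a qubit in (|0> + |1>)/sqrt(2) couples to n_env
# environment qubits. Conditional on the system state, each environment
# qubit ends up in one of two states with overlap cos(theta), so tracing
# out the environment multiplies the coherence by cos(theta)**n_env.
n_env = 20        # number of environment qubits (illustrative)
theta = 0.3       # per-qubit "which-path" angle (illustrative)

coherence = 0.5 * np.cos(theta) ** n_env   # off-diagonal element of rho

rho = np.array([[0.5,       coherence],
                [coherence, 0.5      ]])
print(rho)
# As n_env grows, the off-diagonals vanish exponentially and rho approaches
# the classical mixture diag(1/2, 1/2): the appearance of collapse with no
# non-unitary step anywhere in the dynamics.
```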

However, quantum mechanics is not a matter of belief but a scientific theory, intended to describe the real world, and it must be verifiable in experiments. The only means we have of observing quantum mechanical experiments is via changes in macroscopic objects, in a world that we perceive as classical. Thus, we infer the quantum laws (and hence the equations of the theory) in terms of classical ones, i.e., in terms of the interaction of quantum systems with a hypothetical classical device. Collapse is a means of describing this unphysical link.

In other words, collapse is not a physical phenomenon but an element of the human theory for describing the observed phenomena. One could think of alternative devices to accomplish the same goal, but one cannot rule out collapse using a theory that was itself constructed with this concept (that would be circular reasoning).

Roger V.
  • I agree with almost all of this. However, collapse is as physical as anything can be: it's what we actually observe in physics experiments. That wave mechanical models are superficially incompatible with it is not a property of the universe, but a property of the models. I say "superficially" because collapse can be captured through the Mott problem and subsequent developments in decoherence theory. – John Doty Jun 07 '22 at 13:22
  • @JohnDoty I am not familiar with the Mott problem - thanks for bringing it to my attention. – Roger V. Jun 07 '22 at 13:27

Before you can discuss whether a "collapse" really happens, you first have to ask what the wave function even means physically, and that's a big part of the dispute.

First off, I'd argue that the most cogent interpretation is that a wave function proximately represents knowledge or information held by an agent about the physical parameters of a system, and that collapse is the acquisition of new knowledge by said agent. That is, the proper direct semantic value to attach to the mathematical term $|\psi\rangle$ is "a knowledge state" (and, when the system's observable set is tomographically complete for that Hilbert space, a maximal knowledge state), in the same way that the semantic value we attach to the parameter $\mathbf{v}$ in classical mechanics is "the speed and direction at which something is changing its position". The question then is: given the various unusual behaviors of such knowledge states, is there a physical object, or part of one, underlying said knowledge, which has the same "shape", if one wills, and whose shape is why we see those behaviors?

In that case, we can further ask: given that after a measurement we have to update our knowledge, changing our subjective wave function so as to retain statistical accuracy on subsequent measurements, does that update likewise reflect, one-for-one, a change in that physical object, a change which would then also be, as you suggest, instant and non-local? And if so, what is it about "measurements" specifically that distinguishes them from other interactions and makes them liable to generate that change? Or, if there is no such change, why does our knowledge remain accurate regardless, and what else has changed so that the unchanged "wave-function-shaped object" now behaves as though a different such object were in play?

And that's where there's no established answer. Given our accumulated experimental science so far, we can imagine any number of such "implementing" constructs at work "behind the scenes" of the subjective theory of quantum mechanics, and they would all give us the same results. Some of those may exhibit non-local changes. The only way we could know whether any of them were "true" would be to find a situation in which quantum mechanics fails, and then try to "reverse engineer" what that tells us about all the other situations where it succeeded.

So no, there is no principle that we are aware of, with empirical support, that would forbid such "behind the scenes" non-local objects. It is certainly reasonable to suppose that there is no such object, since all observable phenomena exhibit local causality and the most parsimonious idea is that local phenomena are generated by locally causative mechanisms, but that is not a proof.

That said, I've long wondered if there may be a way to derive the theory of quantum mechanics from some principle of "economy of information": that the Universe limits the maximal amount of information that systems contain (or in some other way "permits only a finite maximum quantity of information", suitably formulated so as to reconcile with the possibility of an infinite spatial extent). This is based on two observations. First, the existence of nontrivial probabilities corresponds to a restricted quantity of information, as per Shannon's theory of information. Second, the way behavior under measurement "looks", where squeezing more information out of one aspect of a system causes a loss in other aspects, is strongly reminiscent of some underlying "capacity", "buffer", or the like being saturated. This would not necessarily rule out the existence of a "wave-function-shaped carrier or encoder" of that information categorically, but it would argue strongly against it philosophically, because such an encoder would be infinitely wasteful: it takes infinite-precision real numbers to describe such a thing literally.
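For reference, the standard Shannon quantity behind that first observation (nothing here is specific to the speculative principle above): a measurement with outcome probabilities $p_k$ carries

$$ H = -\sum_k p_k \log_2 p_k \;\le\; \log_2 N $$

bits, which is finite for any $N$-outcome measurement, whereas literally recording even a single qubit's amplitudes $\alpha, \beta \in \mathbb{C}$ would take infinitely many bits of precision.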

  • @march : yes, the "half information" model, proposing that each object has twice the information a pure quantum state carries (and thus is a "hidden information" model, due to carrying said extra information, though arguably not like the infinite extra information found in the Everett or Bohm "wave function realism" models), so only half is accessible and the other half is inaccessible. However, it doesn't appear to be a precise reconstruction of QM but rather, as pointed out, one that only qualitatively reproduces it. – The_Sympathizer Jun 07 '22 at 20:07
  • Yeah, that model can't reproduce all QM phenomena (it's a local hidden variable model after all), but it recovers some surprising ones. I just figured it fit your intuition for an "economy of information" principle, so I was wondering if you'd seen it or not. – march Jun 07 '22 at 20:11
  • @march: thanks, yeah. – The_Sympathizer Jun 07 '22 at 20:15

There is no definitive answer to your question, as there are different schools of thought about the significance of the wave function, but here are some points you might want to consider...

Quantum theory, like the rest of physics, is a mathematical model of reality. When we use quantum mechanics to perform calculations to test the model against experimental results, we often make simplifying approximations. For example, when calculating energy levels in solids, we might adopt the 'one electron' approximation, in which we simplify the innumerable interactions between the particles in the solid by assuming they can be averaged into a classical background potential in the Hamiltonian. So there is a difference between our quantum mechanical model and the underlying reality.
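Schematically, the 'one electron' approximation replaces the pairwise interaction terms with an averaged effective potential (a generic mean-field form, shown only to illustrate the kind of simplification meant):

$$ H = \sum_i \left( -\frac{\hbar^2}{2m}\nabla_i^2 \right) + \sum_{i<j} V(\mathbf{r}_i,\mathbf{r}_j) \;\;\longrightarrow\;\; H_{\text{1e}} = \sum_i \left( -\frac{\hbar^2}{2m}\nabla_i^2 + V_{\text{eff}}(\mathbf{r}_i) \right). $$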

The idea of the wave function 'collapsing' really came out of the Copenhagen interpretation of QM as subsequently developed by von Neumann. That interpretation says that when you measure a given property of a particle in some arbitrary state, you get an answer that is one of the eigenvalues of the operator representing the property being measured, and that as a consequence of the measurement the wave function of the particle changes to become the eigenfunction associated with that eigenvalue. Following that prescription allows you to make calculations that agree with experiment, but if you think about it for any length of time you will realise that it is a mathematical idealisation, not reality. So in that sense, at least, the wave function is just a mathematical entity that figures in QM models of physical systems.
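As a concrete rendering of that prescription (a minimal sketch; the observable and initial state are arbitrary illustrative choices):

```python
import numpy as np

# Von Neumann measurement prescription as described above: measuring an
# observable A on state |psi> yields eigenvalue a_k with probability
# |<a_k|psi>|^2 (Born rule), after which |psi> is replaced by |a_k>.
rng = np.random.default_rng(seed=1)

A = np.array([[0.0, 1.0],      # observable: Pauli-X, eigenvalues +1 and -1
              [1.0, 0.0]])
psi = np.array([1.0, 0.0])     # state |0>, an equal superposition of the
                               # X eigenstates

eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.conj().T @ psi) ** 2   # Born-rule probabilities
probs /= probs.sum()                          # guard against rounding drift

k = rng.choice(len(eigvals), p=probs)         # random outcome
print("measured eigenvalue:", eigvals[k])
psi = eigvecs[:, k]                           # the postulated 'collapse'
print("post-measurement state:", psi)
```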

As for whether the wave function represents something that really is wavelike... one of the problems with that view is that wave functions in QM are complex-valued functions, and the wave function of a system of multiple particles is a function on a multidimensional mathematical space, so it is hard to see how it could correspond to a 'real' wave in 3D space.
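Explicitly, for $N$ particles the wave function is a single complex-valued function on $3N$-dimensional configuration space,

$$ \Psi(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N, t) : \mathbb{R}^{3N} \times \mathbb{R} \to \mathbb{C}, $$

not $N$ separate waves in ordinary 3D space.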

Given all that, the idea of an instantaneous 'collapse' of the wave function seems to be just a consequence of the simplifying assumptions built into QM in the early days, in particular the assumption that a measuring device is a purely classical object and that measurement causes jumps in the quantum state of the object being measured. It should be obvious that a measuring device is itself a collection of quantum particles, and therefore that a proper theory of measurement needs to take into account all the interactions between the particles that comprise the measuring device and the particles that comprise the measured object. Without working through all the details, you should be able to see that a measurement could come about as a gradual interaction (i.e. with no 'collapse') between all the particles.
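A standard way to display that gradualness is the von Neumann premeasurement step (written schematically, with $|A_{\text{ready}}\rangle, |A_0\rangle, |A_1\rangle$ denoting apparatus pointer states): a purely unitary interaction correlates the object with the device,

$$ \bigl(\alpha|0\rangle + \beta|1\rangle\bigr)\,|A_{\text{ready}}\rangle \;\longrightarrow\; \alpha\,|0\rangle|A_0\rangle + \beta\,|1\rangle|A_1\rangle, $$

and this entangling unfolds continuously over the interaction time; nothing in it is instantaneous.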

Marco Ocram