Reading a quantum mechanics textbook, we are told that the wave function for a definite position $a$ is $\psi(x)=\delta(x-a)$. Yet we are also told that the total probability must satisfy $\int|\psi(x)|^2\,dx = 1$. But $\int_{-\infty}^{\infty} |\delta(x-a)|^2\, dx = \delta(0) = \infty$. (In fact, squaring a delta function is not even allowed, since it is a distribution.)
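To make the divergence concrete, here is a quick numerical check using a Gaussian "nascent delta" $\delta_\epsilon(x)$ of width $\epsilon$, which tends to $\delta(x)$ as $\epsilon\to 0$ (the function name `nascent_delta` is my own label, not standard notation):

```python
import numpy as np

def nascent_delta(x, eps):
    # Gaussian of width eps, normalized so its integral is 1;
    # tends to delta(x) as eps -> 0.
    return np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

dx = 1e-5
x = np.arange(-1, 1, dx)
for eps in (0.1, 0.01, 0.001):
    psi = nascent_delta(x, eps)
    norm = np.sum(psi) * dx        # integral of psi: stays 1
    norm_sq = np.sum(psi**2) * dx  # integral of psi^2: equals 1/(2*eps*sqrt(pi))
    print(eps, norm, norm_sq)
```

The second column stays at $1$ while the third grows like $1/(2\epsilon\sqrt{\pi})$, i.e. $\int|\delta_\epsilon|^2\,dx \to \infty$ as $\epsilon \to 0$, exactly the divergence complained about above.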
Hence it seems like we really want $\psi(x)=\sqrt{\delta(x-a)}$ or something similar, perhaps (formally) $\psi(x) = \frac{\delta(x-a)}{\sqrt{\delta(0)}}$. Or should we restrict $x$ to a periodic dimension?
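One can probe this formal proposal by regularizing: take $\psi_\epsilon = \sqrt{\delta_\epsilon}$, where $\delta_\epsilon$ is a Gaussian nascent delta of width $\epsilon$ (the helper name `sqrt_delta` below is mine). Each $\psi_\epsilon$ is perfectly normalized, but the family converges weakly to zero rather than to any distribution, which is one way to see why a "square root of a delta" has no limit within distribution theory:

```python
import numpy as np

def sqrt_delta(x, eps):
    # Square root of a Gaussian nascent delta of width eps:
    # a stand-in for the formal sqrt(delta(x)).
    return (np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))) ** 0.5

dx = 1e-5
x = np.arange(-1, 1, dx)
for eps in (0.1, 0.01, 0.001):
    psi = sqrt_delta(x, eps)
    norm_sq = np.sum(psi**2) * dx  # = integral of delta_eps = 1 for every eps
    overlap = np.sum(psi) * dx     # pairing with the test function f(x) = 1
    print(eps, norm_sq, overlap)
```

The norm stays $1$ for every $\epsilon$, but the pairing $\int \psi_\epsilon f\,dx$ (shown here for $f \equiv 1$) shrinks like $\sqrt{\epsilon}$, so the would-be limit is the zero distribution, not a usable wave function.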
In fact, how can we use the Dirac delta function as a wave function at all when it gives a nonsensical probability? (And QM is, after all, based on probabilities.)
Is there a mathematically rigorous way out of this conundrum? I express the flaw as follows:
1. We assume $\psi$ is a delta function when the position is known with certainty.
2. To get probabilities we take the absolute square of the wave function.
3. Distributions such as the delta function cannot be squared.
4. Point 3 contradicts point 2.
To my eyes, it seems that quantum mechanics would require a mathematics of square roots of distributions. The same problem occurs in QFT if we want the wave functional $\Psi[\phi]$ of a definite-valued field.