Suppose I have a relativistic wavefunction for some massive particle, described by the Dirac equation, and suppose the particle is free of interactions.
Let us suppose that the probability density takes the shape of a spatially localized wavepacket, say a spherical Gaussian (or any localized distribution) with a spatial dispersion $\sigma$ (the standard deviation in the Gaussian case). I already know that the Dirac equation is Lorentz covariant, but would $\sigma$ shrink under a Lorentz boost?
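For orientation, my naive expectation from classical length contraction would be that a boost with velocity $v$ along $z$ contracts the packet along the boost axis and leaves the transverse widths alone,
$$\sigma_\parallel' = \frac{\sigma}{\gamma}, \qquad \sigma_\perp' = \sigma, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},$$
but I do not know whether this carries over to $\psi^\dagger\psi$, which is the time component of the conserved current $j^\mu = \bar\psi\gamma^\mu\psi$ rather than a scalar.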
The total wavefunction does not have to stay localized, but I would like to know whether the envelope of high-density points of the probability distribution is invariant under Lorentz boosts.
I guess I will have to find a $\psi$ as a sum of plane-wave solutions of the Dirac equation such that $$\psi^\dagger\psi\propto \exp\left(-\frac{|x|^2}{2\sigma^2}\right)$$ (if that is even possible), and then see how this distribution changes under Lorentz boosts. But this already seems complicated, since it is not as simple as substituting $x\to x'$: the Gaussian is a superposition of many plane waves with different frequencies. Maybe another distribution would be easier to handle? Any feedback?
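To make this concrete, here is a rough numerical sketch of what I have in mind, reduced to 1+1 dimensions (a toy model; the mass, widths, and velocity are hypothetical values, and I use one common representation, $\alpha = \sigma_x$, $\beta = \sigma_z$). Instead of boosting the spinors directly, it uses the fact that $\rho = \psi^\dagger\psi$ is the time component of $j^\mu = \bar\psi\gamma^\mu\psi$, so the density on the boosted equal-time slice is $\rho'(x') = \gamma\,\big(j^0 - v\,j^1\big)$ evaluated at $(t, x) = (\gamma v x', \gamma x')$:

```python
import numpy as np

# Toy 1+1D Dirac packet, natural units hbar = c = 1.
# Representation (one common choice): alpha = sigma_x, beta = sigma_z.
# Positive-energy spinor: u(p) = (E + m, p)^T / sqrt(2 E (E + m)).
m = 1.0          # mass (hypothetical value)
sigma_p = 0.05   # momentum width, so sigma_x ~ 1/(2 sigma_p) = 10
v = 0.6          # boost velocity
gam = 1.0 / np.sqrt(1.0 - v**2)

p = np.linspace(-1.0, 1.0, 1500)   # momentum grid, 20 sigma_p wide
dp = p[1] - p[0]
E = np.sqrt(p**2 + m**2)
g = np.exp(-p**2 / (4.0 * sigma_p**2))      # Gaussian momentum weight
nrm = 1.0 / np.sqrt(2.0 * E * (E + m))
u1, u2 = (E + m) * nrm, p * nrm             # spinor components, u^dag u = 1

def psi(t, x):
    """psi(t, x) = sum_p g(p) u(p) exp(i (p x - E t)) dp, pointwise in (t, x)."""
    phase = np.exp(1j * (np.outer(x, p) - np.outer(t, E)))
    return phase @ (g * u1) * dp, phase @ (g * u2) * dp

def rms_width(x, rho):
    dx = x[1] - x[0]
    z = np.sum(rho) * dx
    mu = np.sum(x * rho) * dx / z
    return np.sqrt(np.sum((x - mu)**2 * rho) * dx / z)

x = np.linspace(-60.0, 60.0, 2000)

# Rest frame: density on the t = 0 slice.
a1, a2 = psi(np.zeros_like(x), x)
s0 = rms_width(x, np.abs(a1)**2 + np.abs(a2)**2)

# Boosted frame: the t' = 0 slice corresponds to (t, x) = (gam*v*x', gam*x'),
# and the density transforms as a 4-vector component: rho' = gam*(j0 - v*j1).
b1, b2 = psi(gam * v * x, gam * x)
j0 = np.abs(b1)**2 + np.abs(b2)**2
j1 = 2.0 * np.real(np.conj(b1) * b2)        # psi^dagger sigma_x psi
s1 = rms_width(x, gam * (j0 - v * j1))

print(f"rest-frame width     : {s0:.3f}")
print(f"boosted-frame width  : {s1:.3f}")
print(f"contracted prediction: {s0/gam:.3f}")
```

For a narrow momentum spread ($\sigma_p \ll m$, so wavepacket spreading is negligible over the sampled times) I would expect this to print a boosted width close to $\sigma_x/\gamma$, if I have set it up correctly; the point of the sketch is just that the superposition can be handled numerically without boosting it mode by mode.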
Some people suggest that the single-particle picture is lost, but would this happen for free particles?
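My tentative reasoning for the free case (units with $c=1$): an orthochronous boost cannot flip the sign of an on-shell energy, since for $p^0 = \sqrt{|\vec p|^2 + m^2} > |\vec p|$ and $|\vec v| < 1$,
$$p'^0 = \gamma\left(p^0 - \vec v\cdot\vec p\right) \ge \gamma\, p^0 \left(1 - |\vec v|\right) > 0,$$
so a packet built purely from positive-energy plane waves should stay purely positive-energy in every frame. Is that enough to keep the single-particle picture for a free particle?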