Answers to questions like this discuss how real measurements retain uncertainty due to the limitations of our instruments.
Is this uncertainty quantum or classical?
If it's classical, i.e., the wave function truly collapses into a single eigenstate, but due to our instrumentation we cannot detect exactly which one and must use statistics (mixed states) to model our classical ignorance, as suggested here, then this seems highly unphysical. It would imply, for example, that after observing a particle's momentum, even with measurement uncertainty about which momentum eigenstate resulted, the particle truly is (ignoring boundary conditions) in a single momentum eigenstate, and hence delocalized with equal probability across all space.
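To make the worry concrete, here is a sketch of the classical-ignorance picture as I understand it, where $P(p)$ is whatever distribution models the instrument's error:

```latex
% Classical-ignorance description: a statistical mixture of exact momentum eigenstates
\rho = \int dp\, P(p)\, |p\rangle\langle p|
% Yet every component of the mixture is fully delocalized in position:
% \langle x | p \rangle = \frac{e^{ipx/\hbar}}{\sqrt{2\pi\hbar}},
% so |\langle x | p \rangle|^2 = \frac{1}{2\pi\hbar} \quad \text{for all } x.
```

The mixture only encodes ignorance about *which* eigenstate; whichever one it is, its position distribution is uniform over all space.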
This is, of course, not what we experience in day-to-day life. I can observe the momentum of particles and, even with tremendous measurement uncertainty, not see them immediately delocalize across all space, as would be expected if my observation truly collapsed the wave function to a single momentum eigenstate.
If this uncertainty is quantum, and observations in reality only sharpen wave functions around a particular eigenstate while retaining contributions from "nearby" eigenstates, then it would seem we need new operators corresponding to this sharpening phenomenon. Projecting with our existing operators, which collapse wave functions to single eigenstates, doesn't give us the sharpening we're looking for. How do we modify our operators so that the Born rule and the other postulates of QM still apply, while making measurement consistent with these sharpened but still multi-eigenstate states?
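For what it's worth, the kind of "sharpening" operator I have in mind might look something like a Gaussian-smeared version of a projector. This is only a sketch, with $\sigma$ a resolution parameter I am introducing for illustration:

```latex
% Hypothetical "sharpening" operator centered on outcome p_0, with width \sigma:
M_{p_0} = (2\pi\sigma^2)^{-1/4} \int dp\, e^{-(p-p_0)^2/(4\sigma^2)}\, |p\rangle\langle p|
% Post-measurement state (no longer a single eigenstate):
|\psi'\rangle = \frac{M_{p_0}|\psi\rangle}{\sqrt{\langle\psi| M_{p_0}^\dagger M_{p_0} |\psi\rangle}}
% Candidate Born-rule-like outcome density, using the completeness relation
% \int dp_0\, M_{p_0}^\dagger M_{p_0} = \mathbb{1}:
\Pr(p_0) = \langle\psi| M_{p_0}^\dagger M_{p_0} |\psi\rangle
```

Is something like this the right way to formalize it, and if so, how does it square with the textbook projection postulate?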