
Please clarify this confusion I have:

My understanding of quantum mechanics is this:

"Modern interpretations of the Heisenberg uncertainty principle state that the uncertainties in certain quantities in the quantum regime do not arise from disturbing a system via measurement. These uncertainties are inherent: they are actually properties of the quantities themselves. Quantum physics states that, whether we look/measure or not, some quantities simply have no definite value (for example, the position of a particle isn't a single number; it is a probabilistic random variable).

Let us take the example of a particle localised at a point. The Heisenberg uncertainty principle says that the momentum of this particle is then undefined. Intuitively, the momentum should be a single number, because the state should have one value of its momentum. But against this intuition, we know that the momentum now has a large uncertainty associated with it, which means the momentum of the state is no longer fixed; in effect, we cannot speak of a single entity called momentum."

If this understanding of mine is correct, please also clarify this:

My question is: in the example mentioned, is the momentum distribution's uncertainty a large number (an infinity), and is it also true that the average value of the momentum distribution is infinite?

Since $\langle P^2\rangle - \langle P\rangle^2$ is the square of the standard deviation of $P$, what can we say about each of these terms?

Is it right to conclude that $\langle P^2\rangle$ is infinite but $\langle P\rangle$ is not?
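As a numerical sketch of what I am asking (my assumptions, not from any textbook: a Gaussian wavepacket centred at $x = 0$ with position spread $\sigma_x$, in units with $\hbar = 1$, whose momentum-space density is then a Gaussian with spread $\sigma_p = \hbar/(2\sigma_x)$), one can check that $\langle P\rangle$ stays at zero while $\langle P^2\rangle$ grows without bound as $\sigma_x \to 0$:

```python
import numpy as np

HBAR = 1.0  # work in units where hbar = 1

def momentum_moments(sigma_x):
    """Return <P> and <P^2> for a Gaussian wavepacket of position spread sigma_x.

    The momentum-space probability density |phi(p)|^2 of such a packet is a
    Gaussian centred at p = 0 with spread sigma_p = hbar / (2 sigma_x).
    The moments are computed by direct numerical integration on a grid.
    """
    sigma_p = HBAR / (2.0 * sigma_x)
    p = np.linspace(-20.0 * sigma_p, 20.0 * sigma_p, 40001)  # symmetric grid
    dp = p[1] - p[0]
    rho = np.exp(-p**2 / (2.0 * sigma_p**2))
    rho /= rho.sum() * dp            # normalise so the density integrates to 1
    mean_p = (p * rho).sum() * dp    # <P>: zero by symmetry
    mean_p2 = (p**2 * rho).sum() * dp  # <P^2>: equals sigma_p^2 = 1/(4 sigma_x^2)
    return mean_p, mean_p2

for sigma_x in (1.0, 0.1, 0.01):
    m1, m2 = momentum_moments(sigma_x)
    print(f"sigma_x = {sigma_x:5.2f}:  <P> = {m1:+.3e},  <P^2> = {m2:.3e}")
```

Squeezing $\sigma_x$ by a factor of 10 multiplies $\langle P^2\rangle$ by 100 while $\langle P\rangle$ remains zero, so only $\langle P^2\rangle$ diverges in the point-localised limit.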

SX849
    What is the difference between the uncertainty being large and the possible measured values being large? The definition of uncertainty is the standard deviation of the potential values, so I don't really understand what distinction is being made here. – ACuriousMind Oct 31 '22 at 16:31
  • I think what I mean is: if a quantity's distribution has a large standard deviation, does that necessarily mean its average value is also large? – SX849 Oct 31 '22 at 16:37
  •
    No, of course not (think of a Gaussian centered at zero). Why would the average value have anything to do with the standard deviation, and what specifically does this have to do with Heisenberg uncertainty? If you're asking about the logic of a common elementary exercise where we seem to use the uncertainty to argue about the actual value, see e.g. https://physics.stackexchange.com/q/679774/50583 – ACuriousMind Oct 31 '22 at 16:52
  • Thanks for the reference! I have understood the point. Just to confirm it, I wish to ask: say a particle's wavefunction has been localised almost to a point (not by an experiment measuring its position, but simply through the effect of the Hamiltonian the system is in). Due to the uncertainty principle, even though we have not measured the position, the momentum of the wave will be a hazy number (that is, its distribution will have a very large standard deviation). In this case, it is not necessary that the average value of the momentum be large. – SX849 Oct 31 '22 at 17:22
  •
    One can measure the momentum and position of one quantum perfectly well at the same time. What one cannot do is prepare a quantum mechanical ensemble of quanta in which both momentum and position are sharp at the same time. This is clearly not a problem introduced by the measurement. The trouble for the human mind is that quantum mechanics forces us to think differently about individual events and the ensemble of those events. We are not used to that from classical mechanics. – FlatterMann Oct 31 '22 at 18:35
  • Thank you @FlatterMann for your clarification. I have understood the concept well now. – SX849 Oct 31 '22 at 18:44
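The "Gaussian centered at zero" point from the comments can be sketched numerically (a hypothetical illustration, not from the thread: the distribution parameters `loc=0.0` and `scale=100.0` are my own choices): a distribution's mean and its standard deviation are independent quantities, so a huge spread is perfectly compatible with a zero average.

```python
import numpy as np

# Draw many samples from a zero-mean Gaussian with a large spread,
# mimicking a momentum distribution that is very uncertain yet centred at 0.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=100.0, size=1_000_000)

print(samples.mean())  # close to 0: the average momentum is small
print(samples.std())   # close to 100: the uncertainty is large
```

A large standard deviation (large $\Delta p$) therefore says nothing about $\langle P\rangle$ being large; only $\langle P^2\rangle = \langle P\rangle^2 + (\Delta p)^2$ is forced to be large.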

0 Answers