
I would like to ask if anyone has found a tight enough way to define the term "quantum fluctuation" so that it can become a useful rather than a misleading piece of physics terminology.

Terminology in science serves, I guess, two main purposes: 1. as an aid to understanding as we learn, 2. to enable us to communicate fluently with one another. It is the first that concerns me here.

Let me first deal with the second issue, to get it out of the way. If one expert in quantum theory uses the phrase "quantum fluctuation" in the course of a conversation with another expert, then they probably both have in mind something about the non-zero value of kinetic energy in the ground state of motion, or the non-zero range of values of some observable, or the mathematics of quantum field theory, or something like that. They may be interested in symmetry-breaking, or phase transitions. The terminology "quantum fluctuation" doesn't matter since they both know what they are talking about in much more precise mathematical terms. They won't be misled by the phrase any more than someone working in QCD would be misled by terminology such as "charm" and "colour".

Now let's return to the first issue, which is the heart of my question. Is this phrase well-chosen? Does it help people to get good understanding, or does it tend to mislead? Does it give good physical intuition? Does it mean something other than "quantum spread"? Can we find a sufficiently tight definition of "quantum fluctuation" so that it becomes more helpful than it is misleading?

The following web article illustrates the use of "quantum fluctuation" as an attempt to describe quantum field theory for the non-expert: https://profmattstrassler.com/articles-and-posts/particle-physics-basics/quantum-fluctuations-and-their-energy/ Language such as "jitter" is freely employed. Such attempts show that the term invites the student to form a physical picture of something dynamically moving around.

Countless books and lectures talk freely about "bubbling and rippling" etc. etc.

Here are my reservations:

  1. In so-called "zero point motion" there is no motion. There is kinetic energy, admittedly, and $\langle \psi | \hat{x}^2 | \psi \rangle \ne 0$, but in my mind's eye I don't see the oscillator "fluctuating" so much as just sitting there, not moving to and fro but simply spread out. It is not like a classical standing wave, where the oscillating string or whatever moves up and down, and it is not like a thing moving to and fro in a potential well. The electron in the ground state of hydrogen is not fluctuating.

  2. In ordinary English the word "fluctuation" refers to something dynamic, involving change as a function of time. But in the case of a time-independent Hamiltonian, there is no dynamic change for a system in an energy eigenstate, except for the unobservable global phase. This assertion applies as much to field states in quantum field theory as it does to non-relativistic quantum mechanics. So where are these famous "vacuum fluctuations"? Note again, my question does not concern correct mathematical treatment of field theory. It concerns whether the term "fluctuation" is well-chosen to give good physical understanding.

  3. The virtual particles that appear in Feynman diagrams are not fluctuations; they are terms in a series of integrals whose sum describes a smooth and non-fluctuating scattering process.

  4. Classical phase transitions may be said to be "driven" by thermal fluctuations. That's fine; it is an example of a dynamic change, a cause and effect. But if someone says that a quantum phase transition is "driven" or "caused" by quantum fluctuation (and I think people do say that), then what exactly are they saying?

  5. Spontaneous emission by atoms manifests the coupling between the atom and the electromagnetic field in its vacuum state, and demonstrates that the field is neither absent nor without physical effect. Since there is a stochastic character to the clicks when the emitted photons are detected by ordinary detectors, one might loosely ascribe the randomness in the timing of the detection events to a randomness in the behaviour of the field in its vacuum state. Is this perhaps what "quantum fluctuation" means?
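Point 1 can be made concrete numerically. The following is a minimal sketch of my own (not from any answer below), in natural units $\hbar = m = \omega = 1$: the harmonic-oscillator ground state has $\langle x \rangle = 0$ but $\langle x^2 \rangle = 1/2$, and its probability density has no time dependence at all, only a global phase.

```python
import numpy as np

# Ground state of the 1D harmonic oscillator (units with hbar = m = omega = 1):
# psi_0(x) = pi**(-1/4) * exp(-x**2 / 2).  The state evolves only by a global
# phase, yet <x^2> is nonzero: "spread" without "motion".
x = np.linspace(-10, 10, 100001)
dx = x[1] - x[0]
psi = np.pi**-0.25 * np.exp(-x**2 / 2)
prob = psi**2                       # time-independent probability density

norm = prob.sum() * dx              # should be ~1
mean_x = (x * prob).sum() * dx      # <x>  = 0 by symmetry
mean_x2 = (x**2 * prob).sum() * dx  # <x^2> = 1/2, the zero-point spread

print(norm, mean_x, mean_x2)
```

Nothing here "jitters": the density $|\psi(x)|^2$ is a fixed function of $x$, and the nonzero $\langle x^2 \rangle$ is a property of that fixed shape.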

I do want an answer if there is one; I don't intend merely to generate discussion. But I think my question may be hard to answer because it touches on unresolved interpretational issues to do with the quantum measurement problem, and on the physics of symmetry breaking.

Related questions.

This question, Quantum Fluctuations of the vacuum, gives a link to a video showing a computer simulation of vacuum fluctuations in a lecture by David Tong. What exactly is being plotted in the simulation there?

This question, Understanding quantum fluctuations, is similar to mine but different enough to warrant a new one.

I found this question and its discussion helpful:

What is spontaneous symmetry breaking in QUANTUM systems?

Qmechanic
Andrew Steane

4 Answers


Thanks for all who posted answers or comments so far. Having learned from them, I thought it might be helpful to post my own answer:

Definition. Quantum fluctuation refers to the fact that when a physical system is prepared in a given state and some property or quantity is measured, and this procedure is repeated many times, then the measured outcomes may fluctuate even though the system was prepared each time in the same state.

This definition captures what I think is the usage in professional physics. It covers things like the quantum contribution to electrical noise, and I think it covers the usage in thermal physics and the study of phase transitions. (In the latter I think the word 'fluctuation' is a shorthand for the quantum 'spreading' through superposition, when the system state is not an eigenstate of observables such as correlation functions.)
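This definition can be illustrated with a toy simulation (my own sketch, using an arbitrarily chosen qubit example): prepare exactly the same state $|+\rangle = (|0\rangle + |1\rangle)/\sqrt{2}$ on every trial and measure $\sigma_z$ each time. The preparation never varies, yet the recorded outcomes do.

```python
import numpy as np

# Same preparation every trial, fluctuating outcomes: the state |+> is fixed,
# and the Born rule gives P(+1) = |<0|+>|^2 = 1/2 for a sigma_z measurement.
rng = np.random.default_rng(0)

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # identical preparation each run
p_up = abs(plus[0])**2                     # Born-rule probability of +1

outcomes = rng.choice([+1, -1], size=10_000, p=[p_up, 1 - p_up])
print(outcomes.mean(), outcomes.var())     # mean near 0, variance near 1
```

The only "fluctuation" in sight is in the measurement record; nothing in the preparation, or in the unitary evolution between trials, varies at all.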

However, there is another usage, widely employed in popular presentations of physics, where the phrase 'quantum fluctuation' is used as an attempt to help non-experts get a feel for the physics of quantum fields, including the lowest-energy state, called the vacuum state. The aim is to convey the richness of these fields, and the fact that they influence events even when we might naively say 'there is nothing there'. Unruh radiation is a good example. However, as I understand it, when these fields are in their vacuum state they are never themselves causal. Rather, they mediate phenomena whose cause is something else, such as an incoming real particle, or a force which caused something to accelerate. This statement applies to vacuum polarisation too, because that is a statement about the interaction of the electromagnetic and Dirac fields, and is described by Feynman diagrams having two external photon lines.

The upshot of all this is, then, two meanings: one as stated in the definition above, and another which is not really about fluctuation at all, but rather simply a way to encourage people to marvel at the richness and subtlety of quantum field theory and of empty space. Unfortunately this honest attempt to help has resulted in a large amount of nonsense being mixed in with the good sense. It is an ongoing project to find better ways of conveying good physical intuition about quantum field theory, whether for the expert or non-expert.

Andrew Steane

I empathize with your discomfort with the term "quantum fluctuation". I also don't find it to be a helpful term. Largely this is because of your point 2. In English "fluctuations" refers to something dynamical, whereas in the mathematical formalism for quantum mechanics there is nothing which "fluctuates". The evolution is unitary (smooth) and deterministic.

The best translation I have come up with is that "quantum fluctuations" is code for "I will get different outcomes if I measure multiple times". In that case anything having to do with "quantum fluctuations" is no more exotic than the measurement of a system which is in a superposition of the measurement basis states. Of course, superposition and measurement are already quite exotic and confusing, so it is still a difficult and controversial topic, I think.

However, I must admit that this "dislike" of the term "quantum fluctuations" and my translation of the term is non-conventional. Unfortunately I think this means that the usage of the term is somewhat user dependent and as a result I don't think you'll get a good conclusive answer to this question.

Jagerber48
    Good answer! Your dislike of the term "quantum fluctuations" is not non-conventional at all. Every actual physicist that I know dislikes the popular interpretation of this terminology; they use these words for historical reasons, but they know that there is nothing that actually fluctuates in the dynamics of QM. – AccidentalFourierTransform Nov 15 '18 at 17:47

I understand your concern. I believe the reason for this terminology has to be understood historically: it was meant to denote something different from classical (thermal) fluctuations. Once one remembers this, I think the term achieves its purpose (i.e., your point 1).

The one thing one has to realize is that there are no "fluctuations" classically at zero temperature. Consider a classical spin model on a graph. $\sigma$ is a configuration of spins ($0$ or $1$) on this graph. The classical Hamiltonian is a function of $\sigma$, $E(\sigma)$. At zero temperature the system is in the state of minimum energy; call it $\sigma_0$. In other words, we can say the system is in the state $\rho$ with

$$ \rho_{\sigma,\sigma'}= \delta_{\sigma,\sigma_0} \delta_{\sigma',\sigma_0}. $$

(I'm using a notation that is also valid quantum mechanically.) Clearly the state is diagonal (it's classical), but it is also pure, or in other words extremal (a Kronecker delta function). The state is "frozen" in the configuration $\sigma_0$. Intuitively there are no "fluctuations", i.e., no other configurations contributing to the state. How do we measure this?

Any classical observable $A$ is also diagonal in $\sigma$. Computing averages with the above state one has

$$ \Delta A^2 := \langle A^2 \rangle - \langle A \rangle^2 = 0. \tag{1} $$

Indeed the two facts are equivalent (being an extremal and having zero fluctuations for any observable).

If we now raise the temperature, at equilibrium the state of the system is

$$ \rho_{\sigma,\sigma'}= \delta_{\sigma,\sigma'} e^{-\beta E(\sigma)}/Z, $$

with $Z$ the partition function. Clearly Eq. (1) will no longer hold in general, and we can have a phase transition as we raise the temperature. We say (colloquially) that this phase transition is driven by thermal fluctuations.

Now you see the reason for the term "quantum fluctuations". Quantum mechanically Eq. (1) is in general violated also at zero temperature.
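This contrast can be checked in a minimal single-spin sketch (my own choice of model, not part of the argument above): classically, the zero-temperature state is frozen in one configuration, so every diagonal observable satisfies Eq. (1); quantum mechanically, the ground state of a transverse-field Hamiltonian such as $H = -\sigma_x$ is a superposition of configurations, so the diagonal observable $A = \sigma_z$ violates Eq. (1) even at $T = 0$.

```python
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])   # a "classical" (diagonal) observable

# Classical T = 0: density matrix frozen in the configuration |0>
rho_cl = np.diag([1.0, 0.0])
var_cl = np.trace(rho_cl @ sz @ sz) - np.trace(rho_cl @ sz)**2

# Quantum T = 0: ground state of H = -sigma_x is |+> = (|0> + |1>)/sqrt(2)
evals, evecs = np.linalg.eigh(-sx)
ground = evecs[:, 0]                       # eigenvector of the lowest eigenvalue
rho_q = np.outer(ground, ground.conj())
var_q = np.trace(rho_q @ sz @ sz) - np.trace(rho_q @ sz)**2

print(var_cl, var_q)                       # classical: zero, quantum: nonzero
```

The quantum ground state assigns weight to more than one configuration, which is exactly the sense in which Eq. (1) fails at zero temperature.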

lcv
    Thanks for this; I completely agree that the historic connection to phase transitions makes it clear why the term was adopted. – Andrew Steane Nov 15 '18 at 20:55
  • Thanks, I haven't really talked about phase transitions which generically require the thermodynamic limit and a heavier math. But I think the essence can be understood in finite dimension and simply boils down to what I wrote. – lcv Nov 16 '18 at 00:22

To me it is just a non-technical way of saying $\langle O^2\rangle\ne\langle O\rangle^2$ for some relevant observable $O$.
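A quick numerical check of this reading (my own sketch): take $O = \sigma_z$ and the state $\cos t\,|0\rangle + \sin t\,|1\rangle$. Then $\langle O^2 \rangle = 1$ always, while $\langle O \rangle^2 = \cos^2 2t$, so the two agree only when the state is an eigenstate of $O$.

```python
import numpy as np

sz = np.array([[1.0, 0.0], [0.0, -1.0]])

# <O^2> - <O>^2 = sin^2(2t): zero only for an eigenstate of O (t = 0 or pi/2)
for t in [0.0, np.pi / 8, np.pi / 4]:
    psi = np.array([np.cos(t), np.sin(t)])
    mean_O = psi @ sz @ psi
    mean_O2 = psi @ sz @ sz @ psi
    print(t, mean_O2 - mean_O**2)
```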

G. Smith