
If a system of particles is bound, then it has negative energy relative to the same system disassembled into its separated parts. In the nonrelativistic limit, this negative energy is small compared to the sum of the masses of the constituent particles, so the mass of the bound system is still positive.
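(Concretely, the mass of the bound system is $Mc^2 = \sum_i m_i c^2 + E_\text{binding}$ with $E_\text{binding} < 0$; for the deuteron, for example, $E_\text{binding} \approx -2.2\ \text{MeV}$ against $m_p c^2 + m_n c^2 \approx 1878\ \text{MeV}$, so $M$ stays comfortably positive.)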

But relativistically, there is no obvious reason why this has to be true. For example, the electromagnetic charge radius of the pion is about 0.7 fm. A particle-in-a-box calculation for two massless particles in a box of this size gives a kinetic energy of about 1500 MeV, but the observed mass of a pion is about 130 MeV, which suggests an extremely delicate near-cancellation between the positive kinetic energy and the negative potential energy. I see no obvious reason why this couldn't have gone the other way, with the mass coming out negative.
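(Roughly: for a massless particle in a one-dimensional box of width $L$, the ground state has $p = \pi\hbar/L$, so two such particles give $$E_\text{kin} \approx \frac{2\pi\hbar c}{L} \approx \frac{2\pi \times 197\ \text{MeV}\cdot\text{fm}}{0.7\ \text{fm}} \approx 1800\ \text{MeV};$$ the precise number depends on exactly how the box is set up, but it comes out at this order of magnitude.)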

Is one of the following correct?

  1. Some general mechanism in QFT prevents negative masses.

  2. Nothing in QFT prevents negative masses, but something does guarantee that there is always a lower bound on the energy, so that a vacuum exists. If pions could condense from the spontaneous creation of quark-antiquark pairs, then we would just redefine the vacuum.

  3. Nothing guarantees a lower bound on energy. The parameters of the standard model could be chosen in such a way that there would be no lower bound on energy. We don't observe that our universe is that way, so we adjust the parameters so it doesn't happen.

If 1, what is the mechanism that guarantees safety?

If 2, what is it that guarantees that we can successfully redefine the vacuum? In the pion example, pions are bosons, so it's not like we can fill up all the pion states.

If 3, is this natural? Do we need fine-tuning?

Or are there no general protections, only mechanisms that protect against negative mass in particular cases? E.g., in the case of a Goldstone boson, we naturally get zero mass. Do the perturbations that then make the mass nonzero always make it positive?

Related: Is negative mass for a bound system of two particles forbidden?


3 Answers


It's not #1, because it's easy to write down QFTs which are unstable against pair production. One simple example is $$\mathcal{L} = \frac12 (\partial_\mu \phi)^2 + \frac12 m^2 \phi^2,$$ which corresponds to particles with negative $m^2$. Since $E^2 = p^2 + m^2$, low-momentum modes have $E^2 < 0$, and it is energetically favorable to produce arbitrarily many particles. (Note this corresponds to imaginary mass, not negative mass.) There's no vacuum state at all, so #2 doesn't hold either.
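To make the "no vacuum state" claim concrete: with $m^2 > 0$ as written above, the potential read off from this Lagrangian is $$V(\phi) = -\tfrac12 m^2 \phi^2 \;\longrightarrow\; -\infty \quad \text{as } |\phi| \to \infty,$$ so the classical energy $$H = \int d^3x \left[\tfrac12 \dot\phi^2 + \tfrac12 (\nabla\phi)^2 + V(\phi)\right]$$ is unbounded below, and there is nothing for the theory to settle into.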

For a scalar field theory, as long as we have $$\mathcal{L} = \frac12 (\partial_\mu \phi)^2 - V(\phi)$$ where the potential $V(\phi)$ is bounded below, then we will have a vacuum state, simply because the Hamiltonian is bounded below. Heuristically, if we start, say, at a maximum of the potential rather than a minimum, particle production will occur until the vacuum expectation value of the field is shifted to the minimum, which is our true vacuum state. Another way to phrase this is that the particles produced interact with each other (noninteracting particles correspond to a quadratic potential, which cannot have both a maximum and minimum), so as this process goes on it becomes less energetically favorable to create more particles, and the process stops at the true vacuum. All of this is a simplified description of what happened to the Higgs field at some point in the early universe.
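A textbook caricature of this: take $$V(\phi) = -\tfrac12 \mu^2 \phi^2 + \tfrac14 \lambda \phi^4, \qquad \mu^2, \lambda > 0.$$ The point $\phi = 0$ is a maximum, while the minima sit at $\phi = \pm v$ with $v = \mu/\sqrt{\lambda}$. Expanding about a minimum, $\phi = v + h$, gives $V = V(v) + \mu^2 h^2 + \dots$, so fluctuations about the true vacuum have mass squared $2\mu^2 > 0$ even though the quadratic term at the origin has the "wrong" sign; the quartic interaction is what stops the runaway.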


Since it's essentially #3, I suppose your real question boils down to: how do people enforce the existence of a stable vacuum in the first place? Well, if you work on formal QFT, you simply assume it. That's called the spectrum condition, and it's a reasonable requirement to make of any field theory, like assuming that a spacetime is time orientable in general relativity.

If you're a model builder adding stuff to the Standard Model, there are a couple things you can use:

  • if your new particles are weakly coupled, the energies are just $E = \sqrt{p^2 + m^2}$ plus small corrections, so you can essentially read off the result from the potential (this covers most papers)
  • if your new particles are weakly coupled but the above treatment is too loose, you can try computing the effective potential perturbatively, e.g. the Coleman-Weinberg potential (sketched just after this list)
  • if your new particles are strongly coupled but analogous to QCD, we assume it's fine because QCD has a stable vacuum, e.g. whenever any paper says "consider a confining hidden sector"
  • if your theory has spontaneously broken supersymmetry, the vacuum energy density is positive
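For the second bullet, the one-loop (Coleman-Weinberg) effective potential has the schematic form $$V_\text{eff}(\phi) \simeq V(\phi) + \sum_i \frac{(\pm)\, n_i}{64\pi^2}\, m_i^4(\phi)\left[\ln\frac{m_i^2(\phi)}{\mu^2} - c_i\right],$$ where the sum runs over particles with field-dependent masses $m_i(\phi)$, $n_i$ counts their degrees of freedom, the sign is $+$ for bosons and $-$ for fermions, $\mu$ is the renormalization scale, and $c_i$ is a scheme-dependent constant. One then asks whether $V_\text{eff}$ is still bounded below once these corrections are included.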

The trickiest part of deciding whether the Standard Model itself has a stable vacuum is QCD. We know the naive vacuum isn't stable, and lattice simulations tell us there is a vacuum containing a so-called chiral condensate of quarks. So in some sense your question about the sign of the pion mass could have gone the other way, and in fact it already has, because the chiral condensate formed and we now define pions as excitations about that condensate.

If you are allowed to assume that QCD with massless quarks has a stable vacuum, then it's straightforward to show that pions have positive $m^2$ once you account for quark masses. But actually showing that statement is difficult nonperturbative physics. I don't know how to do it, and I don't know if anybody knows.
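For what it's worth, the leading-order chiral perturbation theory version of that statement is the Gell-Mann-Oakes-Renner relation, $$f_\pi^2\, m_\pi^2 = -(m_u + m_d)\,\langle \bar q q \rangle + \mathcal{O}(m_q^2),$$ so $m_\pi^2 > 0$ follows once one takes as input that the chiral condensate $\langle \bar q q \rangle$ is negative; establishing that nonperturbatively is exactly the hard part.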

knzhou
  • Thanks for taking the time to write such a detailed answer! You say that if $V$ is bounded below, then the process of particle formation ends at the true vacuum. In the concept of the Dirac sea (which I guess is just an outdated heuristic?), does this result in an infinite positive energy, which we then have to subtract away somehow? –  Nov 11 '19 at 15:43
  • @BenCrowell I guess one can call the Dirac sea outdated, but it's just a special case of how we think about renormalization. In basically every nontrivial QFT, the vacuum energy density isn't equal to what we would naively expect (i.e. the Hamiltonian density evaluated at the classical vacuum). There are always corrections, which are formally infinite with an infinite cutoff and still large with a finite cutoff. But you can absorb this by adding a constant term to the Lagrangian, which has to have been there all along. – knzhou Nov 11 '19 at 18:21
  • @BenCrowell There's a different question which is, what happens if you start at a false vacuum and then transition to the true vacuum? In flat spacetime, energy is conserved, which means you end up with a big positive energy relative to the true vacuum, which in practice can manifest as radiation emitted or topological defects or whatever. In a cosmological context, this energy then starts to be redshifted away. – knzhou Nov 11 '19 at 18:23
  • Is it conceivable that under conditions like those before the Big Bang, the fields are unstable, but the resulting dynamics leads to conditions under which the fields are stable? – S. McGrew Nov 11 '19 at 18:26
  • @S.McGrew If by "big bang" you mean the moment when the universe became extremely hot, then not only is it conceivable, but you've actually just described inflation. The period of inflation occurs because some field is moving toward a stable vacuum, and during this process its potential energy drives accelerating expansion. – knzhou Nov 11 '19 at 18:29
  • In that case there must be a term in the field Lagrangian that becomes dominant after the initial unstable period, depending on particle density or some such thing. – S. McGrew Nov 11 '19 at 19:58
  • @S.McGrew Sure, I suppose that's true but a weird way to word it, because "depending on particle density" describes just about any Lagrangian term. For example, $\phi^2$ is an energy contribution that depends on particle density. – knzhou Nov 11 '19 at 20:28

Option 3 is the closest match, but it's a bit like saying "Nothing guarantees that spacetime has a Lorentzian signature." We normally only consider spacetimes that do, because so much else depends on it. It's a requirement, not a theorem.

Similarly, for relativistic QFT in flat spacetime, we normally only consider QFTs whose total energy has a finite lower bound. The Lorentz-symmetric statement of this condition is that the spectrum of the generators of spacetime translations is restricted to the future light-cone. This is called the spectrum condition. It's one of the basic conditions that we usually require, just like microcausality. Theories that don't satisfy these basic conditions are rejected as unphysical, because so many other things rely on them. For example, the spin-statistics theorem and the CPT theorem both rely on the spectrum condition.
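Stated explicitly (in mostly-minus signature): if $P^\mu$ are the generators of spacetime translations, the spectrum condition requires their joint spectrum to lie in the closed forward light-cone, $$\mathrm{spec}(P^\mu) \subseteq \{\, p^\mu : p^0 \ge 0,\ p_\mu p^\mu \ge 0 \,\},$$ so the energy $p^0$ is nonnegative in every Lorentz frame, with the vacuum (on which $P^\mu$ vanishes) at the bottom.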

That's for QFT in flat spacetime. For QFT in a generic curved spacetime, we lose translation symmetry, so there are no "generators of spacetime translations," and the spectrum condition becomes undefined. Candidate replacements have been proposed, like the microlocal spectrum condition, but as far as I know this is still an unsettled research topic. The goal, I suppose, is to find a condition that allows things like the spin-statistics theorem to be derived even in curved spacetime. (If I remember right, this has sort of already been done, but the approach that I'm vaguely remembering relies on the flat spacetime proof, and something about it didn't seem quite satisfying to me. If you're interested, I can try to find that paper and post a link.)

Returning to flat spacetime...

is this natural? Do we need fine-tuning?

Depends on what you mean. If we define the Hamiltonian (total energy operator) to be the generator of time-translations, then the constant term can be shifted by an arbitrary finite value with no observable effects. In that sense, there is no fine-tuning problem. But the real world includes gravity even if our favorite QFT doesn't, and gravity does care about that constant term in the Hamiltonian. In that sense, there is a fine-tuning problem, also known as the cosmological constant problem: if we define our QFT with a short-distance cutoff, then the constant term in the total energy (or the cosmological constant) is extremely sensitive to the precise value of the cutoff, even though the cutoff is artificial. It's not a "real" problem in a QFT that doesn't include gravity anyway, but it's a symptom that QFT and gravity probably don't get along with each other in the way we might have naively expected.
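A rough version of that sensitivity statement: with a momentum cutoff $\Lambda$, the zero-point contribution to the vacuum energy density scales like $$\rho_\text{vac} \sim \frac{\Lambda^4}{16\pi^2},$$ so a cutoff anywhere near the Planck scale overshoots the observed value, $\rho_\text{obs} \sim (10^{-3}\ \text{eV})^4$, by something like 120 orders of magnitude.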

  • Thanks, this is very helpful, and complements knzhou's answer. Too bad I can only accept one. –  Nov 11 '19 at 15:43

Entropy is, operationally, roughly proportional to the logarithm of the number of accessible microstates in any given equilibrium macrostate. For positive-energy systems, the number of allowed microstates grows quickly with increasing system energy. If one expects a negative-energy regime to be a mirrored version of the positive-energy branch, the conclusion is that the number of microstates would grow quickly with decreasing system energy, and that the system would be thermodynamically unstable.
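One way to phrase this instability: with $S(E) = k_B \ln \Omega(E)$, if the spectrum extends to arbitrarily negative $E$ (with $\Omega(E)$ not vanishing there), the canonical partition function $$Z(\beta) = \sum_E \Omega(E)\, e^{-\beta E}$$ diverges for every $\beta > 0$, so no equilibrium state exists; the system can always lower its energy further while dumping heat into its surroundings.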

If one removes the assumption of a large growth of microstates with decreasing energy, then one might have to consider that some theories with negative-energy states can have stable or metastable vacua. In such theories, a transition to a lower-than-typical vacuum (or at least what we citizens of galaxies call a typical vacuum) could be enhanced only far away from any other sources of matter or potentials, which might be a viable mechanism for dark energy.

lurscher