
The way I understand spontaneous symmetry breaking in thermodynamic systems is that the symmetry is actually explicitly broken by an infinitesimally small field. The system chooses one of the non-symmetric states effectively at random because it is infinitely sensitive to the smallest of perturbations. For example, in the Ising model the magnetisation depends discontinuously on the external field,

$$m(h\rightarrow 0^+) = m_0 \neq m(h\rightarrow 0^-) = - m_0 \, .$$

At the symmetric point $h = 0$ there are two possible values of the magnetisation, and the one that is physically realised depends on the sign of an infinitesimally small (and unmeasurable) external field. It appears random.
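The mechanism above can be made concrete with a toy mean-field Ising calculation (a sketch, not the exact 2D Ising solution; the coupling $J$ and the iteration scheme are illustrative choices): starting from the symmetric state $m = 0$, an arbitrarily small field tips the system onto one of the two branches $\pm m_0$.

```python
import math

def mean_field_magnetisation(T, h, J=1.0, tol=1e-12, max_iter=10000):
    """Solve the mean-field self-consistency m = tanh((J*m + h)/T) by
    fixed-point iteration, starting from the symmetric state m = 0."""
    m = 0.0
    for _ in range(max_iter):
        m_new = math.tanh((J * m + h) / T)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# Below the mean-field critical temperature (T_c = J), the limit h -> 0
# depends on the sign of h: the two branches land on +m0 and -m0.
T = 0.5
m_plus = mean_field_magnetisation(T, +1e-8)
m_minus = mean_field_magnetisation(T, -1e-8)
print(m_plus, m_minus)  # roughly +0.957 and -0.957
```

The infinitesimal field never shows up in the magnitude $m_0$, only in which sign gets selected, which is exactly the discontinuity $m(h \to 0^\pm) = \pm m_0$ quoted above.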

What about cosmology? Everybody talks about electroweak symmetry breaking during the early stages of the big bang, for example. What is meant by that? The big bang (and the whole universe) is a non-equilibrium system. I don't know how to relate it to thermodynamics. Moreover, in cosmology, there can be no external field to infinitesimally break the symmetry. How can the symmetry really be broken at random? Is the universe not deterministic?

1 Answer

The early Universe is largely in a state of thermal equilibrium: after inflation the temperature is high enough for all degrees of freedom to reach equilibrium. This can be seen from the Boltzmann equation, which describes how number densities change in the expanding Universe. A particle species is in equilibrium if $\Gamma \gg H$, since then the timescale for interactions ($1/\Gamma \sim 1/T$) is shorter than the timescale for expansion ($1/H \sim m_p/T^2$, where $m_p \gg T$ is the Planck mass). This is a good estimate of thermalisation.
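Plugging in the scalings above gives $\Gamma/H \sim m_p/T$, so equilibrium holds comfortably at any temperature far below the Planck scale. A quick order-of-magnitude check (natural units in GeV; the $\Gamma \sim T$ estimate drops coupling-constant factors):

```python
# Rough check that Gamma >> H in the hot early Universe.
# Scalings used (order of magnitude, natural units):
#   interaction rate  Gamma ~ T         (relativistic scattering, couplings dropped)
#   Hubble rate       H     ~ T**2/m_p
# hence Gamma / H ~ m_p / T.

M_PLANCK = 1.22e19  # GeV

def gamma_over_hubble(T):
    """Order-of-magnitude ratio of interaction rate to expansion rate."""
    return M_PLANCK / T

# At the electroweak scale (~100 GeV) interactions vastly outpace expansion,
# so thermal equilibrium is an excellent approximation there.
print(gamma_over_hubble(100.0))  # ~1e17
```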

Symmetry breaking in the early Universe occurs through finite-temperature effects. To study this one must compute the finite-temperature effective action for the scalar field, which is a thermal field theory calculation. A scalar field receives contributions to its effective potential from the relativistic degrees of freedom it couples to. Since the Higgs field couples to fermions and gauge bosons that are light and relativistic in the early Universe, these couplings modify the potential.

Typically the potential has two terms:

$$V = V(T=0) + V(T) \, .$$

At leading order $V(T)\sim M^2 T^2$, where $T$ is the temperature of the thermal bath and $M$ is the mass of the fields the Higgs field couples to.

The key point is that at large $T$ the thermal corrections dominate and the potential has a single minimum at the origin. As the temperature drops, the thermal corrections fall off and new minima appear. Eventually the Higgs field is free to evolve away from the origin towards the new minima, breaking the symmetry in the process.
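This picture can be sketched with a toy potential for a single real scalar (hypothetical parameter values, natural units): $V(\phi, T) = -\mu^2\phi^2 + \lambda\phi^4 + cT^2\phi^2$, where $cT^2\phi^2$ stands in for the leading thermal correction $\sim M^2T^2$ above. Minimising shows the origin is the only minimum above $T_c = \mu/\sqrt{c}$, while symmetry-breaking minima $\phi = \pm v(T)$ open up below it:

```python
import math

# Toy finite-temperature potential (illustrative parameter values):
#   V(phi, T) = -mu2*phi**2 + lam*phi**4 + c*T**2*phi**2
mu2, lam, c = 1.0, 0.25, 0.5

def V(phi, T):
    return -mu2 * phi**2 + lam * phi**4 + c * T**2 * phi**2

def minimum(T):
    """Location of the (non-negative) minimum of V at temperature T."""
    m2_eff = -mu2 + c * T**2        # effective mass-squared at the origin
    if m2_eff >= 0:
        return 0.0                  # symmetric phase: single minimum at origin
    return math.sqrt(-m2_eff / (2.0 * lam))  # broken phase: phi = v(T) != 0

T_c = math.sqrt(mu2 / c)            # critical temperature, sqrt(2) here
print(minimum(2.0))  # 0.0 -- high T, symmetric phase
print(minimum(0.0))  # sqrt(mu2/(2*lam)) ~ 1.414 -- broken phase
```

As $T$ falls through $T_c$ the sign of the effective mass term at the origin flips, which is precisely the "new minima appear" statement above.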

This process is more or less the same for any phase transition in the early Universe. Depending on the particle content of the theory, the phase transition may be first or second order. Depending on the couplings, dissipative effects may be important and can lead to interesting phenomena.

I hope this answers your question, or at least helps.

  • Thank you for your answer. You did not address the last part of my question though. The point that troubles me is that if a symmetry is not (at least infinitesimally) broken in the microscopic Lagrangian, it will not be broken by the time evolution. I can only see two options for a symmetry to be broken: the symmetry is explicitly broken either by the Lagrangian or by the initial conditions. In thermodynamics that is no problem because something external to the system can break the symmetry. This is obviously not an option in cosmology. How can a symmetry break in cosmology? – Steven Mathey Jan 18 '16 at 00:18
  • Ah sorry. There are quantum and thermal fluctuations of the scalar field though which perturb the system from the symmetric point. – Sam Bartrum Jan 19 '16 at 08:38