
I know that when the universe began it was incredibly hot. Ever since, it has been cooling, and nowadays the average temperature of the universe is quite close to 0 K.

Is this a consequence of having the same energy spread over a larger space, or is there something else going on?

pho
  • 4,605
  • Good question. I suggest you look at the Horizon problem that Charles Misner raised concerning the uniformity of our Universe's temperature and other properties: http://en.wikipedia.org/wiki/Horizon_problem –  Aug 28 '14 at 23:28
  • I disagree that this is a duplicate. There is a difference between temperature and energy, and neither of the answers to the "linked" question discusses the temperature of the universe. – ProfRob Aug 29 '14 at 19:41
  • Whereas this question is much more similar and has useful answers http://physics.stackexchange.com/questions/55885/why-the-temperature-is-getting-lower-when-the-universe-is-expanding – ProfRob Aug 29 '14 at 19:53

1 Answer


There are (at least) two things going on. Perhaps the easiest place to start is with the temperature estimated from the radiation in the universe, which is possibly what you are referring to when you say the temperature is approaching 0 K.

The radiation in the universe takes the form of thermal blackbody radiation. It is emitted by material in thermal equilibrium with its surroundings and has a characteristic temperature that is inversely proportional to the peak wavelength of its continuous spectrum; this relationship is known as Wien's displacement law.
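Quantitatively, Wien's displacement law reads (using the standard value of the displacement constant)
$$\lambda_{\rm peak} = \frac{b}{T}, \qquad b \approx 2.90\times 10^{-3}\ \text{m K},$$
so a $3000\,$K blackbody peaks near $1\,\mu$m, in the near-infrared, while a $3\,$K blackbody peaks near $1\,$mm, in the microwave band.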

Because the universe is expanding (and by this we mean that space itself is expanding), the wavelength of radiation that was around early in the universe gets stretched by the same expansion factor. The stretch factor between when the cosmic background radiation was emitted, by material at around 3000 K, and the present day is about 1000. So what was visible/IR radiation has been stretched into microwaves. As a result, the temperature of this radiation has also been reduced by a factor of 1000, to the ~3 K we see today.
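In symbols: since $\lambda_{\rm peak} \propto a$ (the scale factor) and, by Wien's law, $T \propto 1/\lambda_{\rm peak}$,
$$T \propto \frac{1}{a} \quad\Rightarrow\quad T_{\rm now} \approx \frac{3000\ \text{K}}{1000} \approx 3\ \text{K}.$$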

For a (non-relativistic) gas of particles with mass the situation is a bit trickier. Yes, the gas cools because it occupies more volume, but $T$ is not inversely proportional to the cube of the scale factor. The behaviour is that of an adiabatically expanding ideal gas where $TV^{\gamma-1}$ is constant. For $\gamma=5/3$ and $V$ proportional to the cube of the scale factor, this leads to $T$ being inversely proportional to the square of the scale factor.
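Explicitly, with $\gamma = 5/3$ and $V \propto a^3$ (where $a$ is the scale factor):
$$TV^{\gamma-1} = \text{const} \;\Rightarrow\; T\,(a^3)^{2/3} = T\,a^2 = \text{const} \;\Rightarrow\; T \propto a^{-2}.$$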

A more hand-waving explanation follows (with a toy numerical check sketched below): Gas temperature is proportional to the square of the velocity dispersion in a gas. Imagine the gas in a box that is expanding out from an explosion and has a size $\Delta x$ after travelling a mean distance $x$ from the explosion. To still be in the box, gas particles must have velocities that differ fractionally by less than $\sim \Delta x / x$. As the size of the box can be anything, this means that the velocity dispersion $\Delta v \propto 1/x$ and so $T \propto 1/x^2$. Hence in our big bang universe, an expanding gas in thermodynamic equilibrium has a temperature that scales as the inverse square of the scale factor.
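Here is a minimal sketch of that check in Python/NumPy (the setup, names, and units are illustrative assumptions, not standard code): particles free-stream from an explosion at the origin, and we measure the velocity dispersion of those found inside a fixed-width box that rides outwards with the flow.

```python
import numpy as np

# Toy check of the hand-waving argument above.  Not a cosmological
# simulation: particles free-stream from an explosion at the origin,
# and we track those that fall inside a fixed-width box whose centre
# moves outwards with the mean flow.

rng = np.random.default_rng(0)
v = rng.uniform(0.5, 1.5, size=1_000_000)  # particle speeds (arbitrary units)
dx = 0.01                                  # fixed box width (arbitrary units)
v_flow = 1.0                               # mean speed of the flow

for t in (10.0, 20.0, 40.0, 80.0):
    x = v * t                              # free-streaming positions
    centre = v_flow * t                    # box centre rides out with the flow
    in_box = np.abs(x - centre) < dx / 2
    dv = v[in_box].std()                   # velocity dispersion inside the box
    print(f"x = {centre:6.1f}   dv = {dv:.3e}   dv * x = {dv * centre:.4f}")
```

The printed product $\Delta v \, x$ stays (statistically) constant, i.e. $\Delta v \propto 1/x$, and hence $T \propto \Delta v^2 \propto 1/x^2$, as claimed.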

Of course most of the gas in the Universe is not in thermodynamic equilibrium at the low temperature implied by this relationship, because gravity and nuclear reactions make the universe a much more interesting place! The present-day inter-galactic medium is at more like $10^6\,$K, heated by the winds and radiation from stars and galaxies.

ProfRob
  • 130,455