4

If a star is 1 billion light years away, it means that the light we see from it was emitted 1 billion years ago.

How does this light not undergo a frequency change or get damped in spite of the colossal distance of travel? I am asking this because the Doppler effect is used to determine star distances and is one of the bases for the Big Bang theory.

Also, it travels in a vacuum. According to Maxwell/Gauss, a charged particle is responsible for an electric field whose change creates a magnetic field; in turn, its change causes an electric field, a process that repeats indefinitely. Even if we assume that the wave propagates through vacuum using the energy originally produced by a charged particle elsewhere, do the waves not damp significantly?

In the radio waves transmitted for earthly communications, the power of the source is directly proportional to the distance the wave can travel. Does this not contradict the fact that starlight reaches us from such humongous distances?

Aadishri
  • 225

4 Answers

5

The light from distant galaxies does undergo a frequency change. It is red-shifted, and the amount of red shift is used to work out how fast the distant galaxy is receding and therefore how far away it is. However, this is not a damping effect. The light red-shifts because the spacetime between us and the distant galaxy is expanding, so although the light's energy is conserved it is spread over a larger distance.

You ask why light isn't damped, but why should it be? Since energy is conserved the light wave can't lose energy unless there is some mechanism to carry the energy away. For a light wave travelling in vacuum there is simply no mechanism by which it can lose energy, so it doesn't.

In your last paragraph I think you're getting mixed up with a different effect. When a terrestrial radio station broadcasts, it sends the radio waves out as a half sphere (the half above the ground). As you get further away from the transmitter, the field strength of the radio waves decreases as the inverse square of the distance, because the energy of the transmitted wave is spread over a larger area. However, the total energy of the wave is conserved. This happens with distant galaxies as well: the more distant a galaxy is, the fainter it appears. This isn't due to damping; it's just the inverse-square dependence of the field strength.
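The inverse-square dilution described above can be sketched in a few lines. The 100 W source and the distances are invented, purely illustrative numbers:

```python
import math

def power_density(source_power_w, distance_m):
    """Power per unit area (W/m^2) at a given distance from an isotropic
    source: the total power is spread over a sphere of area 4*pi*r^2."""
    return source_power_w / (4 * math.pi * distance_m ** 2)

# Doubling the distance quarters the received power density, but the
# total power crossing the full sphere is unchanged -- no damping.
near = power_density(100.0, 1000.0)   # hypothetical 100 W source at 1 km
far = power_density(100.0, 2000.0)    # same source at 2 km
print(near / far)  # about 4.0
```

Note that multiplying the density back by the sphere's area $4\pi r^2$ recovers the full 100 W at any distance, which is the energy-conservation point made above.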

John Rennie
  • 355,118
  • Thanks! So I understand that the brightness decreases NOT due to lower energy (amplitude) and frequency, but due to the fraction of the waves that actually reaches us. – Aadishri Feb 26 '13 at 08:50
  • Yes. The fraction that we see of the total energy emitted decreases as $1/r^2$. – John Rennie Feb 26 '13 at 09:06
3

First, we don't observe individual stars that are 1 billion light years away (i.e. starlight emitted 1 billion years ago). The individual stars we observe are either supernovae, which may be outside our galaxy but are bright enough to be visible only for a short moment of time; or stars in our galaxy, the Milky Way, which are at most 200,000 light years away from us.

The more distant objects we observe are whole galaxies – different from our galaxy. Quasars – visually "quasistellar objects" which are actually active galactic nuclei – are usually even further than that.

An object that is 1 billion light years away does get redshifted by Hubble's law. Hubble's constant is about 22 km/s per million light years, so 1,000 million light years gives such objects speeds of the order of 22,000 km/s, or 1/14 of the speed of light. This is surely a significant, observable redshift. The frequencies get lowered by a factor of $15/14$, too. It's not "quite" a coincidence that the number 14 is the age of the Universe in billions of years, although you can't use this formula if you want great accuracy.
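The arithmetic in this paragraph can be checked directly. This is a rough, non-relativistic estimate using the rounded constants quoted above:

```python
# Back-of-envelope numbers from the paragraph above (all rounded).
H = 22.0               # Hubble's constant: km/s per million light years
c = 300_000.0          # speed of light in km/s (rounded)

distance_mly = 1000.0  # 1 billion light years = 1000 million light years
v = H * distance_mly   # recession speed: 22,000 km/s
beta = v / c           # roughly 1/14 of the speed of light

# To first order the observed frequencies are lowered by a factor of
# 1 + v/c, i.e. about 15/14 at this distance.
factor = 1 + beta
print(v, factor)
```

At larger distances the first-order formula breaks down and the full relativistic (and cosmological) redshift must be used, which is the "great accuracy" caveat above.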

Electromagnetic waves in the vacuum proceed indefinitely and they don't lose any energy whatsoever – except for the loss of the energy (and frequency) of photons described in the previous paragraph and (one more multiplicative factor of the same size, $14/15$) the decrease of the number of photons we collect each second.

The energy of electromagnetic waves in the vacuum can't be lost, due to energy conservation: there's just nothing in the vacuum where the energy could go, and it's easy to see that precise, undamped waves are solutions to Maxwell's equations of electromagnetism. An even more obvious point is that the frequency can't change at all (except for the Doppler shift due to the relative speed, either from Hubble's expansion or some extra velocity, or due to different gravitational potentials in general relativity). Why can't it change at all? Because if the electromagnetic waves are created by some process (acceleration of charges) that has some frequency, the "input" perturbations are periodic functions of time with the period $\Delta t = 1/f$, which implies that all of their implications – such as waves measured a billion years later – must be periodic with the same period, too. The frequency just can't get changed.
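A toy illustration of this periodicity argument (the frequency and delay are arbitrary, invented values): a wave received far away is just a time-shifted copy of the source oscillation, and a time shift changes the phase, never the period.

```python
import math

f = 5.0        # hypothetical source frequency, Hz
T = 1.0 / f    # the period of the driving charges

def received(t, travel_time):
    """Field far from the source: the same periodic function of time,
    merely delayed by the light-travel time."""
    return math.sin(2 * math.pi * f * (t - travel_time))

# However long the travel time, the received signal still repeats with
# the source period T: the frequency cannot change in transit.
travel = 1234.567  # arbitrary delay in seconds
same_period = all(
    math.isclose(received(t, travel), received(t + T, travel), abs_tol=1e-9)
    for t in (0.0, 0.3, 0.7)
)
print(same_period)
```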

The total energy from a star or a similar localized source gets distributed over a surface $4\pi R^2$, the surface of a sphere, where $R$ is the distance. So indeed, how much light from a star we may see decreases as $1/R^2$. However, the solid angle subtended by the star (its apparent size) also decreases as $1/R^2$, which means that the density of energy (light) per unit solid angle is actually independent of the distance $R$. Stars that are further away look "smaller" but the "intrinsic color" of the dot doesn't depend on the distance.
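The cancellation described here can be verified numerically. Units and values are arbitrary, and the sketch ignores redshift and extinction:

```python
import math

def surface_brightness(luminosity, star_radius, distance):
    """Received flux per unit solid angle. The flux falls as 1/R^2, but so
    does the solid angle of the star's disc, so the ratio is constant."""
    flux = luminosity / (4 * math.pi * distance ** 2)
    solid_angle = math.pi * (star_radius / distance) ** 2  # small-angle approx.
    return flux / solid_angle

# Move the same (made-up) star ten times further away: it looks smaller,
# but its brightness per unit solid angle is unchanged.
near = surface_brightness(1.0, 1.0, 1.0e6)
far = surface_brightness(1.0, 1.0, 1.0e7)
print(math.isclose(near, far))
```

Algebraically, the distance cancels: flux per solid angle is $\frac{L/4\pi R^2}{\pi r^2/R^2} = \frac{L}{4\pi^2 r^2}$, independent of $R$.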

Luboš Motl
  • 179,018
0

A good question for anyone who has thought about the big bang, red shift, Hubble etc. There are valid questions to be raised concerning Hubble's red shift leading to the big bang theory.

Firstly, space is commonly regarded as a vacuum, but this is not the case. Space contains a variety of matter ranging from atoms to molecules, gases and solids, and myriad pathways of curved space (if you believe in that) around stars and planets, or gravity as a particle. Next there is dark matter and its unknown effects. Next is the untold complexity of multidirectional electromagnetic radiation we can detect, and whether or not there is mutual interference.

Then consider just the light we perceive and realise that it consists of colours of different wavelength, each supposedly arriving simultaneously after travelling light years of distance! If we can see light that has travelled that far, it must have been emitted from a big star (considering the radial dilution effect) and therefore felt some drag on its exit by gravity (we know that it takes internal light a long time to exit our little star, the sun). The extreme case is the black hole, where the exit of light is said to be prevented by gravity.

It is miraculous that a photon arrives here purely on the energy of its emission light years away and without 'friction' to impede progress (by friction I mean all of the above). If the Hubble effect were an illusion, then there is no big bang theory. Look up topics such as 'tired light' and you will find you are not alone in thinking freely about this question.

Doug
  • 21
-1

I think the answer lies at the heart of quantum behavior. A photon cannot slowly dissipate energy any more than an electron "orbiting" the nucleus of an atom can slowly radiate energy away. Energy is gained or lost in chunks. Quantum behavior preserves the photon until it interacts with something.