
Suppose there are two objects in the universe: Earth, with a gravitational acceleration of $g = 9.8\ \mathrm{m/s^2}$, and a typical electron.

The electron is dropped from a certain height, say 1000m above the Earth's surface.

The initial energy of the electron is only the potential energy, $mgh = m_eg\times1000$, where $m_e$ is the mass of the electron.

As the electron falls towards the Earth, it is accelerated and therefore radiates energy. Will this cause the electron to slow down, so that it takes longer to hit the ground than the time predicted by $s = 0.5at^2$, because of the energy lost through radiation?

If so, what acceleration will the electron actually fall at? How long will it take to hit the ground?

Kenshin

2 Answers


Assuming non-relativistic velocities, the power radiated by a charge accelerating at constant acceleration $a$ is given by the Larmor formula:

$$P = \frac{e^2 a^2}{6\pi \epsilon_0 c^3} $$

To do the calculation properly is surprisingly complicated, but it's easy to show that the effect of the radiation on the electron's fall is negligible. If the electron falls a distance $h$, the time it takes is given by:

$$ h = \frac{1}{2}gt^2 $$

so:

$$ t = \sqrt{\frac{2h}{g}} $$

If we assume the electron is accelerating at a constant rate of $g$, the total energy radiated is just power times time or:

$$ E_{rad} = \frac{e^2 g^2}{6\pi \epsilon_0 c^3} \sqrt{\frac{2h}{g}} $$

In your question $h$ is 1000m, so:

$$ E_{rad} = 7.83 \times 10^{-51}J $$

The potential energy change is, as you say, just $mgh$:

$$ E_{pot} = m_e g h = 8.94 \times 10^{-27} J $$

So the ratio of the radiated energy to the potential energy is about $10^{-24}$, and therefore the effect of the radiation on the electron's fall is entirely negligible.
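These numbers are straightforward to check. Here's a quick sketch plugging CODATA values into the formulas above (the last digits depend slightly on the values used for $g$ and the constants):

```python
import math

# Physical constants (SI units, approximate CODATA values)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
c = 2.99792458e8         # speed of light, m/s
m_e = 9.1093837015e-31   # electron mass, kg
g = 9.8                  # gravitational acceleration, m/s^2
h = 1000.0               # fall height, m

# Larmor power for constant acceleration g
P = e**2 * g**2 / (6 * math.pi * eps0 * c**3)

# Naive fall time from h = (1/2) g t^2
t = math.sqrt(2 * h / g)

E_rad = P * t            # total energy radiated during the fall
E_pot = m_e * g * h      # potential energy released

print(f"P     = {P:.2e} W")      # ~5.5e-52 W
print(f"t     = {t:.1f} s")      # ~14.3 s
print(f"E_rad = {E_rad:.2e} J")  # ~7.8e-51 J
print(f"E_pot = {E_pot:.2e} J")  # ~8.9e-27 J
print(f"ratio = {E_rad/E_pot:.1e}")  # ~1e-24
```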

Response to comment:

The power radiated by the electron produces a force that opposes the acceleration due to gravity. Assume we can ignore the deviations from a constant acceleration $g$; then in a small time $dt$ the energy radiated is $P\,dt$. Energy is force times distance ($dx$), so to get the force we divide by the distance:

$$ F = P\frac{dt}{dx} = \frac{P}{v} = \frac{P}{\sqrt{2gh}} $$

using $v^2 = 2as$, with $s = h$ the distance fallen so far. The acceleration produced by this force is just $F/m_e$, so the net acceleration on the electron is:

$$ a_{net} = g - \frac{P}{m_e \sqrt{2gh}} $$

So the electron does accelerate slightly more slowly than $g$, but the difference between the acceleration and $g$ falls off as the inverse square root of the distance fallen, so it gets increasingly negligible the further the electron falls.

You've probably spotted that the above equation says the force should be infinite at the moment you release the particle. That's because as you approach the moment of release it's no longer safe to make the approximation that you can ignore the change in the acceleration due to radiation.

John Rennie
  • Thanks John. I can see from your calculations that the energy radiated is minimal compared to the kinetic energy gained from the fall. Thinking more conceptually about the situation, would it be correct to say the electron's fall velocity decreases due to the radiating charge, or is it better to say the velocity doesn't change, but the initial potential isn't mgh, but mgh + energy that will be radiated? – Kenshin Feb 01 '13 at 11:22
  • I understand any effect will be negligible now, but I'm interested in how these effects would influence the electrons movements conceptually, even if the effect is negligible in practice. – Kenshin Feb 01 '13 at 11:23
  • I've updated my answer. I have to dash out now, but if time permits I'll have a go at calculating the equations of motion properly. It's hard to do because you can no longer assume the power is constant i.e. the acceleration of the electron is constant. – John Rennie Feb 01 '13 at 11:48
  • Does this answer not contradict the article here: https://en.wikipedia.org/wiki/Paradox_of_radiation_of_charged_particles_in_a_gravitational_field#Resolution_by_Rohrlich, "Fritz Rohrlich (1965),[6] who shows that a charged particle and a neutral particle fall equally fast in a gravitational field" – Kenshin May 12 '19 at 14:55

A falling electron is basically a minuscule electric current, and will create a circular magnetic field, which can be calculated using Ampère's Law.
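For a sense of scale, the field of a single slowly moving charge can be estimated from the point-charge (non-relativistic) form of the Biot–Savart law, $B = \mu_0 q v \sin\theta / (4\pi r^2)$. A sketch, with illustrative numbers only, using the $\sqrt{2gh} = 140$ m/s speed reached after a 1000 m fall:

```python
import math

mu0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A
q = 1.602176634e-19       # elementary charge, C
v = 140.0                 # speed after a 1000 m free fall, m/s
r = 1.0                   # distance from the charge, m (taking sin(theta) = 1)

# Biot-Savart field of a point charge in the non-relativistic limit
B = mu0 * q * v / (4 * math.pi * r**2)
print(f"B = {B:.2e} T")   # ~2.2e-24 T
```

Even a metre away the field is of order $10^{-24}$ T, which underlines how tiny these quantities are.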

As the electron accelerates, this magnetic field also grows, generating a perpendicular electric field, which can be calculated from Maxwell's Equations.

At this point you would need to use Quantum Theory, since all three quantities are so minuscule. Quantum Theory states that EM radiation is composed of photons, and a free electron cannot emit a single photon, because energy and momentum cannot both be conserved in such a process.