
The spreading of electromagnetic waves due to diffraction is easily explained with Huygens' principle (and this is usually how it is explained in introductory courses). But Huygens' principle is not physically correct.

What is it in the electric and magnetic fields that makes light diffract? In other words, why will a perfectly collimated beam of light spread out?

Qmechanic
  • 201,751

2 Answers


This can be seen as a consequence of Faraday's Law of Induction. See http://puhep1.princeton.edu/~kirkmcd/examples/diffraction.pdf for more details.


Getting down to it, diffraction is a mathematical property of the wave equation. Any wave-like phenomenon will experience it.

$$\frac{\partial ^2}{\partial x ^2}U + \frac{\partial ^2}{\partial y ^2}U + \frac{\partial ^2}{\partial z ^2}U = \frac{1}{c^2} \frac{\partial ^2}{\partial t ^2}U$$

The easiest way to see why is to consider the Fourier transform property $\frac{\partial ^2}{\partial x ^2} \rightarrow -k_x^2$. Since $k_x$ determines the degree of propagation in the $x$ direction, this lends itself to the physical interpretation that field variation in the $x$ direction leads to propagation in the $x$ direction as well. For something like a laser beam to be self-contained, the field should go to zero far from the beam but be nonzero within it. Hence there is variation in the transverse profile, and therefore propagation in the transverse direction as well.
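To make this concrete, here is a minimal numerical sketch (not part of the original answer) using the angular-spectrum method: the transverse Fourier components $k_x$ of a confined beam necessarily carry sideways propagation, so an initially collimated Gaussian beam spreads. The wavelength, waist, and grid parameters below are illustrative choices.

```python
import numpy as np

wavelength = 633e-9            # illustrative wavelength (m)
k = 2 * np.pi / wavelength     # wavenumber
w0 = 0.2e-3                    # initial 1/e field radius of the beam (m)
z = 2.0                        # propagation distance (m)

# 1-D transverse grid
N, L = 2048, 20e-3
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

# Collimated Gaussian profile at z = 0 (flat phase front)
U0 = np.exp(-(x / w0) ** 2)

# Transverse spatial frequencies k_x
kx = 2 * np.pi * np.fft.fftfreq(N, d=dx)

# Angular-spectrum propagator: each k_x component advances with
# k_z = sqrt(k^2 - k_x^2); confinement in x forces nonzero k_x,
# which is exactly the "propagation in the x direction" described above.
kz = np.sqrt(np.maximum(k ** 2 - kx ** 2, 0.0).astype(complex))
Uz = np.fft.ifft(np.fft.fft(U0) * np.exp(1j * kz * z))

def rms_width(u, x):
    """RMS width of the intensity |u|^2 as a simple measure of beam size."""
    I = np.abs(u) ** 2
    return np.sqrt(np.sum(I * x ** 2) / np.sum(I))

print(f"RMS width at z = 0:   {rms_width(U0, x) * 1e3:.3f} mm")
print(f"RMS width at z = {z} m: {rms_width(Uz, x) * 1e3:.3f} mm")
```

Running this shows the beam noticeably wider after 2 m, even though it started perfectly collimated; nothing was needed beyond the wave equation and the finite transverse extent of the field.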

This also explains why waves diffract at barriers, and why waves that are more tightly confined diffract more. With a bit of manipulation, it's also easy to show that lower frequencies diffract more than higher frequencies.
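As a quick illustrative check of that scaling (using the standard Gaussian-beam result, which is not derived in the answer itself), the far-field divergence half-angle is roughly $\theta \approx \lambda / (\pi w_0)$, so longer wavelengths (lower frequencies) and tighter confinement both give more spreading:

```python
import numpy as np

# Divergence half-angle theta ~ lambda / (pi * w0) for a Gaussian beam.
for wavelength in (1064e-9, 532e-9):          # lower vs. higher frequency
    for w0 in (0.1e-3, 1.0e-3):               # tightly vs. loosely confined waist
        theta = wavelength / (np.pi * w0)     # divergence half-angle (rad)
        print(f"lambda = {wavelength * 1e9:6.0f} nm, w0 = {w0 * 1e3:4.1f} mm "
              f"-> divergence ~ {theta * 1e3:.2f} mrad")
```

Doubling the wavelength doubles the spread, and shrinking the waist by a factor of ten increases it tenfold.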

David
  • 2,657