
Why must the relation $$\frac{s}{S}<\frac{\lambda}{d}$$ hold for the interference pattern to be visible?

Here $s$ is the width of the source slit, $S$ is the distance between the source slit and the double slit, $\lambda$ is the wavelength of the light, and $d$ is the separation of the two slits.

Why is the interference pattern not seen when the ratio $s/S$ is equal to, or just greater than, $\lambda/d$?

Qmechanic
Sikander

2 Answers


Answer

The interference pattern is formed by beams diffracted at the double slit. When your condition is not met, beams of different diffraction orders superpose, and the interference pattern becomes blurred and disappears.

Explanation

Angular spacing of the fringes:

$$\tan\theta_{fringes} = \frac{\lambda}{d}$$

Angular spacing between two light beams (call them beam A and beam B) coming from the edges of the source slit and entering the double slit:

$$\tan\theta_{beams} = \frac{s}{S}$$

While $\theta_{beams}<\theta_{fringes}$, you can distinguish the zero- and first-order diffraction beams, and the interference pattern is visible.

As the two angles approach $\theta_{beams}=\theta_{fringes}$, the first-order diffraction of beam A superposes on beam B, and the first-order diffraction of beam B superposes on beam A. The beams become indistinguishable, and so does the interference pattern.

When $\theta_{beams}>\theta_{fringes}$, different diffraction orders superpose and no interference pattern is observed.
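The angle comparison above can be checked with a few lines of Python; the numerical values below are hypothetical, chosen only to illustrate a case where the condition holds:

```python
# Hypothetical example values (not from the original post):
wavelength = 600e-9   # wavelength of the light, m
d = 0.25e-3           # separation of the double slits, m
S = 0.50              # source slit to double slit distance, m
s = 0.05e-3           # width of the source slit, m

# Small-angle approximation: tan(theta) ~ theta
theta_fringes = wavelength / d   # angular fringe spacing
theta_beams = s / S              # angular spread of beams from the source slit

print(f"theta_fringes = {theta_fringes:.2e} rad")
print(f"theta_beams   = {theta_beams:.2e} rad")
print("fringes visible" if theta_beams < theta_fringes else "fringes blurred")
```

With these numbers $\theta_{beams}$ is well below $\theta_{fringes}$, so the fringes are visible.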

Nordik

Let $L$ be the distance from the double slit to a screen, $S$ be the distance from a source slit to a double slit, and $s$ be the width of the source slit.

The interference pattern produced by a source point is like the cross section of a Fresnel diffraction pattern: the fringes get closer together as the distance from the center increases. The whole pattern scales up linearly as the ratio $L/S$ of the distance from double slit to screen and source slit to double slit increases. Also, if the source point is moved laterally by a distance $x$, the interference pattern moves laterally (in the opposite direction) by a distance $X$ where $$X = x L/S.$$ If two such interference patterns are superimposed but offset by a distance $X/2$, the result will be a pattern in which the fringes separated by a distance $X$ are completely blurred. The fringes whose separation is larger will be blurred slightly, but will still be visible.
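The shift relation $X = xL/S$ and the half-period cancellation described above can be verified numerically. The values here are hypothetical, and the fringe profile is idealized as a pure cosine:

```python
import math

# Hypothetical geometry, only to illustrate:
L = 2.0        # double slit to screen, m
S = 0.5        # source slit to double slit, m
x = 0.1e-3     # lateral shift of the source point, m

X = x * L / S  # lateral shift of the pattern on the screen

# Two identical cosine fringe patterns of period X, offset by X/2,
# cancel at every screen position y:
for y in (0.0, 1e-4, 2.5e-4, 7.0e-4):
    total = math.cos(2 * math.pi * y / X) + math.cos(2 * math.pi * (y - X / 2) / X)
    assert abs(total) < 1e-9
print("period-X fringes are completely washed out by an X/2 offset")
```

The cancellation is exact because shifting a cosine by half its period flips its sign everywhere.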

If the source is a slit of width $s$, then it is effectively a continuum of point sources across the slit. It includes points with separations ranging from $0$ to $s$, so all features in the resulting diffraction pattern corresponding to fringe separations smaller than $X$ will be blurred severely.

If $X/2$ is equal to or greater than the spacing between the central fringes, then even the central fringes are blurred. So, if the source slit is too wide, there is no fringe visibility at all.
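The blurring by a slit of finite width can be sketched numerically: averaging the cosine fringe term over the source slit gives the fringe visibility, which falls to zero exactly when $s/S = \lambda/d$, matching the condition in the question. All numbers below are hypothetical:

```python
import math

def visibility(s, S, wavelength, d, n=20000):
    """Fringe visibility for a uniform source slit of width s,
    estimated by averaging the shifted cosine fringe term over
    the slit (midpoint-rule numerical integration)."""
    # Phase shift of the fringe pattern per unit source displacement:
    k = 2 * math.pi * d / (wavelength * S)
    total = sum(math.cos(k * (-s / 2 + s * (i + 0.5) / n)) for i in range(n)) / n
    return abs(total)

wavelength, d, S = 600e-9, 0.25e-3, 0.50   # hypothetical values
s_limit = wavelength * S / d               # slit width at which s/S = lambda/d

print(visibility(0.1 * s_limit, S, wavelength, d))  # near 1: clear fringes
print(visibility(s_limit, S, wavelength, d))        # near 0: fringes gone
```

The average works out to $\operatorname{sinc}(\pi s d/\lambda S)$, whose first zero is at $s/S = \lambda/d$, which is why widening the slit beyond that limit destroys the central fringes.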

This is the basis of stellar interferometry: the angular width of a star can be measured by the width of the non-blurred portion of a diffraction pattern formed with light from the full width of the star. If you want to know the math in detail, this is a good source.

S. McGrew