This has been stuck in my head, and although I've found quite a lot of info, I can't get to the final answer. I'm probably overthinking something, so I hope you can point it out or otherwise help me out.
I'm looking for the minimum spot size of:
- A 1 µm laser with beam quality $M^2 = 1.5$
- Leaving the system through a 40 cm diameter lens
- Being focused by that lens at a range of, say, 5 km ($f = 5000$ m)
- EDIT: It concerns an in-atmosphere beam (say, at sea level), but turbulence, scintillation, and similar effects can be ignored. For all intents and purposes, it could be in vacuum.
It sits right in between the 'minimum spot size for microscopic applications' and 'spot size after thousands of km of space travel' cases, which leaves me slightly confused about what is applicable here and what is not.
Basically I've found a number of equations like
$D' = \frac{4\lambda f}{\pi D}$ and $\omega_0 = \frac{\lambda f}{\pi a}$ (with $a$ = beam radius at $1/e^2$ intensity),
giving me a minimum diameter of 0.016 m and a minimum radius of 0.008 m at the provided range, respectively, or
$\omega(z) = \omega_0\sqrt{1+\left(\frac{z}{z_R}\right)^2}$, coming up with a radius of 0.20016 m. (This one assumes the beam waist is at the lens itself, which would be valid at much longer ranges, but probably not so much at 5 km.)
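For what it's worth, those numbers do check out arithmetically; here's the minimal Python sketch I used, which does nothing beyond plugging my values into the three formulas above:

```python
import math

lam = 1e-6     # wavelength [m]
D = 0.40       # lens diameter [m]
a = D / 2      # beam radius at the lens [m]
f = 5000.0     # focal length = range to target [m]

# Equation 1: focused spot *diameter*
D_spot = 4 * lam * f / (math.pi * D)
print(f"D'      = {D_spot:.4f} m")      # -> 0.0159 m

# Equation 2: focused spot *radius*
w0 = lam * f / (math.pi * a)
print(f"w0      = {w0:.4f} m")          # -> 0.0080 m

# Equation 3: Gaussian propagation with the waist (radius a) placed at the lens
z_R = math.pi * a**2 / lam              # Rayleigh range, ~125.7 km
w_z = a * math.sqrt(1 + (f / z_R)**2)
print(f"w(5 km) = {w_z:.5f} m")         # -> 0.20016 m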
While they all make sense, I "feel" they're not accounting for the inherent divergence of the beam.
The answers provided here (Physics of Focusing a Laser) seem to point in the right direction. For example,
$\omega_0 = \frac{M^2 \lambda }{\pi \Theta }$
comes with the note
"Note that if you know the $M^2$ and measure the divergence of a beam, then you can calculate the waist radius."
Now, I can't measure the divergence of my laser (theoretical case :) ), but I don't think I can calculate it from that equation either: although I know the beam radius at the lens, that radius is not the beam waist.
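The closest I can get to closing that loop is the sketch below, where I simply guess the divergence as the geometric half-angle $\Theta \approx a/f$ of the converging cone. That $\Theta$ is purely my own assumption, so it may well be exactly where I go wrong:

```python
import math

lam = 1e-6      # wavelength [m]
M2 = 1.5        # beam quality factor
a = 0.20        # beam radius at the lens [m]
f = 5000.0      # range to the focus [m]

# My assumption (not from the linked answer): approximate the far-field
# divergence by the geometric half-angle of the converging cone.
theta = a / f                        # ~4.0e-5 rad
w0 = M2 * lam / (math.pi * theta)    # waist radius from the M^2 relation
print(f"theta = {theta:.1e} rad, w0 = {w0:.4f} m")   # -> w0 ~ 0.0119 m
```

Curiously, that is just the 0.008 m result from above scaled by $M^2$ (with $\Theta = a/f$, the relation becomes $\omega_0 = M^2 \lambda f / (\pi a)$), but I have no idea whether using the geometric angle there is legitimate.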
As you see, I'm a bit stuck. Maybe I've just been looking at it for too long, garbling up my head. It'd be great if someone could clear this up for me!