I want to calculate the beam spot size of a laser beam transmitted from a satellite orbiting the Earth (orbit radius of 100 to 36,000 km), and I would like the result to be as close to the real-world scenario as possible. For example, given the radius of the transmitting telescope's aperture ($0.1$ m) and the wavelength ($1550$ nm), how do I calculate the diffraction-limited spot size at a distance of $40,000$ km?
Currently I'm assuming a collimated Gaussian beam with initial spot size (waist) $\omega_0$; the spot size at a distance $z$ is then given by
$$ \omega\left(z\right)= \omega_0 \sqrt{1+\left(\frac{z}{z_R}\right)^2},$$
where $z_R=\pi\omega_0^2 n/\lambda$ is the Rayleigh range, with $n$ the refractive index and $\lambda$ the wavelength.
If the transmitter on the satellite has an aperture of radius, say, $0.1$ m, does this mean that $\omega_0 = 0.1$ m?
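For concreteness, here is a minimal Python sketch of the calculation I currently have in mind, assuming (purely for the sake of the example) that $\omega_0$ is simply the $0.1$ m aperture radius and that $n = 1$:

```python
import math

# Assumed illustrative inputs: the waist is taken equal to the 0.1 m
# aperture radius, 1550 nm wavelength, vacuum (n = 1), 40,000 km range.
w0 = 0.1              # initial spot size (waist), m
wavelength = 1550e-9  # wavelength, m
n = 1.0               # refractive index (vacuum)
z = 40_000e3          # propagation distance, m

z_R = math.pi * w0**2 * n / wavelength  # Rayleigh range
w_z = w0 * math.sqrt(1 + (z / z_R)**2)  # spot size at distance z

print(f"z_R = {z_R / 1e3:.1f} km, w(z) = {w_z:.1f} m")
```

With these assumed numbers the Rayleigh range is roughly $20$ km and the spot size at $40,000$ km comes out on the order of $200$ m, which is why I want to be sure I am using the right value for $\omega_0$.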
I've also read that in practice laser beams are not perfectly Gaussian, and that the deviation from the ideal case is described by the M squared (laser beam quality) parameter (wikipedia link). How does the beam spot size depend on the M squared parameter in this case?
EDIT
From this question
$$ w(z)=w_0\sqrt{1+M^2\left(\frac{z-z_0}{z_R}\right)^2}, $$
where $z_0$ is the position of the beam waist.
Could anyone provide a reference for this?
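In the meantime, here is a minimal sketch of how I would apply that formula numerically, using the same assumed inputs as above, a purely illustrative $M^2 = 1.2$, and the waist placed at the transmitter ($z_0 = 0$):

```python
import math

# Same illustrative inputs as above; M^2 = 1.2 is an assumed beam quality,
# not a measured value, and z0 = 0 places the waist at the transmitter.
w0 = 0.1              # waist, m
wavelength = 1550e-9  # wavelength, m
z = 40_000e3          # distance, m
z0 = 0.0              # waist location, m
M2 = 1.2              # beam quality factor M^2 (assumed)

z_R = math.pi * w0**2 / wavelength                  # Rayleigh range (n = 1)
w_z = w0 * math.sqrt(1 + M2 * ((z - z0) / z_R)**2)  # formula quoted above

print(f"w(z) = {w_z:.1f} m")
```

With the formula exactly as quoted, the far-field spot size scales as $\sqrt{M^2}$, which is part of what I would like a reference to confirm.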