I have been following the methods suggested by members of this site to calculate the solar irradiance outside of the earth's atmosphere, see here.
I now want to calculate the solar irradiance reaching the earth's surface.
I calculate the irradiance outside the atmosphere as:
$$ L_{\lambda} = \frac{2c^{2}h}{\lambda^{5}\left( \exp\left[hc/\lambda kT \right] - 1\right)}$$
where $ h = 6.626\times 10^{-34}\ \mathrm{J\,s}$
$ c = 3 \times 10^{8}\ \mathrm{m\,s^{-1}} $
$ T = 6000\ \mathrm{K} $
$ k = 1.38066\times 10^{-23}\ \mathrm{J\,K^{-1}} $
$ \lambda = 0:20 \times 10^{-9}:3200 \times 10^{-9}\ \mathrm{m} $ (i.e. 0 to 3200 nm in steps of 20 nm)
I then convert the radiance from per metre to per nanometre of wavelength: $$ L_{\lambda} =L_{\lambda} \times 10^{-9} $$
multiply by the square of the ratio of the solar radius to the earth's orbital radius, which scales the radiance at the Sun's surface down to its value at 1 AU: $$ L_{\lambda} =L_{\lambda} \times 2.177 \times 10^{-5} $$
and apply Lambert's cosine law to convert radiance to irradiance: $$ L_{\lambda} =L_{\lambda} \times \pi$$
which results in the upper curve seen in the following figure, i.e. the energy curve for a black body at 6000 K:
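For reference, the steps above can be sketched in Python/NumPy (the wavelength grid here starts at 20 nm rather than 0 to avoid dividing by zero; all constants are the values quoted above):

```python
import numpy as np

# Physical constants (SI units), as given in the question
h = 6.626e-34      # Planck constant [J s]
c = 3e8            # speed of light [m/s]
k = 1.38066e-23    # Boltzmann constant [J/K]
T = 6000.0         # assumed solar surface temperature [K]

# Wavelength grid: 20 nm to 3200 nm in 20 nm steps [m]
lam = np.arange(20e-9, 3200e-9 + 1e-12, 20e-9)

# Planck spectral radiance [W m^-2 sr^-1 m^-1]
L = 2 * h * c**2 / (lam**5 * (np.exp(h * c / (lam * k * T)) - 1))

L = L * 1e-9       # per metre -> per nanometre of wavelength
L = L * 2.177e-5   # dilution factor (R_sun / 1 AU)^2, value from the question
E_toa = np.pi * L  # Lambert's cosine law: radiance -> irradiance [W m^-2 nm^-1]
```

This peaks near 480 nm at roughly 2 W m⁻² nm⁻¹, and integrating it over wavelength gives on the order of the solar constant (somewhat high, since 6000 K slightly overestimates the Sun's effective temperature).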
I now wish to generate a second curve, one that shows the irradiance at the earth's surface. I know that the scattering and absorption processes that take place in the atmosphere not only reduce the intensity but also change the spectral distribution of the direct solar beam.
I want to show the spectral distribution of solar irradiance at sea level for a zenith sun and a clear sky. So the curve that I want to show is the spectral distribution as it would be if there were scattering but no absorption. For this I am also willing to assume that the solar elevation is more than 30 degrees.
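One way to sketch a scattering-only curve is to attenuate the top-of-atmosphere spectrum with a Rayleigh optical depth via Beer's law. The sketch below assumes the approximate sea-level Rayleigh optical depth formula of Hansen & Travis (1974) and a simple plane-parallel air mass of $1/\cos\theta_z$, which is reasonable for solar elevations above 30 degrees; the function names are my own, not from any particular library:

```python
import numpy as np

def rayleigh_tau(lam_um):
    """Approximate Rayleigh optical depth of a clear standard atmosphere
    at sea level (Hansen & Travis 1974); lam_um is wavelength in microns."""
    return 0.008569 * lam_um**-4 * (1 + 0.0113 * lam_um**-2
                                      + 0.00013 * lam_um**-4)

def direct_beam(E_toa, lam_um, zenith_deg=0.0):
    """Attenuate a top-of-atmosphere spectrum by Rayleigh scattering only
    (no absorption), using Beer's law along a plane-parallel slant path."""
    m = 1.0 / np.cos(np.radians(zenith_deg))  # relative air mass
    return E_toa * np.exp(-rayleigh_tau(lam_um) * m)
```

Because the optical depth scales roughly as $\lambda^{-4}$, the blue end of the spectrum is suppressed far more than the red end, which is the characteristic reshaping of the direct beam you describe. Note this gives only the direct-beam component; the diffuse (scattered) skylight reaching the ground is not included.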
Does anyone know how I could produce the curve explained above?