Say I have a continuous light source emitting a flat spectrum $\hat G(\nu)$ centered at $\nu_o$ with bandwidth $\Delta \nu$, i.e. $\hat G(\nu)=\mathrm{rect}\!\left((\nu-\nu_o)/\Delta \nu\right)$, and I want to see its time evolution. My understanding is that I just need to take the (inverse) Fourier transform. This is easy enough to evaluate and gives $$G(t)=\frac{\Delta \nu}{\sqrt{2\pi}}\,\operatorname{sinc}\!\left(\frac{\Delta \nu\, t}{2}\right)\exp(i\nu_o t).$$
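As a quick numerical sanity check of this expression (a minimal sketch, assuming the convention $G(t)=\tfrac{1}{\sqrt{2\pi}}\int \hat G(\nu)\,e^{i\nu t}\,d\nu$ with $\nu$ an angular frequency; the values of $\nu_o$ and $\Delta\nu$ below are arbitrary illustrative choices):

```python
# Numerical check of the rect-spectrum transform, assuming the convention
# G(t) = (1/sqrt(2*pi)) * Integral of G_hat(nu) * exp(i*nu*t) d(nu),
# with nu an angular frequency. nu0 and dnu are arbitrary illustrative values.
import numpy as np

nu0, dnu = 100.0, 10.0                      # centre frequency and bandwidth (rad/s)
nu = np.linspace(nu0 - dnu, nu0 + dnu, 4001)
G_hat = np.where(np.abs(nu - nu0) <= dnu / 2, 1.0, 0.0)   # flat (rect) spectrum

t = np.linspace(-5.0, 5.0, 801)
# Direct numerical evaluation of the transform at each t (trapezoidal rule)
G_num = np.trapz(G_hat * np.exp(1j * np.outer(t, nu)), nu, axis=1) / np.sqrt(2 * np.pi)

# Analytic result: (dnu/sqrt(2*pi)) * sinc(dnu*t/2) * exp(i*nu0*t).
# np.sinc(x) = sin(pi*x)/(pi*x), so pass dnu*t/(2*pi) to recover the sin(x)/x form.
G_ana = dnu / np.sqrt(2 * np.pi) * np.sinc(dnu * t / (2 * np.pi)) * np.exp(1j * nu0 * t)

print(np.max(np.abs(G_num - G_ana)))        # small, limited only by quadrature error
```

So the sinc envelope itself is not in question; my confusion is about how to interpret it.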
I don't understand why the result is a decaying wave. When I look at a polychromatic light source (like a light bulb), I don't see it decaying in time. I know this type of solution makes sense in the context of mutual coherence (which, for a polychromatic source, is lost as the time delay grows), but here we are only talking about the time evolution of a polychromatic field. Have I made a mistake in my reasoning?