I am currently reading the book "Advances in Atomic Physics: An Overview" by Cohen-Tannoudji and Guéry-Odelin. On pages 29-31 the authors discuss a two-level atom subject to a broadband radiation field. More concretely, from time-dependent perturbation theory they derive the transition rate
$$W_{g\rightarrow e}=\dfrac{1}{4}\left|D_{eg}\right|^{2}I\left(\omega_{0}\right)$$
Here $W_{g\rightarrow e}$ is the transition rate from the ground state to the excited state, $D_{eg}$ is the dipole matrix element, and $I\left(\omega_{0}\right)$ is the power spectral density of the field evaluated at the transition frequency $\omega_{0}$. This rate defines the relaxation time $T_{\rm R}=1/W_{g\rightarrow e}$ (in the spirit of Einstein's $A$ and $B$ coefficients), which sets the average time until an absorption event takes place.
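(For reference, my understanding of where this rate comes from, schematically and up to the book's normalization of $I\left(\omega\right)$, is the golden-rule argument: first-order perturbation theory gives
$$P_{g\rightarrow e}\left(t\right)\propto\left|D_{eg}\right|^{2}\int{\rm d}\omega\, I\left(\omega\right)\dfrac{\sin^{2}\left[\left(\omega-\omega_{0}\right)t/2\right]}{\left(\omega-\omega_{0}\right)^{2}},$$
and for $t\gg T_{\rm C}=1/\Delta\omega$ the sinc-squared factor is much narrower than $I\left(\omega\right)$, so it acts like $\frac{\pi t}{2}\,\delta\left(\omega-\omega_{0}\right)$ and the probability grows linearly in $t$ with the constant rate $W_{g\rightarrow e}\propto\left|D_{eg}\right|^{2}I\left(\omega_{0}\right)$; for $t\ll T_{\rm C}$ the growth is instead quadratic. Please correct me if this is not the book's reasoning.)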
On the other hand, the authors introduce the correlation time of the field, $T_{\rm C}=1/\Delta\omega$, where $\Delta\omega$ is its bandwidth, and comment that this is the actual duration of an absorption process. The difference between these two times is stressed in Sec. 2.5.2 on page 31. I attach here an illustration of the two times, as I understand them, for the case of spontaneous emission.
While I understand that there should be two such time scales and I agree with the derivations, I still feel some discomfort. My two main concerns are as follows.
1. For monochromatic radiation ($T_{\rm C}=\infty$) I recognize $T_{\rm R}$ in the form of Rabi oscillations. But where does $T_{\rm C}$ manifest itself in this problem? I cannot see where the actual duration of a transition enters, in either the semi-classical or the fully quantum description. I am not even sure how such a notion makes sense, since there is a continuous build-up of the superposition $\left|\psi\left(t\right)\right>=c_{g}\left(t\right)\left|g\right>+c_{e}\left(t\right)\left|e\right>$ (even when the field is included).
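For concreteness, the monochromatic result I have in mind is the usual Rabi solution (with detuning $\delta=\omega-\omega_{0}$ and Rabi frequency $\Omega_{1}\propto\left|D_{eg}\right|E_{0}$):
$$P_{e}\left(t\right)=\dfrac{\Omega_{1}^{2}}{\Omega_{1}^{2}+\delta^{2}}\sin^{2}\left(\dfrac{\sqrt{\Omega_{1}^{2}+\delta^{2}}}{2}\,t\right).$$
The only time scale I can identify here is the Rabi period $\sim1/\Omega_{1}$, which plays the role of $T_{\rm R}$; nothing in this solution looks to me like a finite duration of the transition.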
2. My second concern is spontaneous emission (at rate $\Gamma$). Here $\Gamma\,{\rm d}t$ is the probability for spontaneous emission to occur during a time interval ${\rm d}t$, so $1/\Gamma$ is again the average time until a transition, not the duration of one. Yet it is common to say that the atomic line has width $\Gamma$, or equivalently that the pulse the atom emits has duration $1/\Gamma$. Based on the book, I would expect this duration (and hence the line width) to be related to the duration of a transition, and not to the average time until one takes place.
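To make this concrete, the standard Wigner-Weisskopf result I have in mind gives the excited-state amplitude
$$c_{e}\left(t\right)=e^{-i\omega_{0}t}\,e^{-\Gamma t/2},$$
whose Fourier transform yields a Lorentzian emission spectrum of full width $\Gamma$,
$$S\left(\omega\right)\propto\dfrac{1}{\left(\omega-\omega_{0}\right)^{2}+\Gamma^{2}/4}.$$
So the same $\Gamma$ appears both as the inverse of the mean waiting time and as the line width, which is exactly the identification that confuses me.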
Most resources I am aware of invoke the time-energy uncertainty relation phenomenologically when treating transitions. However, I would appreciate it if someone could refer me to a treatment that introduces both of these notions more rigorously, side by side.
Update 1: To sharpen point (1), consider a sequence of fields with increasingly narrow spectra. As the spectrum gets narrower, the duration of an atomic transition (i.e. $T_{\rm C}$) gets longer and longer. In the monochromatic limit, which quantity diverges in the models mentioned above?
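To illustrate what I am after, here is a quick numerical sketch (my own notation and arbitrary normalization, not taken from the book) of the first-order transition probability for Gaussian spectra of decreasing bandwidth. It shows the coherent, quadratic growth at $t\lesssim T_{\rm C}$ crossing over to linear, rate-like growth at $t\gg T_{\rm C}$; the crossover time is the quantity that grows without bound as the field becomes monochromatic, but I do not see what it corresponds to physically in the Rabi picture.

```python
import numpy as np

# Sketch of the first-order (perturbative) transition probability, in
# arbitrary units and with my own normalization (not the book's):
#   P(t) ~ \int d(omega) I(omega) sin^2[(omega - omega0) t / 2] / (omega - omega0)^2,
# for Gaussian spectral densities I(omega) of rms width dw, so T_C ~ 1/dw.

omega0 = 10.0                                   # transition frequency (arbitrary units)
omega = np.linspace(omega0 - 6, omega0 + 6, 40001)
domega = omega[1] - omega[0]
delta = omega - omega0

def transition_probability(t, dw):
    """P(t) up to constant prefactors, for a Gaussian spectrum of rms width dw."""
    spectrum = np.exp(-delta**2 / (2 * dw**2)) / (dw * np.sqrt(2 * np.pi))
    # sin^2(delta*t/2)/delta^2 written via np.sinc to avoid the 0/0 at delta = 0
    kernel = (t / 2)**2 * np.sinc(delta * t / (2 * np.pi))**2
    return np.sum(spectrum * kernel) * domega

times = np.logspace(-1, 2.5, 200)               # t from 0.1 to ~300
for dw in (1.0, 0.3, 0.1):                      # narrower spectrum => longer T_C = 1/dw
    P = np.array([transition_probability(t, dw) for t in times])
    slope = np.gradient(np.log(P), np.log(times))   # local slope of log P vs log t
    print(f"dw = {dw:3.1f} (T_C ~ {1/dw:5.1f}): "
          f"slope {slope[10]:.2f} at t = {times[10]:.2f}, "
          f"slope {slope[-10]:.2f} at t = {times[-10]:.0f}")

# Expected trend: slope ~ 2 (coherent, quadratic growth) for t << T_C and
# slope ~ 1 (constant rate, i.e. a well-defined W = 1/T_R) for t >> T_C.
# The crossover happens around t ~ T_C, and it is this time that diverges
# as the spectrum becomes monochromatic.
```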