I would like to know what the relationship is between the bitrate of an optical channel and the width of spectrum that needs to be assigned to it, assuming the channel uses an OOK modulation format such as RZ or NRZ.
Context - First of all, I am not a physicist, so I don't necessarily understand everything involved here. I am a computer scientist currently studying routing algorithms over optical fibre in the presence of impairments, specifically in flexi-grid networks, and as part of that I need to build a somewhat realistic simulator. I know that the higher the bitrate of a signal, the more optical bandwidth is required to transmit it, and that different modulation formats require more or less optical bandwidth for the same bitrate. However, I am limiting myself to OOK modulation formats, which I have read can go up to 40 Gbps with current technology (correct me if I am wrong).
Question - So what I would like to know is: if I wanted to assign an optical bandwidth (from my spectrum range, in the flexi-grid network) to an OOK transmission of a given bitrate, assuming a noiseless channel, what would the relationship be?
More context - I saw a video on YouTube, "what is the maximum bandwidth?", which says the relationship is $\Delta f \cdot \Delta T \approx 1$ (where $\Delta f$ is the assigned spectrum width and $\Delta T$ is the pulse duration). I would like to gain some more insight into what is going on (where does it come from, and why should the product be approximately 1?), and most importantly to confirm whether I would be correct to use it in the way I am planning to.
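To make the question concrete, this is roughly how I am planning to use that relationship in my simulator. It is only a sketch under my own assumptions: I take $\Delta T = 1/\text{bitrate}$ for NRZ (so $\Delta f \approx \text{bitrate}$), a time-bandwidth product of exactly 1, and 12.5 GHz flexi-grid frequency slices; the function names and constants are just placeholders I made up.

```python
import math

SLICE_GHZ = 12.5  # frequency-slice width I am assuming for the flexi-grid

def required_spectrum_ghz(bitrate_gbps: float, tb_product: float = 1.0) -> float:
    """Spectrum width for an NRZ-OOK channel, assuming
    delta_f * delta_T ~ tb_product with delta_T = 1 / bitrate."""
    pulse_duration_ns = 1.0 / bitrate_gbps   # NRZ: the pulse fills the whole bit slot
    return tb_product / pulse_duration_ns    # delta_f ~ tb_product / delta_T, in GHz

def required_slices(bitrate_gbps: float) -> int:
    """Number of contiguous frequency slices to assign to the channel."""
    return math.ceil(required_spectrum_ghz(bitrate_gbps) / SLICE_GHZ)

# Example: a 40 Gbps NRZ-OOK channel would need ~40 GHz, i.e. 4 slices.
print(required_spectrum_ghz(40.0), required_slices(40.0))
```

If this is right, a 40 Gbps NRZ channel needs roughly 40 GHz, and an RZ channel at the same bitrate would need more (its pulses are shorter than the bit slot, so $\Delta T$ is smaller and $\Delta f$ larger). That is the kind of sanity check I am after.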
Aside from an answer to the question, any insights, suggestions, or criticisms relating to anything written above would be appreciated.