
I would like to know the relationship between bit rate and the width of spectrum that needs to be assigned to an optical channel, assuming the channel uses an OOK (on-off keying) modulation format such as RZ or NRZ.

Context: First of all, I am not a physicist, so I don't necessarily understand everything involved here. I am a computer scientist currently studying routing algorithms over optical fibre in the presence of impairments, specifically in flexi-grid networks, and as part of that I need to build a somewhat realistic simulator. I know that the higher the bit rate of a signal, the more optical bandwidth is required to transmit it, and that different modulation formats require more or less optical bandwidth for the same bit rate. However, I am limiting myself to OOK modulation formats, which I have read can reach up to 40 Gbps with current technology (correct me if I am wrong).

Question: If I wanted to assign an optical bandwidth (from my spectrum range, in the flexi-grid network) to an OOK transmission of a given bit rate, assuming a noiseless channel, what would the relationship be?

More context: I saw a video on YouTube, "What is the maximum bandwidth?", which says the relationship is $\Delta f \cdot \Delta T \approx 1$ (where $\Delta f$ is the assigned spectrum width and $\Delta T$ is the pulse duration). I would like to gain more insight into what is going on (where this comes from, and why the product should be approximately 1), and most importantly to confirm whether I would be correct to use it in the way I am planning.
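For what it's worth, here is a rough sketch of how I was planning to use that rule in the simulator, assuming $\Delta f \approx 1/\Delta T$ and that for NRZ-OOK the pulse duration equals one bit period (the duty-cycle handling for RZ is my own guess, not something from the video):

```python
def ook_spectrum_width_ghz(bitrate_gbps, duty_cycle=1.0):
    """Rough spectrum width for an OOK channel, from Delta_f * Delta_T ~ 1.

    For NRZ the pulse fills the whole bit slot (duty_cycle = 1), so
    Delta_T = 1 / bitrate and Delta_f ~ bitrate. For RZ the pulse is
    shorter (duty_cycle < 1), so the required width grows accordingly.
    """
    pulse_duration_ns = duty_cycle / bitrate_gbps  # Delta_T in nanoseconds
    return 1.0 / pulse_duration_ns                 # Delta_f in GHz

# A 40 Gbps NRZ-OOK channel would then need roughly 40 GHz of spectrum,
# and a 50% duty-cycle RZ version roughly twice that.
print(ook_spectrum_width_ghz(40))                  # NRZ
print(ook_spectrum_width_ghz(40, duty_cycle=0.5))  # RZ, 50% duty cycle
```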

Aside from an answer to the question, any insights, suggestions, or criticisms relating to anything written above would be appreciated.

  • http://dsp.stackexchange.com would be a better match, but note, your question is encyclopedic and there is a standard answer for it; why don't you simply google for it? – peterh May 29 '17 at 00:48
  • Tip: Consider to spell out acronyms. – Qmechanic May 30 '17 at 17:35

1 Answer


The answer to your question depends on the system's signal-to-noise ratio (SNR). The SNR determines the number of distinguishable "letters" that can be transmitted with each symbol, as I discuss here.

With unsqueezed light states, and with the only noise present being quantum noise, the following rough formula gives you a good idea:

$$\mathcal C = B \, \log_2\left(1 + \sqrt{\frac{P}{B\, h\, \nu_0}}\right)$$

where $\mathcal C$ is the channel's capacity in bits per second, $B$ is the bandwidth in symbols per second, $P$ is the optical power, and $\nu_0$ is the channel's center frequency. I show how this figure is derived in this discussion of fiber optic bandwidth.

These figures are theoretical maximums over all possible modulation and encoding schemes.
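As a quick numerical sketch of the formula above (the 50 GHz slot width, 1 mW launch power, and 193.1 THz center frequency are illustrative values I chose, not part of the derivation):

```python
import math

PLANCK_H = 6.62607015e-34  # Planck constant, J*s

def quantum_limited_capacity_bps(bandwidth_hz, power_w, center_freq_hz):
    """C = B * log2(1 + sqrt(P / (B * h * nu_0))), the quantum-noise-limited
    capacity bound quoted above, over all modulation/encoding schemes."""
    snr_term = power_w / (bandwidth_hz * PLANCK_H * center_freq_hz)
    return bandwidth_hz * math.log2(1.0 + math.sqrt(snr_term))

# Example: a 50 GHz channel at 193.1 THz (~1550 nm) carrying 1 mW.
c = quantum_limited_capacity_bps(50e9, 1e-3, 193.1e12)
print(f"{c:.3e} bit/s")
```

Note that this bound comes out far above what a 40 Gbps OOK system achieves in that slot, which is the point of the closing remark: real modulation formats fall well short of the theoretical maximum.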