
Ignoring hardware at either end and their technological limitations, what is the maximum theoretical bandwidth of fibre optic cables currently in use / being deployed in FTTH-type situations? I understand there's a limit to the number of frequencies or channels we can have in fibres, and each channel would have a theoretical maximum bandwidth too, I'd imagine?

I'm asking particularly to find out more about the current plan for a National Broadband Network in Australia, which is supposed to roll out fibre optics to almost every premises in the country. I'm interested in finding out how much data we can fit down the fibre before we have to dig it all up and replace it with newer fibres with higher bandwidth or some new medium we haven't started talking about yet. More general answers are interesting too.


2 Answers


Short answer: A good order of magnitude rule of thumb for the maximum possible bandwidth of an optical fibre channel is about 1 petabit per second per optical mode. So a "single" mode fibre (fibre with one bound eigenfield) actually has in theory two such channels, one for each polarisation state of the bound eigenfield.

I'll just concentrate on the theoretical capacity of a single, long-haul fibre; see roadrunner66's answer for discussion of the branching in an optical network. The fundamental limits always come down to a question of signal to noise in the measurement (i.e. demodulation by the receiver circuit). The one fundamentally unavoidable noise source on a fibre link is quantum shot noise, so I'll concentrate on that. Therefore, what follows applies strictly to a short fibre: other noise sources (such as Raman scattering, amplified spontaneous emission from in-line optical amplifiers, Rayleigh scattering and Brillouin scattering) tend to become significant roughly in proportion to the fibre length and to some power (exponent) of the optical power borne by the fibre.

If we detect $N$ photons from a coherent state of light for a measurement, then the uncertainty in that measurement is $\sqrt{N}$ in an otherwise noise-free environment (see the note on squeezed states below). So suppose first that:

  1. The bandwidth of our fibre is $B$ hertz (typically 50THz maximum, I'll discuss what limits $B$ below)
  2. The power borne by the fibre is $P$ watts (typically 100W maximum; again, I'll discuss limits later)
  3. The fibre is single mode (this in theory allows one to overcome the dispersion limits discussed in roadrunner66's answer, at the expense of putting a harder upper limit on $P$)
  4. The fibre's centre frequency is $\nu_0$ (typically 193THz, corresponding to 1550nm freespace wavelength)
  5. In what follows, I shall take the word "symbol" to mean a real number of theoretically infinite precision, whose actual precision is limited by noise (the purpose of this answer being to discuss the latter!).

So, let's begin by exploring the best way to use our power. If we devote it to $M$ symbols per second, each of our measurements comprise the detection of $N = \frac{P}{M\, h\, \nu_0}$ photons, thus our signal to noise ratio is $SNR = \frac{N}{\sqrt{N}} = \sqrt{\frac{P}{M\, h\, \nu_0}}$. By the Shannon-Hartley form of the Noisy channel coding theorem (see also here), we can therefore code our channel to get $\log_2\left(1 + \sqrt{\frac{P}{M\, h\, \nu_0}}\right)$ bits of information per symbol, i.e. $M \log_2\left(1 + \sqrt{\frac{P}{M\, h\, \nu_0}}\right)$ bits per second through our optical fibre. This is a monotonically rising function of $M$, so a limit on $P$ by itself does not limit the capacity.
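As a quick illustration, here is a minimal Python sketch (not from the original answer) that evaluates the expression above with the typical figures already listed; it shows the bit rate rising monotonically with the symbol rate $M$:

```python
import math

h = 6.626e-34   # Planck's constant (J s)
nu0 = 193e12    # fibre centre frequency (Hz), i.e. 1550 nm freespace
P = 100.0       # power borne by the fibre (W)

def bit_rate(M):
    """M * log2(1 + sqrt(P / (M h nu0))) bits per second, as derived above."""
    return M * math.log2(1.0 + math.sqrt(P / (M * h * nu0)))

# Spreading the power over ever more symbols per second keeps raising the rate:
for M in (1e12, 1e13, 1e14, 1e15):
    print(f"M = {M:.0e} symbols/s -> {bit_rate(M):.2e} bits/s")
```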

However, by a converse of the Nyquist-Shannon sampling theorem we can send a maximum of $B$ symbols down the channel per second. This then is our greatest possible symbol rate. Hence, our overall expression for the fibre capacity in bits is:

$\mathcal C = B \, \log_2\left(1 + \sqrt{\frac{P}{B\, h\, \nu_0}}\right)$ bits per second

If we plug in our $B = 50\,\mathrm{THz}$ and $P = 100\,\mathrm{W}$, we get roughly $0.6$ petabits per second per polarisation state, and hence (recalling from the short answer above that a single mode core bears two polarisation channels) the stupendous capacity of about 1.2 petabits per second for each single mode optical fibre core.
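Here is a quick numerical check of that figure (a sketch only, using the constants above; the doubling for the two polarisation channels follows the short answer at the top):

```python
import math

h = 6.626e-34   # Planck's constant (J s)
nu0 = 193e12    # fibre centre frequency (Hz)
B = 50e12       # usable bandwidth (Hz)
P = 100.0       # power borne by the fibre (W)

C = B * math.log2(1.0 + math.sqrt(P / (B * h * nu0)))
print(f"per polarisation state: {C:.2e} bits/s")      # about 6.0e14
print(f"per single-mode core:   {2 * C:.2e} bits/s")  # about 1.2e15, i.e. 1.2 Pb/s
```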

By looking at the basic shape of the Shannon-Hartley expression $\log_2(1+SNR)$ bits per symbol, one can see that improvements in the signal to noise beyond any "good" signal to noise ratio lead only to marginal increases in capacity. By far the strongest limit is the converse of the Shannon sampling theorem. So the use of squeezed light will not change the order of magnitude of this result.
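To make the diminishing returns concrete (illustrative SNR values, a sketch only): each doubling of an already large SNR buys roughly one extra bit per symbol.

```python
import math

# log2(1 + SNR): doubling a large SNR adds only about 1 bit per symbol.
for snr in (1e3, 2e3, 1e6, 2e6):
    print(f"SNR = {snr:.0e} -> {math.log2(1.0 + snr):.2f} bits/symbol")
```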

Now for some material physics begetting the limits discussed above. Our optical power is limited by two things:

  1. The heat dissipation rate of the fibre is the main one. At some point, losses in the fibre add up to more power than it can dump to its surroundings and it fries itself. What happens in practice is that power tends to dissipate around inclusions and other imperfections, the fibre melts at such a point, more power then dissipates at the molten blob, and the destruction propagates slowly away from the failure: the fibre "melts" itself like a lit fuse.
  2. Nonlinear processes like stimulated Brillouin and stimulated Raman scattering will quickly outweigh the quantum noise. These vary like $P^2$.

Bandwidth is limited by the losses in the medium. The "window" between about 1300nm and 1600nm freespace wavelength is chosen for optical communications for its low loss. As you try to increase the bandwidth, any optical power outside the band of low loss simply won't reach the other end, the usable band depending on the length of the fibre. This is what gives rise to the $B=50\mathrm{THz}$ figure I cite above. This is not a hard figure: it's a rough estimate and its precise value depends on the length of fibre. I have shown a calculation where I assume the fibre transmits a certain bandwidth perfectly and that transmission "switches off" altogether outside this bandwidth. A fuller calculation would account for the spectral shape of the losses and for the fact that imperfectly transmitted frequency components can also bear information. It would show that the effective equivalent bandwidth $B$ in my formula is roughly inversely proportional to fibre length.

Other noise sources that quickly swamp the quantum noise are those I mentioned at the beginning: Raman scattering, amplified spontaneous emission from in-line optical amplifiers, Rayleigh scattering and Brillouin scattering. Again, the calculation needs to be modified for these depending on the exact link parameters. These noises tend to increase in proportion to the link length, and therefore one often sees bandwidth-distance products quoted in "he-man" announcements of fibre capacity experiments (e.g. 1 terabit per second over a 100km link; the same link should bear roughly 2 Tb/s over 50km, 4 Tb/s over 25km and so on, until the quantum noise limits everything). As above, the bandwidth limit $B$ also has an inverse dependence on fibre length, but even a zero length fibre transmission is still marred by the quantum noise of the link's source. So the true dependence of the capacity $\mathcal C$ on link length $L$, taking this into account, will be something like $\mathcal C = \frac{\mathcal C_1}{L_0 + L}$, where $L_0$ is something much less than a kilometre and accounts for the source's quantum noise.
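A small sketch of this scaling (the values of $\mathcal C_1$ and $L_0$ here are illustrative, chosen only to reproduce the 1 Tb/s over 100 km example above; they are not measured parameters):

```python
# Illustrative model C(L) = C1 / (L0 + L) from the text.
L0 = 0.1                    # km; "much less than a kilometre", the source's quantum noise
C1 = 1e12 * (L0 + 100.0)    # chosen so that C(100 km) = 1 Tb/s

def capacity(L_km):
    return C1 / (L0 + L_km)

for L in (100.0, 50.0, 25.0, 0.0):
    print(f"L = {L:5.1f} km -> {capacity(L):.2e} bits/s")
# 100 km -> 1 Tb/s, 50 km -> ~2 Tb/s, 25 km -> ~4 Tb/s,
# 0 km  -> ~1 Pb/s, i.e. the quantum-noise ceiling takes over.
```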

Current figures quoted are tens of terabits per second per fibre core (see here - I'm sorry I don't have a better reference for this; it has been some time since telecommunications technology was familiar territory for me). Even higher figures can be obtained for short fibres (kilometres in length) with multimoded cores, so that the power density in the core is not so constraining. The disadvantage is that dispersion (now the difference between the transmission speeds of the fibre's bound eigenmodes) becomes the limiting factor, thus only short fibres can be used. For a single mode fibre, only chromatic dispersion, as discussed by roadrunner66, is present. This can be effectively cancelled when the link dispersion is known (as it is for long haul links) by imparting the inverse dispersion at either the sending or receiving end with a Bragg grating device.

There has in the past been some interest in using squeezed states of light. The quantum noise limit I have assumed is that of a Glauber coherent state, which saturates the Heisenberg inequality $\sigma_x \sigma_p \geq \frac{\hbar}{2}$ and has equal uncertainty in the conjugate "position" and "momentum" variables. On a phase plane, this can be translated into a lower bound on the product of amplitude and phase uncertainties. One can produce squeezed states with less phase uncertainty at the expense of amplitude uncertainty, so the idea was to use a frequency or phase modulated transmission scheme and lower the uncertainty in the transmitted phase. However, one of course gets a worsening of the amplitude SNR (we can't, in quantum mechanics, thwart the Heisenberg inequality), so such schemes make marginal if any difference to the overall SNR. They certainly won't change the orders of magnitude I discussed above.

Here is an excellent summary paper on the subject, fleshing out my summary above and discussing quantitative modifications to the model above for noises other than the "fundamental" quantum noise (especially Raman scattering and amplified spontaneous emission): René-Jean Essiambre, Gerhard Kramer, Peter J. Winzer, Gerard J. Foschini, and Bernhard Goebel, "Capacity Limits of Optical Fiber Networks", J. Lightwave Technol., vol. 28, no. 4, February 15, 2010.

There is some very recent work on the use of fibres with few (i.e. fewer than about five) bound eigenfields and the encoding of separate channels, each potentially a petabit per second, one for each bound eigenfield. See the work of Love and Riesen, e.g. Optics Letters 37, 19 (2012) 3990-3992.


Fibers themselves have an extremely high bandwidth in principle: pretty much all the wavelengths at which they are transparent enough to transmit light such that you can still detect it at the other end.

Where the fiber itself is the limiting factor is dispersion: since all signals have a bandwidth themselves, their 'red' and 'blue' portions travel at different speeds. So if the fiber is long enough and your signal modulation is very fast, then at the detector end the square input pulse will be rounded enough that you have trouble distinguishing it from the previous or following one.
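A rough back-of-envelope sketch of this effect (the dispersion parameter of about 17 ps/(nm·km) is the usual figure for standard single-mode fiber at 1550 nm; the link length and signal spectral width are illustrative assumptions):

```python
# Chromatic dispersion back-of-envelope: pulse spread = D * L * delta_lambda.
D = 17.0            # ps/(nm km), typical for standard single-mode fiber at 1550 nm
L = 100.0           # km, assumed link length
delta_lambda = 0.8  # nm, assumed signal spectral width (~100 GHz at 1550 nm)

spread_ps = D * L * delta_lambda   # total pulse spreading in ps
max_rate = 1e12 / spread_ps        # symbols/s before neighbouring pulses smear together
print(f"pulse spread ~{spread_ps:.0f} ps -> ~{max_rate:.1e} symbols/s")
```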

The strongest limits on the usable bandwidth come from the lasers and detectors that are being used. To get all the different channels in and out while keeping them separate, you need lots of narrow band filters and modulators/demodulators. That part of the technology is expensive, but is more often replaced/upgraded than the fiber itself. The above is mostly relevant for long-haul fibers.

FTTH uses only a very small portion of the bandwidth that long-haul fibers use (think, e.g., of bundling all the FTTH fibers in Australia in parallel: the result would be a much, much thicker cable than what is laid across the ocean or between territories). You will almost never use the actual optical bandwidth of FTTH fibers, since you couldn't afford the electronics and lasers to modulate and demodulate it.

Finally, a human being (the optical resolution of the eye being about 1 arc minute) can probably not process much more information than about 4 HD channels would pack (sound is trivial in information density compared to video; so are touch and smell), so there appears to be little need for an individual to be connected at download speeds that substantially exceed the limit of about 4 HD channels incoming and 4 HD channels outgoing. Multiply this by the size of a family.
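The rough arithmetic behind this claim (the ~8 Mb/s per compressed HD stream and the household size are assumptions, not figures from the answer):

```python
# Back-of-envelope: peak video bandwidth for one household.
hd_stream_mbps = 8          # assumed rate of one compressed HD video stream (Mb/s)
streams_per_person = 4 + 4  # 4 incoming + 4 outgoing HD channels, as above
household = 4               # assumed family size

print(f"~{hd_stream_mbps * streams_per_person * household} Mb/s")  # ~256 Mb/s
```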

FTTH optical bandwidth thus likely already exceeds our biological capacity to process information; only the price of the electronic side might remain a practical limit for a few years.

roadrunner66
    also, FTTH is often implemented as a passive optical network (using time multiplexing) meaning that each end user will share the bandwidth of the fibre with other users. – Andre Holzner Jul 10 '13 at 13:02
  • Yes, literal physical bandwidth (as in time vs frequency) is different from the information transport capacity, which includes amplitude and frequency. See https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem. Either way, any kind of short-haul fiber (glass) has orders of magnitude more information transport capacity than a family or group of families (TV, video in/out, games, sensors) could consume or generate (say 4 video cameras per person, high res, always on, recorded elsewhere on a server). – roadrunner66 Jul 26 '20 at 01:11