I'm having trouble finding typical quantities in fiber optic communication. In particular, what kind of powers are generally used (or what is the minimum that fiber optics receivers can detect effectively)? What frequencies of light are generally used, and what are typical sampling rates?


- 1550nm and 1310nm are the main bands: the 1300nm band is naturally where the zero dispersion point is, but Er doped fibre amplifiers work at 1550nm; dispersion shifted fibre allows 1550nm to become a zero dispersion point. Powers can be up to 100W. Each wavelength division multiplexed channel is typically run at 10Gbps; with a full quota of channels, fibres are now carrying tens of terabits per second. A rough estimate of their maximum theoretical capacity is 1 petabit per second for each modal channel, see here. – Selene Routley Feb 08 '14 at 14:31
- @WetSavannaAnimalakaRodVance This sounds like more of an answer than a comment. Maybe you could expand it into one? – Chris Mueller Feb 08 '14 at 18:08
1 Answer
Although I have worked fairly near to these topics (albeit a long time ago), I found good information by googling "wavelength band fiber". The encyclopaedic site at http://www.rp-photonics.com of the engineer Dr. Rüdiger Paschotta is an excellent reference (I have no affiliation with him; I simply go to his site as a reference when I need reminding of this stuff).
The short answer is that 1300nm and 1550nm (i.e. roughly between twice and thrice visible light wavelengths) are the main bands. Powers can be up to about 100W per fibre, which, in a single mode fibre with a $10{\rm\,\mu m}$ mode field diameter, represents a power density of roughly $1.3{\rm\,TW\,m^{-2}}$. At greater powers the fibres tend to fail: the local heating owing to the power absorbed by the tiniest blemish or inclusion in the fibre begins to melt the fibre, which reinforces the loss and local heating, and the fibre destroys itself so that it looks rather like a lit fuse burning itself up!
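To check that power-density figure, here is a minimal Python sketch (my own back-of-envelope arithmetic, not from any reference), treating the mode field diameter as the diameter of a circular spot:

```python
# Back-of-envelope check of the power density quoted above:
# 100 W confined to a 10 micrometre mode field diameter.
import math

power_w = 100.0    # optical power carried by the fibre, watts
mfd_m = 10e-6      # mode field diameter, metres

area_m2 = math.pi * (mfd_m / 2) ** 2    # spot area, ~7.9e-11 m^2
print(power_w / area_m2)                # ~1.3e12 W/m^2, i.e. ~1.3 TW/m^2
```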
The signal rates through fibres are organised as follows (at least this was how they were organised twenty years ago; my understanding is that the technology has improved but the ideas are the same):
- Digital baseband signals are typically organised into $10{\rm\,Gb/s}$ baseband channels (i.e., just like a bedazzlingly swift telegraph); more precisely, a Synchronous Transport Module level 64 (STM-64) runs at a line rate of $9.953{\rm\,Gb/s}$, of which about $9.62{\rm\,Gb/s}$ is payload;
- These $10{\rm\,Gb/s}$ baseband channels are "stacked" onto a fibre through wavelength division multiplexing. Typically there are up to several hundred such channels per fibre, making for several terabits per second of transmission capacity for each fibre (a rough aggregate estimate is sketched just below).
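The aggregate capacity follows directly from these two numbers. A trivial sketch (my own illustration; the channel count of 200 is an assumed round figure for "several hundred"):

```python
# Aggregate WDM fibre capacity = per-channel rate x number of channels.
per_channel_bps = 9.953e9   # STM-64 line rate per WDM channel
n_channels = 200            # assumed round figure for "several hundred"

print(per_channel_bps * n_channels / 1e12)  # ~2.0, i.e. ~2 Tb/s per fibre
```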
It is important to understand that the above organisation arises from limitations on the sending and receiving electronics (it is hard to make electronics work, at reasonable cost, much above $10{\rm\,Gb/s}$ baseband); it is in no way set by the fibre's physics. A fibre could in theory send roughly 1 petabit per second for each modal channel used over a distance of roughly ten kilometres without regeneration. For an overview of the physics limiting the fibre itself, see my answer here and the reference I cite at the end of it.
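As a crude sanity check on that petabit figure, one can apply the Shannon capacity $C = B\log_2(1+\mathrm{SNR})$ to the usable low-loss window. The sketch below is my own illustration with an assumed SNR, not the careful treatment in the answer I cite:

```python
# Crude Shannon-limit estimate over the practicable 1260nm-1625nm window.
import math

c = 3e8                               # speed of light, m/s
lam_lo, lam_hi = 1260e-9, 1625e-9     # band edges quoted in the text
bandwidth_hz = c / lam_lo - c / lam_hi  # ~5.3e13 Hz (~50 THz)

snr_db = 20.0                         # assumed signal-to-noise ratio
snr = 10 ** (snr_db / 10)

capacity_bps = bandwidth_hz * math.log2(1 + snr)
print(capacity_bps / 1e15)  # ~0.36 Pb/s per mode: order 1 Pb/s, as claimed
```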
So what is the underlying physics for all this? I fetched the drawing below from the JDSU product literature:
Gregory Lietaert, "Fiber Water Peak Characterisation"
and this illustrates much of what I want to say:
Two things limit signal transmission through optical fibres: attenuation and dispersion. A great deal of technology has been developed to deal with the latter, dispersion, which limits bandwidth by scrambling signals through wavelength-dependent group delay. Dispersion is almost wholly a linear effect, so its effects can be almost totally annulled and compensated for by devices such as optical fibre gratings (see also long period gratings) and by dispersion shifted fibre. Loss is a much bigger problem and can only be overcome by regeneration through fibre amplifiers; loss limits transmission rates through noise: even in a perfectly clean system where amplifiers had zero noise figure, the optical quantum limit prevails and sets the limit: see my detailed discussion in my answer to the Physics SE question Maximum theoretical bandwidth of fibre-optics.
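To make the dispersion limit concrete, a worked example with typical textbook numbers (my own illustration; the pulse spread is $\Delta t = D\,L\,\Delta\lambda$):

```python
# How chromatic dispersion smears a pulse: dt = D * L * dlambda.
D_ps_nm_km = 17.0   # dispersion of standard SMF near 1550 nm (typical value)
length_km = 100.0   # link length (assumed)
dlambda_nm = 0.1    # spectral width of a modulated 10 Gb/s source (assumed)

spread_ps = D_ps_nm_km * length_km * dlambda_nm
bit_period_ps = 1e12 / 10e9   # 100 ps per bit at 10 Gb/s

print(spread_ps)      # 170 ps: the pulse smears past one whole bit period,
print(bit_period_ps)  # hence the need for dispersion compensation.
```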
So, given these comments, it is obvious that for fibres many kilometres long, we need to work in wavelength bands where the attenuation is less than about 1dB per kilometre. Thus, from the diagram above, the reason for the 1300nm and 1550nm bands is obvious. The 1550nm band is the lower loss band, but historically the 1300nm band was used first because this is where the zero dispersion wavelength of a simple step refractive index silica fibre tends to sit. Before dispersion cancelling technology such as dispersion shifting fibre refractive index profiles and fibre gratings were fully refined, residual dispersion tended to be the ultimate limit. As this was tamed, the practicable lengths of fibres became longer, so the 1550nm band gradually came into use. Moreover, erbium doped fibre amplifier technology came into being to realise regeneration stations in fibre networks, and this works at 1550nm. But nowadays the bandwidths asked of fibres are so huge that all practicable bands - those between about 1260nm and 1625nm - must be used, with research underway to open up channels in the 850nm band.
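Finally, on the question of minimum detectable power: a simple link-budget sketch ties attenuation to receiver sensitivity. All the numbers below are my own assumed round figures (a typical 10 Gb/s PIN receiver sensitivity is of order -28 dBm, and 1550nm fibre loss is around 0.2 dB/km):

```python
# Link budget: how far can we go before the receiver can no longer decode?
launch_dbm = 0.0          # 1 mW launch power (assumed)
sensitivity_dbm = -28.0   # assumed 10 Gb/s receiver sensitivity
loss_db_per_km = 0.2      # attenuation in the 1550 nm band (typical)

max_km = (launch_dbm - sensitivity_dbm) / loss_db_per_km
print(max_km)             # ~140 km between amplifiers or regenerators
```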
