
I'm not a physicist, so my apologies if this isn't worded correctly, or if it isn't even a valid question.

I was recently learning how WiFi works and how the slowest devices on a channel can slow down access for everyone, because all the different devices still have to be made out (decoded). That, together with the observation that signal strength falls off with distance according to the inverse-square law, made me wonder:

What is the maximum theoretical shared digital bitrate density of the entire reasonable portion of the electromagnetic spectrum?

Some clarifications and assumptions:

  1. By reasonable I mean no wavelengths hazardous to life (so only ~380 nm and longer, nothing into the UV and beyond) and frequencies reasonable in the real world (probably much greater than 1 Hz? Unsure here)
  2. Omnidirectional antennas only, no directed beams/lasers
  3. Assuming no rain, trees, or anything else interfering with line of sight (I'm curious what would happen if an atmosphere was present, but assume a vacuum for this question)
  4. A planar density (no stacking antennas vertically)
  5. Digital bandwidth/bitrate is the sum of all bits (or symbols?) successfully transmitted between pairs of devices/antennas; no multicast or anycast is present.
  6. Adding another device/antenna may be possible, but would bring the total bitrate down due to collisions.
  7. Assume perfect antennas with reasonable available power.
  8. Data must be transmitted at least a meter in each pair (devices can be closer than a meter as long as A: they could still receive it if they were positioned a meter away, or B: they aren't talking to each other).
  9. Each device/antenna may talk with as many other devices as needed.
  10. Each device has as accurate time as physically possible.
  11. Any data encoding schemes that are theoretically possible can be used.
  12. The cosmic microwave background & other interstellar sources of noise are present.

Those are all of the relevant constraints I can think of at this time. Do let me know if you think I should add or remove a constraint or assumption, as I'm a CS person, not a physicist.

I am thinking the answer would be in $bits/s/m^2$, though perhaps only valid for areas larger than some minimum? And I'm assuming that the device density would also be expressed per unit area, but I have no idea how that would fit in.

I'm aware of the Shannon-Hartley theorem and this related question: What is the relationship between bitrate and bandwidth for optical channels? But that's for a single point-to-point channel, not a shared medium that falls off over distance.

byteit101
  • If distance is a problem, you crank up the output power until it isn't. – Jon Custer Aug 23 '22 at 13:52
  • @JonCuster, the trade-off is the higher you make the power for this link, the farther away you have to put the next pair of antennas to avoid interference from this signal. (I.e., suppose you have transmitter A sending to receiver B and transmitter C sending to receiver D. If you increase the signal from transmitter A, then receiver D has to be farther away to avoid interference from A.) – The Photon Aug 23 '22 at 15:43
  • @ThePhoton - of course. How the FCC licenses radio stations demonstrates that. Can't have that 50,000W transmitter too close to another one. Although the limits on AM/FM are mostly because of the older specifications - with more modern circuitry one could cram more stations into the spectrum. – Jon Custer Aug 23 '22 at 15:51
  • @JonCuster, right, so as I understand the question, since we're looking for data transmission per unit area (or volume), it hinges on how to balance closely packing the stations to reduce "area" vs keeping them far enough apart to reduce interference. – The Photon Aug 23 '22 at 15:59
  • @ThePhoton That is the correct understanding of my curiosity. I chose 1m arbitrarily. – byteit101 Aug 23 '22 at 16:16
  • Given that the packing density goes as $1/r^2$ but the interference effect on the bandwidth goes as $\log r^2$ (it fits in where the $N$ is in the Shannon-Hartley formula), I think you want to pack the transmitters as close together as possible. But I don't have time to prove it strongly enough to write an answer. – The Photon Aug 23 '22 at 17:14
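
As a rough illustration of the trade-off described in these comments, here is a toy calculation: one desired 1 m link and a single interfering co-channel pair at spacing $d$, with arbitrary assumed power and noise levels. It is only a sketch of the packing argument, not a proof.

```python
# Toy model of The Photon's packing argument: link capacity per hertz scales
# as log2(1 + S/(N + I)), while the number of links per square meter scales
# as 1/d^2 for spacing d. All powers and the single-interferer model are
# illustrative assumptions, not anything from the question.
import math

P = 1.0          # transmit power per link, arbitrary units (assumption)
noise = 1e-6     # background noise power, arbitrary units (assumption)
link_len = 1.0   # each pair is 1 m apart, per constraint 8

for d in [2.0, 5.0, 10.0, 50.0]:          # spacing between co-channel pairs, meters
    S = P / link_len**2                    # desired signal, inverse-square law
    I = P / d**2                           # interference from the nearest other pair
    eff = math.log2(1 + S / (noise + I))   # Shannon spectral efficiency, bit/s/Hz
    density = eff / d**2                   # bit/s/Hz per square meter of floor space
    print(f"d = {d:5.1f} m  ->  {eff:5.2f} bit/s/Hz per link, "
          f"{density:.4f} bit/s/Hz per m^2")
```

In this toy model the areal density keeps rising as the spacing shrinks, which matches the comment's conclusion that you want to pack the transmitters as close together as possible.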

1 Answer


A rough rule of thumb is that you can get about a bit per second per hertz of bandwidth. In some radio systems you can get about 4 bits per hertz. Optical systems would be the same, except that it is much harder to build a coherent optical communication system. The rule is roughly independent of the medium, but there is an assumption about the signal-to-noise ratio buried in it, which is why communications engineers also talk about the energy required per bit. You can do the same accounting for acoustic systems.
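
For concreteness, the signal-to-noise assumption buried in that rule of thumb can be made explicit with the Shannon relation, spectral efficiency $= \log_2(1 + \mathrm{SNR})$. A small sketch, where the target efficiencies are just illustrative values:

```python
# The SNR implied by a given "bits per hertz" rule of thumb.
# From Shannon: efficiency = log2(1 + SNR), so SNR = 2**eff - 1.
import math

for eff in [1, 2, 4, 8]:                       # target bits/s per hertz
    snr = 2**eff - 1                           # required linear SNR
    print(f"{eff} bit/s/Hz needs SNR = {snr:5.1f} ({10*math.log10(snr):.1f} dB)")
```

So the "one bit per hertz" figure corresponds to an SNR of about 0 dB, and 4 bits per hertz to roughly 12 dB.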

To look at it more closely, consider the Shannon limit. Most engineered communication systems come close to the full capacity of the channel. In some cases, with multiple antennas and receivers, the analysis is more complicated, and some claim that they exceed the Shannon limit.
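
As a sketch of what the Shannon limit gives for a single point-to-point channel, here is the Shannon-Hartley capacity with a thermal noise floor; the bandwidth, received power, and temperature below are assumed values, not anything from the question:

```python
# Shannon-Hartley capacity C = B * log2(1 + S/N) for one point-to-point
# channel, with thermal noise N = k*T*B. All numbers are illustrative.
import math

k = 1.380649e-23        # Boltzmann constant, J/K
T = 290.0               # receiver noise temperature, K (assumption)
B = 20e6                # channel bandwidth, Hz (assumption: a Wi-Fi-like channel)
S = 1e-9                # received signal power, W (= -60 dBm, assumption)

N = k * T * B           # thermal noise power in the band
C = B * math.log2(1 + S / N)
print(f"noise = {N:.3e} W, capacity = {C/1e6:.1f} Mbit/s ({C/B:.1f} bit/s/Hz)")
```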

Most digital systems also use error correction. That lowers the bits per hertz, but lets you operate at a much lower signal-to-noise ratio. It is really impressive.
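
There is a hard floor on how far that trade can go: from the Shannon relation, the energy per bit must satisfy $E_b/N_0 \ge (2^\eta - 1)/\eta$ at spectral efficiency $\eta$, which tends to $\ln 2 \approx -1.59\ \mathrm{dB}$ as $\eta \to 0$. A quick sketch of that trade-off:

```python
# Coding trade-off: lowering the spectral efficiency lets the required
# energy per bit approach the Shannon floor of ln(2), about -1.59 dB.
import math

for eff in [4, 2, 1, 0.5, 0.1, 0.01]:          # bits/s per hertz
    ebn0 = (2**eff - 1) / eff                   # minimum Eb/N0 at that efficiency
    print(f"{eff:5.2f} bit/s/Hz -> Eb/N0 >= {10*math.log10(ebn0):6.2f} dB")
print(f"limit as eff -> 0: {10*math.log10(math.log(2)):.2f} dB")
```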

But as a back-of-the-envelope estimate, about a bit per second per hertz works anywhere in the spectrum, assuming you have a transmitter and receiver that are sensitive to the field rather than the intensity.
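
Applying that rule of thumb to the question's constraints gives an order-of-magnitude ceiling per region of spatial reuse; this is a sketch only, and the $bits/s/m^2$ figure then depends on how tightly such regions can be packed, as discussed in the comments:

```python
# Back-of-the-envelope for the question's "reasonable" spectrum: ~1 bit/s per
# hertz across everything from near-DC up to the 380 nm cutoff in constraint 1.
# Order-of-magnitude only; ignores how the band would be split up or reused.
c = 2.998e8                     # speed of light, m/s
f_max = c / 380e-9              # frequency at the 380 nm wavelength cutoff, Hz
f_min = 1.0                     # lower edge per constraint 1 (assumption), Hz

bits_per_hz = 1.0               # the rule of thumb from this answer
total = bits_per_hz * (f_max - f_min)
print(f"f_max ~ {f_max:.2e} Hz, so ~{total:.1e} bit/s per spatial-reuse cell")
```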

UVphoton