I'm not a physicist, so my apologies if this isn't worded correctly, or if it's not even a valid question.
I was recently learning how WiFi works, and how the slowest devices on a channel can slow down access for everyone because receivers need to be able to make out each device's transmissions. That, along with the observation that signal strength falls off with distance according to the inverse-square law, made me wonder:
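For context, the falloff I have in mind is just the power flux of an ideal isotropic (omnidirectional) transmitter spreading over a sphere, which is where my inverse-square intuition comes from:

$$ S(d) = \frac{P_t}{4\pi d^2} \quad \left[\mathrm{W/m^2}\right] $$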
What is the maximum theoretical shared digital bitrate density of the entire reasonable portion of the electromagnetic spectrum?
Some clarifications and assumptions:
- By reasonable I mean no wavelengths hazardous to life (so only wavelengths of roughly 380 nm and longer, nothing more energetic than visible light) and only frequencies reasonable in the real world (probably much greater than 1 Hz? Unsure here).
- Omnidirectional antennas only, no directed beams/lasers
- Assuming no rain, trees, or anything else interfering with line of sight (I'm curious what would happen if an atmosphere was present, but assume a vacuum for this question)
- A planar density (no stacking antennas vertically)
- Digital bandwidth/bitrate is the sum of all bits (or symbols?) successfully transmitted between pairs of devices/antennas; no multicast or anycast is present.
- Adding another device/antenna may be possible, but would bring the total bitrate down due to collisions.
- Assume perfect antennas with reasonable available power.
- Data must be transmitted over at least a meter within each communicating pair (devices can be closer than a meter as long as either A: the receiver could still decode the transmission if it were positioned a meter away, or B: they aren't talking to each other).
- Each device/antenna may talk with as many other devices as needed.
- Each device keeps time as accurately as physically possible.
- Any data encoding schemes that are theoretically possible can be used.
- The cosmic microwave background & other interstellar sources of noise are present.
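Regarding that last bullet: my (possibly naive) mental model is to treat the CMB as a thermal noise floor at roughly 2.7 K, so the noise power a receiver picks up scales with its bandwidth. A quick sketch of what I mean (Python, numbers are just illustrative):

```python
# My assumption (not sure this is the right physics): treat the CMB as an
# antenna noise temperature of ~2.725 K, so noise power N = k * T * B.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T_cmb = 2.725        # approximate CMB temperature, K

def cmb_noise_power(bandwidth_hz):
    """Thermal noise power (watts) collected over the given bandwidth."""
    return k_B * T_cmb * bandwidth_hz

print(cmb_noise_power(1e9))   # ~3.8e-14 W for a 1 GHz channel
```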
That's all of the relevant constraints I can think of at this time. Do let me know if you think I should add or remove a constraint or assumption, as I'm a CS person, not a physicist.
I am thinking the answer would be in $\mathrm{bits/s/m^2}$, though perhaps only valid for areas larger than some minimum size? I'm also assuming the density of devices (devices per $\mathrm{m^2}$) would show up in the answer somewhere, but I have no idea how it would fit in.
I'm aware of the Shannon-Hartley theorem and of this related question: "What is the relationship between bitrate and bandwidth for optical channels?", but that's for a single point-to-point channel, not a shared medium where signal strength falls off with distance.
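For reference, the single-link calculation I do know how to set up looks roughly like the sketch below, combining the inverse-square flux with Shannon-Hartley. The transmit power, receive aperture, and bandwidth are placeholder assumptions of mine, and there's no interference term yet, which is exactly the part I don't know how to handle for a shared medium:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def single_link_capacity(p_tx_w, distance_m, rx_aperture_m2,
                         bandwidth_hz, noise_temp_k=2.725):
    """Shannon-Hartley capacity (bits/s) of one isolated link, assuming an
    isotropic transmitter and a receiver that collects the flux crossing
    rx_aperture_m2. No interference from other transmitters is modeled."""
    flux = p_tx_w / (4 * math.pi * distance_m ** 2)   # W/m^2 at the receiver
    signal = flux * rx_aperture_m2                    # received power, W
    noise = k_B * noise_temp_k * bandwidth_hz         # thermal noise floor, W
    return bandwidth_hz * math.log2(1 + signal / noise)

# e.g. 1 mW transmitter, 1 m away, 1 cm^2 aperture, 1 GHz of bandwidth
print(single_link_capacity(1e-3, 1.0, 1e-4, 1e9))
```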