57

Is there a physical limit to data transfer rate (e.g. for USB 3.0, this rate is a few Gbit per second)? I am wondering if there is a physical law giving a fundamental limit to data transfer rate, similar to how the second law of thermodynamics tells us perpetual motion machines cannot exist and relativity tells us going faster than light is impossible.

knzhou
  • 101,976
Shing
  • 2,774
  • I have done some searching on Google, but the suggested sites seemed to be about quite different matters. – Shing Apr 30 '18 at 08:44
  • 6
    Well, the speed of light is the maximum speed anything can go, so I would say that. – Naz Apr 30 '18 at 12:26
  • Sixty Symbols did a video on the subject. – Arthur Apr 30 '18 at 12:40
  • 12
    This is not a meaningful question, because it brings exotic answers like a pipe with black holes moving near the speed of light. A meaningful question would be, for example, the maximum transfer rate at a given transmission power or some other reasonable physical limitation that would relate to practical applications. – safesphere Apr 30 '18 at 14:29
  • I'm voting to close this question as off-topic because it has no meaning. It fails to take into account, for example, parallel pipes or the potential number of quantum states available. – Carl Witthoft Apr 30 '18 at 15:05
  • Might be somewhat related to https://physics.stackexchange.com/questions/35674/is-time-continuous-or-discrete – rackandboneman Apr 30 '18 at 19:19
  • 15
    @Naz that's how fast you can receive the first bit, has nothing to do with bandwidth :) – EralpB Apr 30 '18 at 19:38
  • 8
    This is more of a practical concern than a theoretical one, but Charlie Stross once quipped that "once you get up into X-ray frequencies your network card becomes indistinguishable from a death ray." – zwol Apr 30 '18 at 23:19
  • 10
    @safesphere: Somebody once told me that the data rate achieved by a pipe containing black holes moving at the speed of light is equal to the data rate you get for signal-to-noise ratio when you plug Planck-temperature thermal radiation for signal, and vacuum fluctuations for noise, into Shannon's formula. So your speculation indeed seems to be the correct answer to the question, and the maximum rate can be calculated exactly—not that it's anywhere close to achievable. – Peter Shor May 01 '18 at 15:44
  • Seems to me you're simply talking about bandwidth. – Hot Licks May 02 '18 at 01:22
  • @PeterShor Yes, we've discussed a similar matter before: https://physics.stackexchange.com/questions/359683/number-of-photons-required-for-communication – safesphere May 02 '18 at 11:41
  • @safesphere thanks for your comment. I will ask a better posed question next time. As for this question, I am still thinking how to ask it better. I would also love an answer covering how this question can be asked meaningfully. – Shing May 02 '18 at 15:25
  • 2
    @Shing I think your question is fine (or close to) as it is. You could perhaps be more specific about looking for an absolute universal limit, disregarding practicality, but Nat's answer (or Peter Shor's suggestion of Shannon's formula with the most extreme values) suggest that your question is definitely answerable. – mbrig May 02 '18 at 15:34
  • I don't know any maths, but I know once the smart people figure out quantum entanglement, it will make the "distance" factor disappear, and it will just be a matter of how fast it is possible to change the states and thus "transfer" data instantly on the other side of the globe or from earth to mars or wherever. – Pierre Oct 07 '20 at 05:12

3 Answers

63

tl;dr- The maximum data rate you're looking for would be called the maximum entropy flux. Realistically speaking, we don't know nearly enough about physics yet to meaningfully predict such a thing.

But since it's fun to talk about a data transfer cord that's basically a $1\,\mathrm{mm}$ tube containing a stream of black holes being fired near the speed of light, the answer below shows an estimate of $1.3{\cdot}{10}^{75}\frac{\mathrm{bit}}{\mathrm{s}}$, which is about $6.5{\cdot}{10}^{64}$ times faster than the current upper specification for USB, $20\frac{\mathrm{Gbit}}{\mathrm{s}}=2{\cdot}{10}^{10}\frac{\mathrm{bit}}{\mathrm{s}}$.


Intro

You're basically looking for an upper bound on entropy flux:

  • entropy: (the logarithm of) the number of potential states which could, in theory, codify information;

  • flux: rate at which something moves through a given area.

So,$$\left[\text{entropy flux}\right]~=~\frac{\left[\text{information}\right]}{\left[\text{area}\right]{\times}\left[\text{time}\right]}\,.$$ Note: If you search for this some more, watch out for "maximum entropy thermodynamics"; "maximum" means something else in that context.

In principle, we can't put an upper bound on stuff like entropy flux because we can't claim to know how physics really works. But we can speculate about the limits allowed by our current models.

Speculative physical limitations

Wikipedia has a partial list of computational limits that might be estimated given our current models.

In this case, we can consider the limit on maximum data density, e.g. as discussed in this answer. Then, naively, let's assume that we basically have a pipeline shipping data at maximum density arbitrarily close to the speed of light.

The maximum data density is limited by the Bekenstein bound:

In physics, the Bekenstein bound is an upper limit on the entropy $S$, or information $I$, that can be contained within a given finite region of space which has a finite amount of energy—or conversely, the maximum amount of information required to perfectly describe a given physical system down to the quantum level.

"Bekenstein bound", Wikipedia [references omitted]

Wikipedia lists it as allowing up to$$ I ~ \leq ~ {\frac {2\pi cRm}{\hbar \ln 2}} ~ \approx ~ 2.5769082\times {10}^{43}mR \,,$$where $R$ is the radius of the system containing the information (in meters), $m$ is its mass (in kilograms), and $I$ comes out in bits.
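As a quick sanity check on that prefactor, here's a minimal Python sketch; the constants are standard CODATA values, and nothing else is assumed:

```python
# Check the numeric prefactor in the Bekenstein bound,
# I <= 2*pi*c*m*R / (hbar * ln 2), in bits per (kg*m).
import math

c    = 299_792_458         # speed of light, m/s
hbar = 1.054_571_817e-34   # reduced Planck constant, J*s

prefactor = 2 * math.pi * c / (hbar * math.log(2))
print(f"{prefactor:.7e}")  # ~2.5769082e+43, matching the value quoted above
```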

Then for a black hole, apparently this reduces to$$ I ~ \leq ~ \frac{A_{\text{horizon}}}{4\ln{\left(2\right)}\,{{\ell}_{\text{Planck}}^2}} \,,$$where

  • ${\ell}_{\text{Planck}}$ is the Planck length;

  • $A_{\text{horizon}}$ is the area of the black hole's event horizon.

This is inconvenient, because we wanted to calculate $\left[\text{entropy flux}\right]$ in terms of how fast information could be passed through something like a wire or pipe, i.e. in terms of $\frac{\left[\text{information}\right]}{\left[\text{area}\right]{\times}\left[\text{time}\right]}.$ But the units here don't work out, because this line of reasoning leads to the holographic principle, which basically asserts that we can't express the maximum information of a region of space per unit of volume, but only per unit of (bounding) area.

So, instead of having a continuous stream of information, let's go with a stream of discrete black holes inside a data pipe of radius $r_{\text{pipe}}$. The black holes' event horizons have the same radius as the pipe, and they travel at $v_{\text{pipe}} \, {\approx} \, c$ back-to-back.

So, information flux might be bound by$$ \frac{\mathrm{d}I}{\mathrm{d}t} ~ \leq ~ \frac{A_{\text{horizon}}}{4\ln{\left(2\right)}\,{{\ell}_{\text{Planck}}^2}} {\times} \frac{v_{\text{pipe}}}{2r_{\text{horizon}}} ~{\approx}~ \frac{\pi \, c }{2\ln{\left(2\right)}\,{\ell}_{\text{Planck}}^2} r_{\text{pipe}} \,,$$where the observation that $ \frac{\mathrm{d}I}{\mathrm{d}t}~{\propto}~r_{\text{pipe}} $ is basically what the holographic principle refers to.

Relatively thick wires are about $1\,\mathrm{mm}$ in diameter, so let's take $r_{\text{pipe}}=5{\cdot}{10}^{-4}\,\mathrm{m}$ to mirror that, which gives (WolframAlpha):$$ \frac{\mathrm{d}I}{\mathrm{d}t} ~ \lesssim ~ 1.3{\cdot}{10}^{75}\frac{\mathrm{bit}}{\mathrm{s}} \,.$$

Wikipedia claims that the maximum USB bitrate is currently $20\frac{\mathrm{Gbit}}{\mathrm{s}}=2{\cdot}{10}^{10}\frac{\mathrm{bit}}{\mathrm{s}}$, so this'd be about $6.5{\cdot}{10}^{64}$ times faster than USB's maximum rate.
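For anyone who wants to reproduce those two numbers, here's a minimal Python sketch of the same back-of-the-envelope arithmetic; the Planck length and speed of light are standard values, and $r_{\text{pipe}}=0.5\,\mathrm{mm}$ is the assumed radius from above:

```python
# Reproduce the black-hole-pipe estimate:
# dI/dt <= pi * c * r_pipe / (2 * ln(2) * l_planck^2)
import math

c        = 299_792_458     # speed of light, m/s
l_planck = 1.616_255e-35   # Planck length, m
r_pipe   = 5e-4            # assumed pipe radius (0.5 mm), m

flux = math.pi * c * r_pipe / (2 * math.log(2) * l_planck**2)  # bit/s
usb  = 20e9                # current max USB rate, bit/s

print(f"flux  ~ {flux:.1e} bit/s")  # ~1.3e75 bit/s
print(f"ratio ~ {flux / usb:.1e}")  # ~6.5e64 times the USB rate
```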

However, to be very clear, the above was a quick back-of-the-envelope calculation based on the Bekenstein bound and a hypothetical tube that fires black holes near the speed of light back-to-back; it's not a fundamental limitation to regard too seriously yet.

Nat
  • 4,640
  • 2
    Incidentally, I'm trying to avoid sinking too much time into what was basically a for-fun calculation, though I'd note that the $\frac{\mathrm{d}I}{\mathrm{d}t}{\propto}r$ relationship that falls short of the classical $\frac{\mathrm{d}I}{\mathrm{d}t}{\propto}r^2$ could be brought closer-to-classical by making the pipes arbitrarily small and putting them together in a parallel configuration. The issue's basically the black holes collapsing together, so some additional spacing would be required even in the serial case. – Nat Apr 30 '18 at 12:31
  • 2
    So, how much would 5 m of this wire weigh while data's in transfer? I don't want my downstairs neighbors to complain about a ruined ceiling... – John Dvorak Apr 30 '18 at 12:45
  • 5
    @JohnDvorak Probably about 280 times the mass of Earth before factoring in that they're all moving arbitrarily close to the speed of light; but, gotta make some allowances to have the latest tech, right? – Nat Apr 30 '18 at 12:55
  • 2
    Very interesting, I like this. Just as a black hole's entropy is limited by its surface area instead of its volume, the pipe's capacity for entropy flux is limited by its circumference instead of its cross section. I wonder what that really means. – N. Virgo Apr 30 '18 at 13:31
  • 1
    @JohnDvorak The problem isn't so much transmitting them at the speed of light, it's slowing them down at the other end. A supernova would likely be the least of your problems. – SGR Apr 30 '18 at 14:14
  • 4
    Now we just need an SI prefix so we can properly advertise cables instead of saying the transfer rate is 1.3 ridicubits – David Starkey Apr 30 '18 at 14:15
  • 1
    The critical next step here, however, is having some sort of computational system at the receiving end of that data stream which is capable of consuming and processing the data at the rate at which it is delivered. I might hazard that the physical limits to data transmission will ultimately be less about the medium of transmission and more about the transceivers at either end and the ability to encode and decode information into and from such a stream. – J... Apr 30 '18 at 15:02
  • 4
    This is a great derivation, but my semitrailer full of flash drives can beat the pants off your average data rate. (and if it doesn't I'll just add a second trailer section) – Carl Witthoft Apr 30 '18 at 15:06
  • 2
    This answer conflates entropy with information without considering how much that information is "signal" and how much is "noise." A black hole has large entropy precisely because you cannot tell what is inside it. A terabyte flash drive has information entropy $2^{40}$ bits, because all $2^{2^{40}}$ possible configurations of the drive are distinguishable. – rob Apr 30 '18 at 18:21
  • 3
    @rob Yeah, technically I was referring to the limit at which storage media would collapse into a black hole if any more dense. Wikipedia included "It happens that the Bekenstein-Hawking Boundary Entropy of three-dimensional black holes exactly saturates the bound [...]" -"Bekenstein bound", so I used the black hole equation as the limit, as seemed to be the heuristic argument others had put forth. Was it misapplied here? – Nat Apr 30 '18 at 18:53
  • Unfortunately black holes that small may (violently) evaporate before reaching their destination. – OrangeDog May 01 '18 at 15:01
  • 3
    @DavidStarkey Given that we're using quantum phenomena, wouldn't that be ridiqubits instead? – user May 01 '18 at 15:35
  • @Nat Any idea on how this compares to Peter Shor's suggestion in the comments about "data rate you get for signal-to-noise ratio when you plug Planck-temperature thermal radiation for signal, and vacuum fluctuations for noise, into Shannon's formula" ? I'd do it myself but I can't find/identify the right number to use for the power of vacuum fluctuations... – mbrig May 02 '18 at 15:36
  • @mbrig Hah that'd be a good calculation! Honestly I initially took that comment as kidding; the issue's that these calculations involve enough assumptions, approximations, and blatant fudging that I wouldn't understand them to be exact, and I'd be particularly surprised if two different calculation approaches managed to come up with the same result without being contrived to do so. In fact, given the subjectivity of entropy and the failure for physics to reach unification at these limits, it'd be almost frightening. – Nat May 03 '18 at 03:36
  • I have to admit, though, I'm curious about what the second approach's result would be. Comparing/contrasting such model approaches would seem to make for an interesting line of investigation. – Nat May 03 '18 at 03:48
33

The Shannon-Hartley theorem tells you the maximum data rate of a communications channel, given its bandwidth and signal-to-noise ratio:

$$ C = B \log_2\left(1+\frac{S}{N}\right) $$

where $C$ is the data rate in bits per second, $B$ is the bandwidth in hertz, $S$ is the signal power and $N$ is the noise power.

Pure thermal noise power in a given bandwidth $B$ at temperature $T$ is given by:

$$ N = k_BTB $$

where $k_B$ is the Boltzmann constant.

So, for example, if we take the bandwidth of WiFi (40 MHz) at room temperature (298 K) using 1 W of signal power, the theoretical maximum data rate for a single channel is:

$$ 40 \times 10^6 \times \log_2\left(1 + \frac{1}{1.38\times 10^{-23} \times 298 \times 40 \times 10^6}\right) = 1.7 \times 10^9 \mathrm{\;bit\,s^{-1}} = 1.7 \mathrm{\;Gbit\,s^{-1}} $$
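If you want to verify that number yourself, here's a short Python sketch of the same calculation; the bandwidth, temperature, and signal power are just the example values above:

```python
# Shannon-Hartley capacity with pure thermal noise:
# C = B * log2(1 + S/N), with N = k_B * T * B
import math

k_B = 1.380_649e-23   # Boltzmann constant, J/K
B   = 40e6            # bandwidth, Hz (WiFi channel)
T   = 298.0           # temperature, K (room temperature)
S   = 1.0             # signal power, W

N = k_B * T * B                 # thermal noise power, W (~1.6e-13)
C = B * math.log2(1 + S / N)    # channel capacity, bit/s

print(f"C ~ {C:.2e} bit/s")     # ~1.7e9 bit/s = 1.7 Gbit/s
```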

In a practical system, the bandwidth is limited by the cable or antenna and the speed of the electronics at each end. Cables tend to filter out high frequencies, which limits the bandwidth. Antennas will normally only work efficiently across a narrow bandwidth. There will be significantly larger sources of noise from the electronics, and interference from other electronic devices which increases $N$. Signal power is limited by the desire to save power and to prevent causing interference to other devices, and is also affected by the loss from the transmitter to the receiver.

A system like USB uses a simple on-off electronic signal operating at one frequency, because that's easy to detect and process. This does not fill the bandwidth of the cable, so USB operates a long way from the Shannon-Hartley limit (the limiting factors have more to do with the transceivers, i.e. the semiconductors). On the other hand, 4G (and soon 5G) mobile phone technology does fill its bandwidth efficiently, because everyone has to share the airwaves and providers want to pack in as many people as possible, and those systems are rapidly approaching the limit.

patstew
  • 471
4

No, there is no fundamental limit on overall transfer rate. Any process that can transfer data at a given rate can be done twice in parallel to transfer data at twice that given rate.
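As a minimal, purely schematic illustration of the claim (it deliberately ignores the physical-space and synchronization objections raised in the comments below; the function name is just for this sketch):

```python
# Aggregate throughput of n identical, independent channels run in parallel.
def aggregate_rate(rate_per_channel_bps: float, n_channels: int) -> float:
    return rate_per_channel_bps * n_channels

print(aggregate_rate(2e10, 2))  # two 20 Gbit/s links: 4e10 bit/s
```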

Chris
  • 17,189
  • 9
    Tell that to my DSL provider! :D – AnoE Apr 30 '18 at 13:01
  • 1
    @AnoE: You can get all the internet you want, price is not linear though. – Christian Apr 30 '18 at 13:42
  • 3
    Could the downvoters explain their votes? This seems a reasonable answer to me, but I could be missing something. Thanks. – AccidentalFourierTransform Apr 30 '18 at 14:02
  • This would seem only to be true if you assume (1) that the space available is not finite, whereas in practice increasing the size of the data transfer device is presumably limited; and (2) that integrating the data transfer from the two processes is irrelevant? – Jack Aidley Apr 30 '18 at 14:11
  • 25
    The answer is (a) deliberately not answering the spirit of the question and (b) wrong anyway. Even if you assumed that adding extra cables was an option you would eventually hit a limit due to the physical space they would require and the need to combine the data from each cable. If you assume the end-points are points then only one cable can have a minimum distance and the other cables must take a longer route leading eventually to timing issues. Anyway, who wants a multi-strand cable that is thicker than it is wide? In the end, you must answer the evaded question - "what is the serial rate?" – rghome Apr 30 '18 at 14:13
  • 10
    I really don't see why this answer should be downvoted. Sure it's incomplete, but it brings a legitimate and interesting point. – user140255 Apr 30 '18 at 14:22
  • @Undead if you just want to make a point, then you add a comment. – rghome Apr 30 '18 at 14:44
  • 4
    @rghome: No, if you want to suggest improvements to the question or request clarification, you add a comment. If you just want to make a point, then you invite the other party to a chat room. :) This is an answer, albeit a wrong one for the reason you have laid out (already a +1 from me on that :P) – Lightness Races in Orbit Apr 30 '18 at 14:44
  • @rghome you are forgetting the difference between startup latency and average throughput. I don't care if some cables are longer than others since I can re-sync data at the receiver if necessary. – Carl Witthoft Apr 30 '18 at 15:12
  • 2
    @CarlWitthoft It's not about the length of the cables, it's about the cross-section area of the cables. If you require a cable or a set of cables with 1 km^2 total cross-sectional area, you simply can't transfer data to a desktop device at that rate. – JiK Apr 30 '18 at 15:38
  • 2
    Downvoted the answer because it is not verifiable. There are no details/requirements/"specs" whatsoever. For example, while the OP does not specify how the information should be transferred (i.e. via a cable or through vacuum etc.), he does mention mundane things like USB 3.0 and such. Any meaningful information device has a spatial volume (in the case of a cable, its diameter; in the case of pure light, the diameter of the "beam" and so on), even if not specified as such. It is then not clear that any process can simply be run twice in parallel in the same diameter! – AnoE Apr 30 '18 at 15:52
  • 1
    @AccidentalFourierTransform You're missing synchronization issues due to phase noise for a large number of parallel lines. – Massimo Ortolano May 01 '18 at 05:33
  • 2
    @JiK The question doesn't mention having to transfer data to a desktop device, or to any device. – curiousdannii May 02 '18 at 02:06
  • "A lot of parallel Planck space" sounds like a neat name for a data centre. Discussing eventual size limits for parallelling transfer does little for the answer without discussing the individual size limits. Additionally, data that is already touching the receiver as it is sent is received instantaneously as may be the case if a black hole is touching an anti-black hole (theoretical, made of dark matter - a better question is what binds it, perhaps another black hole inside) so, discussing the length of transfer is also of relevance. Speed of light in parallel seems to be the actual answer. – Willtech May 02 '18 at 12:31