
The idea for this question comes from an example in cryptography, where supposedly 256-bit symmetric keys will be enough for all time to come (brute-forcing a 256-bit key is roughly equivalent to counting to $2^{255}$, with some constant factor in front). While I don't really doubt this, I think it is an interesting thought experiment to ask what number (approximately, of course) a theoretical "perfectly efficient" computer (define this as you want) could count to, given infinite time and all the energy, including matter but not dark matter, in our galaxy. Counting to $x$ is defined as having some physical object go through $x$ different, predefined, measurable states. Sadly I'm missing the theoretical background to do this calculation properly myself; I could try it, but I would have no idea whether I'd missed something essential. I hope a "fun" question like this isn't out of scope for this site (feel free to direct me to a better place to ask it).

Alternatively: What about all the energy in the known Universe?

Since the idea for this question was key-length in cryptography, feel free to consider (or not consider) Grover's algorithm.

Edit: As a comment suggests, if there's no good answer on what to consider a "perfectly efficient computer", maybe just take the values for a known processor.

cooky451
  • hi cooky - I have a feeling your question is teetering on being loved/hated in physics SE. For one, I don't know how one measures the work of "computer counting", or if there is an ideally efficient computer. – anon01 May 22 '16 at 14:11
  • You could maybe frame it like: how much energy would it take the most efficient computers to...(insert your task) – anon01 May 22 '16 at 14:13
  • Using classical approaches, one can count to 2^256 using 3/4 of the mass/energy of the galaxy. I'd re-run the math for you, but it looks like you already accepted a quantum answer. – Cort Ammon May 22 '16 at 16:45
  • @CortAmmon That looks like an interesting answer too, please elaborate - I for one have no idea what you're getting at. Moreover, given the OP's needs (estimates for judging soundness of cryptographic schemes by), a variety of different approaches to the problem would be helpful. – Selene Routley May 22 '16 at 23:28
  • Given the context of your question, you should also look at the Bremermann limit, which apparently has some history of being used to assess the safety of cryptographic algorithms. – Selene Routley May 23 '16 at 00:42
  • @CortAmmon BTW Mine isn't a "quantum" answer, it's a "reversible computer" answer. I only mention quantum computing because it's seen nowadays as a promising candidate for realizing a reversible computer. Bennett, Feynman and others who thought about these things actually made their thought experiments with billiard ball computers, wherein little balls bounce off one another elastically to influence each other's paths and thus do computations without any loss of energy. Look at the Bennett paper I cite for a description. – Selene Routley May 24 '16 at 11:37

1 Answer


A "perfectly efficient" computer can mean many things, but, for the purposes of this answer, let's take it to mean a reversible computer (explained further as we go).

The theoretical lower limit to energy needs in computing is the Landauer limit, which states that the forgetting of one bit of information requires the input of work amounting to $k\,T\,\log 2$, so as to comply with the second law of thermodynamics. If the computer is reversible, i.e. its state at all times can be inferred from its state at any other time, then there is no theoretical lower limit to its energy needs. By "state" here we mean the computer-theoretic state of the CPU, not the physical quantum state (the former being a very small part of the latter; microscopic laws are reversible, so the full quantum state at any time can always, in theory, be inferred from the full quantum state at any other time). An example of a nonreversible computation is one where you add two numbers and write the result over the memory formerly occupied by the addends. The two addends cannot be inferred from the computer's state (i.e. the sum) after the addition has taken place.

Briefly, the reason for this situation is that if your calculation forgets, Nature does not: if you erase memory, then that "erased" information must somehow wind up encoded in the full quantum state of the computer, since microscopic laws are indeed reversible. The only way a system can "absorb more information", i.e. fully encode its past in its quantum state, is by accessing more and more quantum states, and that almost always means getting hotter [see 1]. So somewhere along the line you have to add energy to make this happen, and eventually you'll need to cool the computer to keep it working. The second law of thermodynamics then shows that if we want to keep the computer at a constant macrostate, we need to input the amount of work prescribed by Landauer's principle to do so [see ref. 2].
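For a sense of scale, here is a minimal numerical sketch of the Landauer cost $k\,T\,\log 2$ per forgotten bit (Python; the Boltzmann constant is the standard value, and the temperatures are just illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def landauer_cost_per_bit(T_kelvin):
    """Minimum work (joules) needed to erase one bit at temperature T."""
    return k_B * T_kelvin * math.log(2)


for T in (2.7, 77.0, 300.0):  # CMB temperature, liquid nitrogen, room temperature
    print(f"T = {T:6.1f} K : {landauer_cost_per_bit(T):.3e} J per erased bit")
```

At room temperature this is about $3\times10^{-21}$ joules per erased bit, which is why the limit is invisible in everyday hardware.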

Now let's look at your problem. Counting can clearly be made into a reversible computation: each step is invertible, and you can imagine simply clocking a simple digital counter backwards to achieve this. So in theory we could build a quantum (or other reversible) computer that counts with no energy input whilst it is counting. However, when tallying up the forgetting of information, one needs to take initialization into account. That is, you need to begin with initialized registers to count with. You start your machine up by initializing them all to nought ... but that means that there is a quantum state of each register that is "forgotten" as the machine is initialized. So, if you need a memory of $N$ bits for your counting, you need to come up with $N\,k\,T\,\log 2$ joules to get your reversible computer initialized. Wikipedia tells me the Milky Way's mass is estimated to be $10^{12}$ solar masses, or (multiplying by $c^2\approx10^{17}\,{\rm J\,kg^{-1}}$) about $2\times 10^{30}\times 10^{12}\times 10^{17} = 2\times 10^{59}$ joules. If you can cool your computer to the temperature of the cosmic microwave background radiation, $2.7\,{\rm K}$, then the Landauer limit implies you can buy the initialization of $2\times 10^{59} / (2.7\times 1.38\times 10^{-23}\times \log 2) \approx 8\times 10^{81}$ bits. You can't run your computer below $2.7\,{\rm K}$, since it would then need energy for artificial cooling below its environment's temperature.
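The arithmetic of the last few sentences, as a small sketch (the mass and $c^2$ figures are the same rough roundings used above):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
solar_mass_kg = 2.0e30  # rough solar mass
c_squared = 1.0e17      # J per kg, i.e. c^2 rounded as above

galaxy_energy = 1.0e12 * solar_mass_kg * c_squared  # ~2e59 J for ~10^12 solar masses
T_cmb = 2.7                                         # K, cosmic microwave background

bits = galaxy_energy / (k_B * T_cmb * math.log(2))  # Landauer cost per initialized bit
print(f"Energy budget               : {galaxy_energy:.1e} J")
print(f"Bits initialisable at 2.7 K : {bits:.1e}")  # roughly 8e81
```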

So that, then, is your rough answer: in theory you could count to the number

$$2^{8\times 10^{81}}$$

with a reversible implementation of a counter given the stated energy budget.
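To connect this back to the 256-bit keys that motivated the question: a reversible counter needs only as many register bits as the logarithm of the target count, so the initialization cost of counting to $2^{256}$ is utterly negligible on this budget. A minimal sketch with the same rough constants:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T_cmb = 2.7         # K

register_bits = 256  # a 256-bit register suffices to count to 2^256
init_cost = register_bits * k_B * T_cmb * math.log(2)

print(f"Initialization cost of a 256-bit reversible counter: {init_cost:.1e} J")  # ~7e-21 J
```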

Another limit that may be of interest from the cryptographic viewpoint is the Bremermann limit, which limits how fast computations can evolve through their successive steps.
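As a rough illustration (a speed limit, not an energy argument like the one above), the Bremermann limit is about $c^2/h\approx1.36\times10^{50}$ bit-operations per second per kilogram of mass, so for the same galactic mass budget:

```python
h = 6.626e-34            # Planck constant, J s
c = 3.0e8                # speed of light, m/s
galaxy_mass_kg = 2.0e42  # ~10^12 solar masses, rough

ops_per_second = galaxy_mass_kg * c**2 / h  # Bremermann limit for this mass
steps = 2**256                              # steps to exhaust a 256-bit keyspace

print(f"Maximum rate         : {ops_per_second:.1e} operations per second")
print(f"Time for 2^256 steps : {steps / ops_per_second:.1e} seconds")
```

In other words, speed is not the obstacle for a count of $2^{256}$; the interesting limits for this question are the thermodynamic ones.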

It should be noted how difficult it is to achieve the Landauer limit. If our counter forgets even one bit per counting cycle, the limit reduces to the still colossal $8\times 10^{81}\approx 2^{272}$. Yockey [see reference 3] claims in the early chapters of his book that the phenomenon of DNA replication during cell division, thought of as a computer algorithm, is the most efficient computation known, and that it consumes roughly one order of magnitude more energy than the Landauer limit, that is, roughly $10\,k\,T$ per forgotten bit. In the light of the Landauer limit, modern computers are staggeringly inefficient: 32 GByte of RAM being overwritten at 1 GByte per second and consuming 5 watts at 300 K in doing so (these are the figures for the computer these words are being written on) represents a forgetting that is eleven orders of magnitude more wasteful ($5 / (8\times 10^9 \times k \times 300\times\log 2)\approx 2\times 10^{11}$) than the Landauer limit.
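The last figure comes from comparing the machine's actual power draw with the minimum power Landauer's principle assigns to its rate of forgetting; as a sketch (the 5 W, 1 GByte/s and 300 K figures are the ones quoted above for that particular machine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

power_watts = 5.0            # power attributed to the RAM above
overwrite_rate_bits = 8.0e9  # 1 GByte/s = 8e9 bits/s being overwritten (forgotten)
T = 300.0                    # K, room temperature

landauer_power = overwrite_rate_bits * k_B * T * math.log(2)  # minimum power to forget at this rate
print(f"Landauer minimum power : {landauer_power:.2e} W")
print(f"Actual power           : {power_watts} W")
print(f"Inefficiency factor    : {power_watts / landauer_power:.1e}")  # ~2e11
```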


References and Footnotes:

[1]: To deepen your understanding of this statement, try working out and plotting the Shannon entropy of specification of the state of an ensemble of $N$ quantum harmonic oscillators at thermodynamic equilibrium as a function of temperature (answer: $\left(\frac{e^{\beta_\omega}\,\beta_\omega}{e^{\beta_\omega}-1}-\log\left(e^{\beta_\omega}-1\right)\right)/\log 2$ bits per oscillator, where $\beta_\omega = \hbar\omega/(k\,T)$). You can immediately see what's going on: the Boltzmann probability distribution is here $p(n)\propto\exp\left(-\left(n+\frac{1}{2}\right)\frac{\hbar\,\omega}{k\,T}\right)$, and its tail gets longer, "accessing more states", as $T$ rises.
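If you'd rather check the closed form than derive it, a direct numerical evaluation of the Shannon entropy of the Boltzmann distribution does the job (the cutoff `n_max` is just chosen large enough that the neglected tail is negligible):

```python
import math


def oscillator_entropy_bits(beta_omega, n_max=100000):
    """Shannon entropy (bits) of the Boltzmann distribution over harmonic
    oscillator levels, p(n) = (1 - exp(-x)) * exp(-n*x) with x = hbar*omega/(k*T)."""
    norm = 1.0 - math.exp(-beta_omega)
    entropy_nats = 0.0
    for n in range(n_max):
        p = norm * math.exp(-n * beta_omega)
        if p > 0.0:
            entropy_nats -= p * math.log(p)
    return entropy_nats / math.log(2)


for x in (5.0, 1.0, 0.1, 0.01):  # beta_omega: large = cold, small = hot
    print(f"beta_omega = {x:5.2f} : {oscillator_entropy_bits(x):7.3f} bits per oscillator")
```

The entropy per oscillator grows without bound as $T$ rises (i.e. as $\beta_\omega\to0$), which is the "accessing more states" behaviour referred to above.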

[2] An excellent review paper for these concepts is Charles Bennett, "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21, No. 12, 1982.

[3] "Information Theory, Evolution, and the Origin of Life", Hubert P. Yockey As a non biologist I don't feel qualified to judge this text. I did feel, however, that I understood the early chapters whence I gleaned the assertion about the efficiency of DNA replication well enough to be reasonably confident in the assertion's soundness, but I found most of the text beyond Chapter 2 utterly incomprehensible.

  • That qualitative explanation of why irreversible computation takes energy was great. I'd come across that fact, but hadn't seen an explanation before. – Peter Cordes May 22 '16 at 17:56
  • For completeness, it might be worth noting that if you need to do any kind of non-reversible computation at each step, then the limit becomes much lower. In fact, using your figures and assuming the erasure of (at least) one bit per step, the limit becomes approximately $8 \times 10^{81} \approx 2^{272}$ steps. – Ilmari Karonen May 22 '16 at 22:46
  • @IlmariKaronen Indeed. The Landauer limit is way below anything we look like achieving. I've added a bit following your suggestion – Selene Routley May 23 '16 at 00:38
  • @PeterCordes Thanks; the explanation of course is not altogether mine but one I got straight reading the Bennett review that I've now cited in the updated answer. If you're interested in this stuff, that paper is well worth a read. – Selene Routley May 23 '16 at 00:40
  • @Bosoneando Thanks grandly. As I said to Peter Cordes, if you're interested in this stuff, the Bennett reference I've just added to the answer is well worth a read. – Selene Routley May 23 '16 at 00:41
  • Rod, based on a book or not, your answer is flat-out impressive! – Terry Bollinger May 23 '16 at 01:46
  • Also, as a strong advocate of the remarkable efficiency of biological systems at intelligent calculations, I was very intrigued by your remarks about the efficiency of DNA replication as a form of computation. – Terry Bollinger May 23 '16 at 01:47
  • @TerryBollinger Thanks Terry. It is indeed intriguing. It's a shame I couldn't understand most of the book I am quoting. There are of course references in that book, but I've not gotten around to chasing them up. – Selene Routley May 23 '16 at 01:51
  • Given that there are ~$10^{80}$ atoms in the universe, it turns out that the limiting factor in a binary computer would be the number of atoms (assuming you can get one bit per atom). We may figure out how to encode more densely than that someday, but we are still in the many-atoms-per-bit regime. It is somewhat interesting, though, that these two numbers are within 2 orders of magnitude of each other. – user121330 May 23 '16 at 23:43
  • @user121330 I think the two numbers' being roughly equal is co-incidence, as I have taken the total energy of the Milky Way, not the observable universe. As Jerry Schirmer's answer here points out, there is no information theoretic limit to the number of bits that can be encoded into one hydrogen atom. But you will eventually reach the Bekenstein bound. – Selene Routley May 24 '16 at 01:59
  • Right, the Milky Way. So there's only ~$10^{68}$ atoms to work with, and we'd need to encode ~$10^{13}$ bits per atom. $\delta E_n \gg \Delta E_n$ as a rule, so while one might, with infinite precision, encode the information, one could never measure it, and Hydrogen's excited states are notoriously unstable. You've written a beautiful answer here, but as with any discussion of computation, one must consider the bottlenecks. I'd guess there are 4 really great questions here - CPU, Memory, Bandwidth and Display. – user121330 May 24 '16 at 19:44
  • There is a Nature paper that experimentally validates the Landauer Principle. – WYSIWYG Jun 09 '16 at 08:39
  • @WYSIWYG Thanks for the reference. There is now quite a body of experimental work seeking to directly validate the LP. See for example here. – Selene Routley Jun 09 '16 at 08:43
  • @WetSavannaAnimalakaRodVance Yeah, I came across another paper while looking for the above one. Overall, it seems very interesting. – WYSIWYG Jun 09 '16 at 08:48
  • I'm curious: What exactly is the difference between a reversible and an irreversible counter? You say the irreversible one is one that forgets, but isn't it true that every time we click the counter up by one we forget its previous state? Or is the fact that we can also just as easily click it down by one sufficient to make it reversible, because this allows us to infer what its state was in the past (that is, the map of past to future states is bijective on the state space)? You mention "you can imagine simply clocking a simple digital counter backwards to achieve this" but this statement is a bit unclear. – The_Sympathizer Mar 10 '18 at 05:05
  • It suggests somehow that you have to use a backwards counter to implement it, not simply that the fact that you can click it backwards makes it reversible. Rather, it suggests that either you need to include a backwards counter, in which case I don't see how that helps, or you need to tick the counter backwards instead of forwards, which also doesn't seem like it should make any difference. What exactly do you mean here? – The_Sympathizer Mar 10 '18 at 05:07
  • @The_Sympathizer Essentially it's the bijective mapping that is important here. With a counter, if you know the inputs (direction bit) and the state, then you can infer what the state was at the clock cycle beforehand. With the adder, there are many former states (pairs of addends) that can produce the current sum. Certainly quantum counters exist: one can set up a quantum system that visits a predefined set of states in a given order through unitary, reversible evolution. – Selene Routley Mar 10 '18 at 06:06