117

As far as I know, most computers today are built from semiconductor devices, so all the energy they consume eventually turns into heat radiated into space.

But I wonder, is it necessary to consume energy to perform computation?

  • If so, is there a theoretical numerical lower bound on the energy usage? (I don't even have an idea of how to measure the amount of "computation".)

  • If not, is there a physically-practical Turing-complete model that needs no energy?


edit: Thanks to @Nathaniel for quickly answering the question and pointing out that this is actually Landauer's principle. Thanks also to @horchler for referring to the Nature News piece and the related article. There is a lot of useful information in the comments; thank you, everyone! This whole topic is really interesting!

jiakai
  • 1,223
  • 9
    Suggested title change: "Is it necessary to consume energy to perform computation?" At the moment the title looks like you want to know how cooling fans work, but the body text is asking a much deeper physical question. –  Jul 07 '13 at 16:00
  • @ChrisWhite Agreed, that would make a killer title. – Thomas Jul 07 '13 at 17:21
  • 1
    You may be interested in the Applied Cryptography snippet about half way into the post on http://www.schneier.com/blog/archives/2009/09/the_doghouse_cr.html – user Jul 07 '13 at 19:06
  • 1
    Your question made me think of Maxwell's demon. See this Nature News article and the related Nature article by Bérut, et al. on the experimental verification of Landauer's principle. – horchler Jul 08 '13 at 01:12
  • 1
    See also "The Thermodynamics of Computation - A Review" by Charles Bennett, Int. J. Theor. Phys., Vol. 21, No. 12, 1982, for a most excellent review of Landauer's principle discussed in Nathaniel's answer below. – Selene Routley Jul 08 '13 at 08:32
  • 2
    You might also care to do a rough calculation as to how far we are from these theoretical limits with contemporary technology. A current chip, depending on the algorithm, forgets (raises the Kolmogorov complexity of the surrounding universe) of the order of $10^{10}$ bits per second for a power consumption of 10W. At 300K, I make this to be at least ten orders of magnitude more than the theoretical limit. That's even before reversible computing is sought. – Selene Routley Jul 08 '13 at 08:49
  • 3
    In contrast, a cell's building of a protein forgets about 1.6 bits per amino acid (64 DNA codons code for 20 amino acids) and does it at a cost of between roughly $10\,k T$ and $40\,k T$, thus only one order of magnitude worse than the Landauer limit. See J. Avery, "Information Theory and Evolution" (2003). I guess this is one of the reasons DNA computing might hold some promise. (A rough sketch reproducing this estimate and the previous one follows these comments.) – Selene Routley Jul 08 '13 at 08:52
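The order-of-magnitude comparisons in the last two comments can be reproduced with a few lines of arithmetic. The Python sketch below simply assumes the figures quoted there (a 10 W chip erasing about $10^{10}$ bits per second at 300 K, and roughly $25\,kT$ per amino acid for about 1.6 forgotten bits); these inputs are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope reproduction of the estimates in the comments above.
# Assumed inputs: a ~10 W chip erasing ~1e10 bits/s at T = 300 K, and a cell
# spending ~25 kT per amino acid while forgetting ~1.6 bits.
import math

k_B = 1.380649e-23                  # Boltzmann constant, J/K
T = 300.0                           # ambient temperature, K
landauer = k_B * T * math.log(2)    # Landauer limit: minimum energy per erased bit, J

chip_energy_per_bit = 10.0 / 1e10          # 10 W spread over 1e10 erased bits/s
cell_energy_per_bit = 25 * k_B * T / 1.6   # ~25 kT per ~1.6 forgotten bits

print(f"Landauer limit:    {landauer:.2e} J/bit")
print(f"Modern chip:       {chip_energy_per_bit:.2e} J/bit "
      f"(~{chip_energy_per_bit / landauer:.0e} times the limit)")
print(f"Protein synthesis: {cell_energy_per_bit:.2e} J/bit "
      f"(~{cell_energy_per_bit / landauer:.0f} times the limit)")
```

Running it gives roughly $3\times10^{11}$ for the chip (eleven to twelve orders of magnitude above the limit) and a factor of about twenty for protein synthesis, in line with the figures quoted in the comments.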

2 Answers

132

What you're looking for is Landauer's principle. You should be able to find plenty of information about it now that you know its name, but briefly, there is a thermodynamic limit that says you have to use $k_\mathrm BT \ln 2$ joules of energy (where $k_\mathrm B$ is Boltzmann's constant and $T$ is the ambient temperature) every time you erase one bit of computer memory. With a bit of trickery, all the other operations that a computer does can be performed without using any energy at all.

This set of tricks is called reversible computing. It turns out that you can make any computation reversible, thus avoiding the need to erase bits and therefore use energy, but you end up having to store all sorts of junk data in memory because you're not allowed to erase it. However, there are tricks for dealing with that as well. It's quite a well-developed area of mathematical theory, partly because the theory of quantum computing builds upon it.
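To make the reversible-computing idea concrete, here is a minimal Python sketch (an illustration added here, not part of the original answer) of the Toffoli, or controlled-controlled-NOT, gate, a standard universal reversible gate. It is a bijection on its three input bits and its own inverse, so applying it never destroys information, and with one input fixed to zero it computes AND while keeping its inputs around as the "junk data" mentioned above.

```python
# Minimal sketch of a reversible logic gate: the Toffoli (CCNOT) gate.
from itertools import product

def toffoli(a, b, c):
    """Flip c only when both a and b are 1; leave a and b untouched."""
    return a, b, c ^ (a & b)

states = list(product((0, 1), repeat=3))

# Reversible, part 1: the gate is a bijection on the 8 possible 3-bit states.
assert len({toffoli(*s) for s in states}) == 8

# Reversible, part 2: the gate is its own inverse, so nothing is ever erased.
assert all(toffoli(*toffoli(*s)) == s for s in states)

# With the third bit fixed to 0 the gate computes AND, at the price of
# carrying the original inputs along as junk data.
print(toffoli(1, 1, 0))   # -> (1, 1, 1): the last bit is 1 AND 1
```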

The energy consumed by erasing a bit is given off as heat. When you erase a bit of memory you reduce the information entropy of your computer by one bit, and to do this you have to increase the thermodynamic entropy of its environment by one bit, which is equal to $k_\mathrm B \ln 2$ joules per kelvin. The easiest way to do this is to add heat to the environment, which gives the $k_\mathrm BT \ln 2$ figure above. (In principle there's nothing special about heat, and the entropy of the environment could also be increased by changing its volume or driving a chemical reaction, but people pretty much universally think of Landauer's limit in terms of heat and energy rather than those other things.)
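For a sense of the scale involved, plugging room temperature into the formula (a worked example, using $T = 300\,\mathrm{K}$):

$$Q = T\,\Delta S = k_\mathrm{B} T \ln 2 \approx \left(1.38\times10^{-23}\,\tfrac{\mathrm{J}}{\mathrm{K}}\right)\times\left(300\,\mathrm{K}\right)\times 0.693 \approx 2.9\times10^{-21}\ \mathrm{J}$$

per erased bit, which is tiny compared with what any real device spends per logic operation.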

Of course, all of this is in theory only. Any practical computer that we've constructed so far uses many orders of magnitude more energy than Landauer's limit.

N. Virgo
  • 33,913
  • @Nathaniel: What is the increase in energy for the environment? For instance, is it correct to state that the environment gains energy $\frac{kT}{2}$, so that the free energy of the environment decreases: $\Delta F = \Delta U - T\Delta S = kT(\frac{1}{2} - \ln 2) < 0$? – Trimok Jul 07 '13 at 18:15
  • 2
    The Feynman Lectures on Computation is a slim little volume that includes a presentation of the state of this art a few decades ago. Very accessible to most physicists. – dmckee --- ex-moderator kitten Jul 07 '13 at 18:43
  • 1
    See the paper "Thermodynamics of prediction" by Still et al., available on [Gavin Crooks' website](http://www.threeplusone.com) (sorry, I kind of feel bad about deep-linking), for a refinement of Landauer's principle for general computing devices that retain memory and respond to temporally correlated input signals. Such a system approaches Landauer's limit when it is maximally predictive, that is, when its memory contains the maximum amount of information about future input signals. – Simeon Carstens Jul 07 '13 at 23:07
  • 2
    @Trimok it's simpler than that. You add $Q = k_\mathrm{B}T\ln 2$ joules of heat, so the environment's entropy changes by $Q/T = k_\mathrm{B}\ln 2 = 1\,\text{bit}$. The environment is typically assumed to be a heat bath, so we don't worry about its free energy, just its entropy. The total entropy change is $-1\,\text{bit} + 1\,\text{bit} = 0$, satisfying the second law. – N. Virgo Jul 08 '13 at 01:31
  • This post has been referenced from a possible duplicate post, but I am not sure. In short, that question asks whether there is a thermodynamic distinction between computation that learns and computation that applies what has been learned. https://physics.stackexchange.com/questions/369112/thermodynamic-or-entropy-explanation-of-learning-energy – J. Doe Nov 21 '17 at 16:44
-1

Is it necessary to consume energy to perform computation?

Strictly to "perform" it, perhaps not, but staging the performance and measuring it requires energy. If dropping an object could be equated to calculating gravity, consider the energy needed to lift the object and hold it away from the surface toward which it falls.

Similarly, an electron orbiting a neutron performs its own calculations (if you like) simply in order to exist, but it took energy to create, and it would take effort, i.e. energy, for you to observe it.

A quantum computer may in the future be energy efficient, since with a sufficient number of qubits (of good fidelity) its ability to calculate will far exceed everything that came before it; but energy efficient doesn't imply energy free.

Solar energy isn't "free", a "free lunch" isn't free, and computation cannot be "free" (done without energy), but in the future it will be performed more efficiently.

Rob
  • 2,273
  • 1
  • 11
  • 31
  • I hope on this site we can use more meaningful phrases than "calculate gravity". I understand that in this context it doesn't matter whether you're referring to the force of gravity or acceleration due to gravity, but many young physics students use ambiguous phrases like this and it can hamper learning; I believe we should not contribute to it. – electronpusher Apr 19 '22 at 15:16
  • I'm also confused in what situation you would find an electron "orbiting" a neutron. – electronpusher Apr 19 '22 at 15:17