33

Computers generate heat when they work. Is it a result of information processing or friction (resistance)? Are these just different ways to describe the same thing? Or does some definite part of the heat "come from each explanation"?

I often read that it's a necessary byproduct of information processing: there are irreversible operations, such as AND gates, and the information lost in them ends up as heat.

But so many other things generate heat as well! Light bulbs, electric hotplates, gears, etc. (These probably don't process information the way the computer does, but I may be wrong from a physical perspective.) I had always assumed the computer is like this as well: it has small wires in the processor, and their resistance could explain the heat.

Maybe these are parallel explanations. The information-processing view might say that any physical realization of an abstract computer has to produce some heat as a byproduct, and the friction view could then describe how this actually happens in a concrete wires-and-transistors implementation of the abstract computer.

But maybe the two explanations account for separate amounts of the heat. Or maybe one accounts for a subset of the other, making the explanations only partially parallel.

Can someone clarify?

isarandi
  • 905
  • 6
    Most of the heat in modern digital (CMOS) chips is generated by charging and discharging stray and intentional capacitors; the faster the chip, the more dissipative cycles it goes through per unit time. The heat dissipation, of course, occurs in the unavoidable resistances. – hyportnex Sep 27 '14 at 12:59
  • If you like this question you may also enjoy reading this post. – Qmechanic Sep 28 '14 at 15:52

4 Answers

35

Landauer's principle (original paper pdf | doi) expresses a non-zero lower bound on the amount of heat that must be generated by computers.

However, this entropy-necessitated heat is dwarfed by the heat generated through ordinary electrical resistance of the circuitry (the same reason light bulbs give off heat).
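
For a sense of scale, here is a minimal Python sketch comparing the Landauer bound of kT ln 2 per erased bit against a computer's actual dissipation. The erasure rate and package power below are assumed, illustrative numbers, not measurements of any real machine:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_k: float) -> float:
    """Minimum heat (J) dissipated per bit erased, per Landauer's principle."""
    return K_B * temperature_k * math.log(2)

# Assumed, illustrative numbers -- not measured values for any real CPU:
T = 300.0                  # room temperature, K
bit_erasures_per_s = 1e17  # rough order-of-magnitude guess
cpu_power_w = 100.0        # typical desktop CPU package power

landauer_power_w = landauer_limit(T) * bit_erasures_per_s
print(f"Landauer minimum per bit: {landauer_limit(T):.2e} J")
print(f"Landauer power floor:     {landauer_power_w:.2e} W")
print(f"Actual dissipation:       {cpu_power_w:.0f} W "
      f"(~{cpu_power_w / landauer_power_w:.0e} times the floor)")
```

Even with these generous assumptions the entropy-mandated floor comes out to a fraction of a milliwatt, five to six orders of magnitude below what the chip actually dissipates as resistive heat.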

lemon
  • 13,220
  • Is the small heat explained by Landauer a subset of the resistance-heat, or is it additional? – isarandi Sep 27 '14 at 11:29
  • 10
    @isarandi The Landauer principle says nothing about the mechanisms by which the heat is produced, so it doesn't really make sense to say that 'this bit of thermal energy is Landauer and the rest is not'; it is merely a theoretical limit. – lemon Sep 27 '14 at 11:38
  • 1
    So the short answer is what you mentioned: it's mostly the resistance heat. – Vincent Vancalbergh Sep 27 '14 at 16:28
  • 2
    So let's make this clear. Assume we make a very-very efficient computer. Say we are extremely close to Landauer's bound. That heat will still come about through some concrete physical process, such as resistance, right? Or does it just magically jump from information to heat? Is that a physical interaction? As I understand it, it always involves some process that we can describe in non-information terms, too. Could you clarify? – isarandi Sep 28 '14 at 23:07
  • 3
    @isarandi Yes, there will always be a 'concrete physical process' by which the heat is produced. There's no magic involved. – lemon Sep 28 '14 at 23:22
  • @lemon I was using magic as a figure of speech. What I mean is that there is no 'concrete physical process' called 'information-to-heat interaction' or anything like that, right? All of the Landauer heat is realized by some good old well-known electrical phenomenon, like resistance, if I understand correctly. No additional heat because of (explainable solely by) information processing. – isarandi Oct 06 '14 at 08:40
6

Computers manipulate internally stored values "0" and "1", represented as different voltages. Every change 0-to-1 and 1-to-0 involves an electric current I passing through a circuit resistance R, which gives rise to ohmic or "Joule" heating at a rate P = I²R.
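
This is the same mechanism the comments describe: each full 0→1→0 cycle charges and discharges the node capacitance C through that resistance, dissipating about CV² of heat per cycle, which leads to the standard first-order estimate P = αCV²f for a chip's dynamic power. A minimal sketch, with assumed chip-level numbers chosen only for illustration:

```python
def cmos_dynamic_power(c_farads: float, v_volts: float, f_hz: float,
                       activity: float = 0.1) -> float:
    """First-order CMOS dynamic power: P = alpha * C * V^2 * f.

    Each full charge/discharge of the node capacitance C dissipates
    about C*V^2 of heat in the circuit resistances; `activity` (alpha)
    is the fraction of nodes that switch on a given clock cycle.
    """
    return activity * c_farads * v_volts**2 * f_hz

# Assumed, illustrative aggregate numbers -- not any real chip's datasheet:
print(cmos_dynamic_power(c_farads=1e-7,   # 100 nF of total switched capacitance
                         v_volts=1.0,     # supply voltage
                         f_hz=3e9))       # 3 GHz clock -> 30.0 W
```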

1

The heat generated in a computer has nothing to do with the reversibility condition in Landauer's principle. Computations can be carried out reversibly, if required. What cannot be made reversible is the RESET of the computer.

The first time we turn the machine on, the memory is in a random state, and it takes energy and entropy to turn that random state into a well-defined initial state, without which computations cannot be carried out.

The same is true for the state of the program memory, so writing the program, without which the computer's output is utterly meaningless, would, as one would naively expect, take a non-zero amount of energy and entropy. As has been pointed out, technological implementations are many orders of magnitude away from these limits (which, by the way, are far, far lower than the power demands of the human brain).
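
As a back-of-the-envelope sketch of how small this reset cost is, here is the minimum energy, N·kT·ln 2, needed to reset an assumed (purely illustrative) 8 GiB of memory at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def reset_energy(n_bits: float, temperature_k: float = 300.0) -> float:
    """Minimum energy to reset n_bits from a random state to a
    well-defined one: n * k_B * T * ln(2) (Landauer's bound)."""
    return n_bits * K_B * temperature_k * math.log(2)

bits = 8 * 2**30 * 8  # 8 GiB, an assumed illustrative memory size
print(f"{reset_energy(bits):.2e} J")  # ~2e-10 J -- utterly negligible in practice
```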

CuriousOne
  • 16,318
  • 3
    Most computation is not reversible. For example, zeroing out a chunk of memory. And programs are full of such value-setting operations. – isarandi Sep 27 '14 at 17:20
  • Please see what I wrote: computation can be made reversible; reset cannot be. There is no need to reset memory during a computation. That's just a convenient (but unnecessary) operation to save memory. – CuriousOne Sep 27 '14 at 18:14
  • 2
    But in practice we do so. My question was about why computers generate heat in real life. You said the heat has nothing to do with Landauer because theoretically we can compute things reversibly. But this isn't a solid argument. What we actually do (reset memory often) may lead to heat through Landauer's principle. – isarandi Sep 27 '14 at 18:27
  • 2
    In practice the Landauer limit is nowhere close to the heat generated by practical computer designs. It is unmeasurably small for any mainstream memory design that I am aware of. I think it has been demonstrated, at least theoretically, for some single-bit atomic systems, but I could be wrong. At this point in time there is, as Feynman said, still plenty of room at the bottom, even for classical computing. – CuriousOne Sep 27 '14 at 18:37
  • 2
    @CuriousOne: There most definitely is a need to "reset" (or rather reuse) memory during computation: otherwise you would quickly run out. That's why nobody implements the trivial allocator: malloc=sbrk and free=nop. On a more abstract level, in a general setting without knowing the particular computation you'll be performing, it requires astronomically more memory to make a computation reversible - and that memory has energy cost. – R.. GitHub STOP HELPING ICE Sep 27 '14 at 22:27
  • Depends on what you are computing. Good algorithms compute in place whenever possible. You are right, though, reversibility will blow up the necessary memory requirement exponentially for NP algorithms... but that's basically saying that I can either compute reversibly and push the cost to the reset once, or be irreversible and reset as I go... the result is the same, and still some 10+ orders of magnitude below today's energy consumption. There ain't no free lunch. – CuriousOne Sep 27 '14 at 23:39
0

There are two main components to the heat produced by a computer. The first is the "steady state" power usage: the power required when the computer is doing nothing. The second component is proportional to, and caused by, the amount of computing being done. The more computations per second, the more power is required, thereby generating additional heat.
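
A minimal sketch of this two-component picture, with assumed illustrative numbers (not taken from any particular machine):

```python
def total_power(idle_w: float, energy_per_op_j: float, ops_per_s: float) -> float:
    """Two-component model: fixed 'steady state' (idle) power plus a term
    proportional to the computational rate."""
    return idle_w + energy_per_op_j * ops_per_s

# Assumed numbers -- chosen only to make the two components visible:
print(total_power(idle_w=20.0, energy_per_op_j=1e-10, ops_per_s=5e11))  # -> 70.0 W
```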

Guill
  • 2,493
  • Sure, but there is no additional heat associated with the information processing itself, it's always explainable by some good old well-known electrical phenomenon, like resistance. There is no separate "information-to-heat" physical process. At least that's what I understood from the other answers. – isarandi Oct 06 '14 at 08:36