
James P. Sethna. Statistical Mechanics. Exercise 5.2:

What prevents a Maxwellian demon from using an atom in an unknown state to extract work? The demon must first measure which side of the box the atom is on. Early workers suggested that there must be a minimum energy cost to take this measurement, equal to the energy gain extractable from the bit. Bennett showed that no energy need be expended in the measurement process. Why does this not violate the second law of thermodynamics?

The reference to Bennett's paper didn't help me much. The relevant model here is a tape of single atoms in pistons, where knowing which side of a piston an atom is on counts as 1 bit of information, which can be used to extract useful work by expanding the piston, as shown below:

[Figures: single-atom pistons on a tape; once the atom's side is known, the piston is expanded to extract work.]

My understanding is that after the measurement the positional uncertainty is halved and the entropy decreases. But without energy expenditure, it seems that this decrease comes for free. How can the second law hold if there is no corresponding increase of entropy elsewhere (which I can't identify)?
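If I write the bookkeeping out explicitly (assuming the single-atom ideal-gas law $pV = k_B T$), learning which half of the volume $V$ the atom occupies reduces its entropy by

$$\Delta S = k_B \ln\frac{V/2}{V} = -k_B \ln 2,$$

and the subsequent isothermal expansion back to the full volume extracts

$$W = \int_{V/2}^{V} \frac{k_B T}{V'}\,\mathrm{d}V' = k_B T \ln 2$$

of work from the heat bath. This is exactly the "free" entropy decrease I am worried about.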

Something like an explanation is given at the end of the exercise:

The demon can extract an unlimited amount of useful work from a tape with an unknown bit sequence if it has enough internal states to store the sequence—basically it can copy the information onto a second, internal tape. But the same work must be expended to re-zero this internal tape, preparing it to be used again.

Does this mean that after the measurement, the entropy removed from the first tape goes into the second, "internal" tape that stores the information? How can such a measurement take place?

Eric

2 Answers


Let's consider just one cycle of the Szilard engine. Aside from the discussion of energy-free state polling, one of the main points of Bennett's paper (if you mean Charles Bennett, "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21, No. 12, 1982) is that you must build a finite state machine (a very simple three-state machine) as a minimal Maxwell Daemon. Whichever way you do it, you must implement storage for this state machine in some kind of physical computer memory. Once you come back to the beginning of the cycle, you must either (1) use a new bit in memory for the next cycle, as in your drawings, or (2) initialise the bit to use it again, i.e. "forget" its former state. This "forgetting" is the key to the "mystery" of the decreasing information-theoretic entropy.
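To make the bookkeeping concrete, here is a toy simulation (my own sketch, not code from Bennett's paper), assuming each cycle extracts $k_B T \ln 2$ of work and each erased bit costs at least $k_B T \ln 2$ by Landauer's principle:

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def run_demon(n_cycles, memory_size, T=300.0):
    """Toy bookkeeping for a Szilard engine run by a finite-memory demon."""
    bit_energy = K_B * T * math.log(2)  # k_B T ln 2 per bit
    memory = []       # the demon's internal tape
    work_out = 0.0    # work extracted from the heat bath
    work_in = 0.0     # work paid to erase (re-zero) memory
    for _ in range(n_cycles):
        if len(memory) == memory_size:
            # Internal tape full: re-zero it, paying the Landauer
            # cost k_B T ln 2 for every bit "forgotten".
            work_in += bit_energy * len(memory)
            memory.clear()
        memory.append(random.randint(0, 1))  # measure which side the atom is on
        work_out += bit_energy               # isothermal expansion extracts work
    return work_out, work_in

out, paid = run_demon(n_cycles=10_000, memory_size=100)
print(f"extracted {out:.3e} J, paid {paid:.3e} J for erasure")
```

The demon stays ahead by at most `memory_size` bits' worth of work, namely the still-unerased tape; averaged over many cycles it gains nothing.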

The laws of physics at the microscopic scale are perfectly reversible. That means that there is an invertible (indeed unitary) mapping between the microstate (full quantum state) of any physical system at any time and its state at any other time. If you have a system's full state definition at any time, you can derive from this the state at any other time - past or future. The World does not forget its history.

So, when you wipe the bit in the Maxwell Daemon, ready to begin a new cycle, ask yourself how this wiping can be consistent with my last paragraph. That paragraph asserts that the physical process of wiping must be invertible, in principle. This can only mean one thing: wiping the bit must subtly change the states of the "stuff" that makes up the computer. You could in principle run a simulation of the whole process backwards, beginning with a full specification of its state after the deletion, and you would see this state change in the computer hardware's matter unwind and restore the wiped bit!
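Here is the invertibility point in miniature (a deliberately trivial sketch of my own): erasure restricted to the bit alone is a two-to-one map, hence not invertible; only when the environment's state is included does it become one-to-one:

```python
def erase(bit):
    """Erasure viewed on the bit alone: every input goes to 0."""
    return 0

# Two distinct initial states collapse onto one final state,
# so no function can recover the input from the output.
assert erase(0) == erase(1)

def erase_with_environment(bit, env):
    """Erasure viewed on bit + environment: the old value is pushed
    into the environment's (ever-growing) microstate record."""
    return 0, env + (bit,)

# Distinct inputs now give distinct outputs: the map is one-to-one,
# so a backwards simulation could unwind it and restore the bit.
assert erase_with_environment(0, ()) != erase_with_environment(1, ())
```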

Therefore, as the Maxwell Daemon runs, the whole bit sequence, recording all the states of all the gas molecules in each of the cycles, must somehow wind up encoded in the changed state of the computer hardware's matter.

Repeated bit erasures change the state of the computer hardware's matter more and more.

This is OK for a while. The Maxwell Daemon seems to win. But all finite physical systems have a finite information storage capacity; look up the Bekenstein Bound, for example. In the end, the matter can encode no more cycle bit states, and the machine must stop. Or, another alternative: one can raise a physical system's information storage capacity by making it hotter, so you would have to give the computer system's matter this extra capacity by thermalising it. That energy has to come from somewhere. Or, yet another alternative: we must do work on the system's matter to drive other physical processes that encode the system's physical state elsewhere in the Universe, in the environment of the computer. Later on, we shall need to do the same to the room that the computer lives in. This particular work is often done by air conditioners! (I jest a little here: most of the energy used by our computers is "inefficient"; our computers use roughly ten orders of magnitude more energy than the Landauer limit, i.e. the work needed to erase and initialise memory that we have just talked about.)
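For scale, the Landauer limit at room temperature (taking $T = 300\ \mathrm{K}$) works out to

$$k_B T \ln 2 = (1.381\times10^{-23}\ \mathrm{J\,K^{-1}})(300\ \mathrm{K})(\ln 2) \approx 2.9\times10^{-21}\ \mathrm{J}$$

per bit erased.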

As powerful as they undoubtedly are for thinking about statistical mechanics, abstract information-theoretic methods can beguile us into forgetting this one simple fact:

In physics, you can't disembody information from its underlying physical system and think about it purely abstractly, as we often do in pure information theory and, in particular, computer science. Nature writes down Her information, Her "bits", in physical "ink", so to speak. That "ink" is the state of a physical system.

You might like to see my articles "Information is Physical: Landauer's Principle and the Information Soaking Capacity of Physical Systems" and "Free Energies: What does a physical chemist mean when he/she talks of needing work to throw excess entropy out of a reaction?"

  • Thank you. I found your argument very clear. Just one point: do you agree with this: upon each erasure the entropy of the Demon (if isolated from the environment) must increase by at least 1 bit (Landauer's principle), although this increase may be "subtly" dispersed among the Demon's components, as you pointed out. If so, then the total entropy of the system is already non-decreasing upon each erasure and there's no paradox anymore. Why wait for repeated erasures and the Bekenstein Bound? – Eric Nov 11 '14 at 10:56
  • But consider a Demon with a very large memory. It can make the engines do a lot of work (i.e., run many cycles) before there must be any erasure. What happens to the entropy of this system? – Eric Nov 11 '14 at 11:17
  • @Eric Firstly, there is no in-principle difference between the Daemon having a very large computer memory and its state machine being one bit with the "erased" information stored in the state of the "stuff" of the computer's components. Computer memory just encodes information in the "stuff" of the computer in a particular way - such that it can be read out on a standard bus - but it's still a special case of the more general consideration. Secondly, why wait? This point is made to emphasise that the memory is always finite; once it is filled up, there is no way to erase our bits ... – Selene Routley Nov 11 '14 at 11:37
  • @Eric ... any more, so something has to give. We'd find that the computer could not re-initialise its bits. At this point, we'd have to push some of the entropy out into the surrounding environment, a deed that calls for work by the Second Law. – Selene Routley Nov 11 '14 at 11:40

Bennett showed that no energy need be expended in the measurement process. Why does this not violate the second law of thermodynamics?

From the second law it follows that when macroscopic work is performed on a system as it goes from an initial equilibrium state to a final equilibrium state and heat transfer is prevented, the final entropy is greater than or equal to the initial entropy.
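In symbols: for a process between equilibrium states with $\delta Q = 0$,

$$S_{\text{final}} \ge S_{\text{initial}},$$

however the macroscopic work is applied.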

Notice the word macroscopic. In thermodynamics, we can do work by pushing a piston or turning a paddle, but not by moving individual particles, because the variables of those particles do not appear in thermodynamics. This is because such a feat was impossible when the second law was formulated.

The second law of thermodynamics was formulated for, and applies to, macroscopic systems where we can measure only a few (often fewer than 5) variables. It does not necessarily apply to purely mechanical systems like little balls in a solid-walled container.

Manipulating the balls in the container directly refers to a model of the system from mechanics that is fully specified by positions and momenta (or other microscopic variables). For such a model the thermodynamic description is superfluous, since we have the equations of motion, and it has no claim to validity there.

Now, a system of little balls in a container can be used to explain the behaviour of a macroscopic system (gas, liquid), and even to explain why the second law is valid in a probabilistic sense, but only with an additional assumption: that any two states of equal energy are equally probable.

If we know that someone fiddles with the balls on the microscale, this assumption may not be justified, and such a system may not show behaviour compatible with the second law.

One can make the system of little balls do anything mechanically possible if one can measure their positions and manipulate them, even make them all accumulate in the upper corner and stay there.

My understanding is that after the measurement the positional uncertainty is halved and the entropy decreases.

Yes, but this is information entropy, not thermodynamic entropy. Equating thermodynamic entropy with information entropy is only justified if the latter is expressed as a function of a few macroscopic variables (such as internal energy, volume, and number of particles) and the system is in thermodynamic equilibrium.
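In symbols (my own restatement): the information entropy of a probability distribution $p_i$ over microstates,

$$S_{\text{info}} = -k_B \sum_i p_i \ln p_i,$$

coincides with the thermodynamic entropy $S(U,V,N)$ only when $p_i$ is the equilibrium distribution. The measurement replaces the distribution over the full volume by one over half the volume, lowering $S_{\text{info}}$ by $k_B \ln 2$, but by itself that says nothing about the thermodynamic entropy of a system being manipulated on the microscale.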