0

What are the hypothetical possibilities that the entropy of a closed system may decrease?

I would accept plausible but hypothetical setups.

For instance,

  • Non-trivial timeline topology (closed timelike curves, interaction of matter with different time arrows, etc.)

  • Non-trivial spatial topology (dynamical cosmic horizon, etc.)

  • Observer-dependent selection in theories with a distinguished observer (the anthropic principle, Boltzmann brains, etc.)

  • Various kinds of available hypercomputing oracles

I want details on how entropy may decrease in each of the listed possibilities, as well as in other possibilities.

Qmechanic
  • 201,751
Anixx
  • 11,159
  • 2
    "I would accept plausible but hypothetical setups"... So the Second Law of Thermodynamics allows you to believe that there are any plausible setups? – Sean Nov 24 '14 at 19:11
  • @Sean - See my answer below, the second law deals with long-term behavior but does not rule out reliable short-term decreases in entropy in an isolated system. – Hypnosifl Nov 24 '14 at 21:06
  • Describe a system in which there is no way to have a change from a micro-state of maximum entropy to another such state. Then anytime the system is in a micro-state of maximum entropy, any change causes a momentary decrease. I believe that a 1-D Ising model is such a system. – dmckee --- ex-moderator kitten Nov 24 '14 at 21:27

2 Answers

3

After reaching thermodynamic equilibrium, if you wait long enough, any system will eventually reach, by random chance, states of lower entropy. Of course, the time needed for such a fluctuation to occur increases exponentially with the size of the entropy decrease (see the Poincaré recurrence theorem).
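A rough way to quantify the waiting time (a standard Einstein-type fluctuation estimate, added here for illustration rather than taken from the answer): the probability of catching the system in a state whose entropy is lower by $\Delta S$ scales as

$$P(\Delta S) \sim e^{-\Delta S/k_B},$$

so the expected waiting time grows like $t \sim \tau\, e^{\Delta S/k_B}$, where $\tau$ is a typical relaxation time of the system. Even a macroscopically negligible dip such as $\Delta S \sim 10^{-10}\ \mathrm{J/K}$ gives $\Delta S/k_B \sim 10^{13}$, an exponent far beyond any physically meaningful timescale.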

My guess is that this is not the kind of answer you were looking for, but here is one you might like. In Wolfram's book A New Kind of Science he reports many computer experiments with cellular automata and finds a range of behaviors for the evolution of the entropy under different rules: from systems with standard behavior, where the entropy increases until equilibrium is reached and then fluctuates as Poincaré's theorem describes, to the other extreme, in which the entropy decreases regardless of the initial conditions (though the behavior is trivial, in the sense that the system always converges to a frozen state, usually all ones or all zeros). The interesting rules are the ones in the middle: the dynamics behave in complex ways, yet their entropy sometimes increases and sometimes decreases, with no apparent arrow of time.
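A minimal sketch of this kind of experiment in Python (my own illustration, not code from the book; the choice of rules 254, 30 and 110 and the block-entropy estimator are assumptions on my part): it evolves elementary cellular automata from a random row and prints a crude Shannon block entropy of the state.

    # Sketch: evolve elementary cellular automata and track a crude Shannon
    # block entropy of the row (rules and parameters are my own choices).
    import random
    from collections import Counter
    from math import log2

    def step(row, rule):
        """One update of an elementary CA rule (0-255) with periodic boundaries."""
        n = len(row)
        return [(rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
                for i in range(n)]

    def block_entropy(row, width=4):
        """Entropy in bits of the distribution of overlapping `width`-cell blocks."""
        n = len(row)
        blocks = Counter(tuple(row[(i + j) % n] for j in range(width)) for i in range(n))
        return -sum((c / n) * log2(c / n) for c in blocks.values())

    random.seed(0)
    initial = [random.randint(0, 1) for _ in range(200)]
    for rule in (254, 30, 110):            # "freezing", chaotic, and complex rules
        row, history = initial[:], []
        for _ in range(100):
            row = step(row, rule)
            history.append(block_entropy(row))
        print(f"rule {rule:3d}: block entropy after 1/50/100 steps = "
              f"{history[0]:.2f} / {history[49]:.2f} / {history[99]:.2f}")

With these settings you should see the "freezing" rule 254 collapse to a uniform row (entropy dropping toward zero), the chaotic rule 30 stay near the maximum of 4 bits per block, and rule 110 sit in between, which is the "interesting" middle ground described above.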

Regarding hypercomputing oracles, just my two cents: I believe that for any practical purpose a super-Turing machine would behave as a random number generator. It would not increase order, at least not in the way we currently define it.

Also, assuming a closed, non-expanding universe: if gravity starts to change (or oscillate) with time quickly enough, then the entropy will oscillate along with these changes. The reason is that in the absence of gravity the universe flows toward a thermodynamic equilibrium with uniform mass density, but in the presence of gravity such a uniform state is actually a state of low entropy (the largest-entropy state with gravity would look rather like all the mass concentrated in one place).

0

The second law implies that the entropy of an isolated system should be expected to go to a maximum given a sufficiently long time, but in the short term, for certain isolated systems, the probability that entropy will decrease over a given time interval can be larger than the probability that it will increase over the same interval. A system containing a Maxwell's-demon type machine would be an example. The widely accepted analysis by Charles Bennett indicates that in order to function the "demon" has to record information about the molecules it interacts with in some kind of physical memory, and when the memory fills up, the process of erasing it must increase the system's total entropy by an amount greater than or equal to the decrease in entropy the demon was able to create by sorting molecules. But from the time the demon starts with a "blank" memory to the moment just before its memory fills up, in theory it can reliably decrease the entropy of an isolated system.
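To make the bookkeeping explicit (this is just the standard statement of Landauer's bound, added here for illustration): if the demon has written $N$ bits of measurement data, resetting that memory generates entropy of at least $k_B \ln 2$ per bit,

$$\Delta S_{\text{erasure}} \;\ge\; N k_B \ln 2 \;\ge\; |\Delta S_{\text{sorting}}|,$$

which is enough to pay back whatever entropy reduction the sorting achieved while those $N$ bits were being recorded. The reliable decrease is therefore a loan that comes due when the finite memory fills up.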

If you're familiar with the notion of phase space and microstates vs. macrostates, it can be easier to conceptualize how this works. Starting from a single macrostate of the form "Gas at entropy S, demon's memory in 'blank' state 00000000..." at initial time T0, all microstates in that macrostate will then evolve into microstates at T1 belonging to some member of a large set of possible macrostates of the form "Gas at lower entropy S', demon's memory in state X", where X is some arbitrary complicated string like 010111010... Each of these individual macrostates at T1 has a smaller multiplicity (number of microstates belonging to that macrostate) than the single macrostate we started from at T0, but the total summed multiplicity of all these macrostates at T1 is still exactly equal to the multiplicity of the single macrostate at T0, as required by Liouville's theorem. So the "trick" here is just that we have coarse-grained the set of all possible microstates in such a way that the initial set of microstates at T0 is treated as a single macrostate, while the later set of microstates at T1 is treated as a large number of smaller macrostates.
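A compact way to see this bookkeeping (illustrative, with the simplifying assumption that the macrostates at T1 are equally likely): if the single macrostate at $T_0$ has multiplicity $W$ and evolves into $M$ macrostates at $T_1$, each of multiplicity $W/M$, then

$$S(T_0) = k_B \ln W, \qquad S(T_1) = k_B \ln\frac{W}{M} = S(T_0) - k_B \ln M,$$

so each individual macrostate at $T_1$ carries less entropy, even though the total count of microstates, $M \cdot (W/M) = W$, is conserved exactly as Liouville's theorem demands. The decrease is paid for by the $k_B \ln M$ worth of information now sitting in the demon's memory, which is what the erasure argument above eventually reclaims.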

Hypnosifl
  • 6,190