21

I understand that, in thermodynamics, entropy has a precise definition (the infinitesimal change of entropy being the infinitesimal reversible heat transfer divided by the temperature), and that in statistical mechanics, for a system consisting of a large number of identical subsystems, so to speak, it is the logarithm of the number of possible distributions of the subsystems over some given energy levels.
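In symbols, I mean the standard textbook forms (stated here just to fix notation):
$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S = k_B \ln W,$$
where $W$ is the number of microstates compatible with the given macrostate.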

And from what I understand, the Second Law of Thermodynamics basically says that, for some simple systems, such as two reservoirs at different temperatures connected to each other (and isolated from the rest), heat goes from hot to cold, so that entropy tending to become as large as it possibly can amounts to such systems evolving towards an equilibrium state (at least for simple systems such as the one above).
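(To spell out that toy example, since it is just standard bookkeeping: if heat $\delta Q > 0$ flows from the hot reservoir at $T_h$ to the cold one at $T_c < T_h$, the total entropy change is
$$\Delta S = -\frac{\delta Q}{T_h} + \frac{\delta Q}{T_c} = \delta Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0,$$
so entropy indeed increases until the temperatures equalize.)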

(Please forgive me if my descriptions here are not so accurate from a physical point of view, for I am not a physicist, just a mathematician.)

Fine. But I frequently see people linking entropy to chaos. I assume there is some scientific work which started this train of thought, and then the media kept stretching the words further and further. Can someone please point me to that scientific work? Also, is the link between entropy and chaos valid, in the eyes of modern physicists?

I have seen a few questions that overlap with mine, but I have not found the exact answer to my question.

Malkoun
  • 641
  • 6
    Entropy is one of the most misunderstood and misused words in the English language (right after "freedom"?). You can challenge everybody who uses the word to explain the argument they are making with it using the actual definition of $dS=\delta Q_{reversible}/T$. If they can't, then you know that they have no idea what they are talking about. :-) – CuriousOne Jun 23 '16 at 21:04
  • 4
    @CuriousOne Entropy is one of the most misunderstood and misused words in the English language (right after "freedom"?) ; incidentally, I like to think of entropy as microscopic freedom ;) – Christoph Jun 23 '16 at 22:14
  • 2
    @Christoph: Did you see those cranium fragments reaching escape velocity? They are mine. :-) – CuriousOne Jun 23 '16 at 22:16
  • 1
    @CuriousOne, did you measure the energy released? How long do you have to get someone to rotate in their grave to match an exploding head? – Christoph Jun 23 '16 at 22:23
  • 1
    @Christoph: I will let you know after I have collected the leftovers of my scatter-brain. – CuriousOne Jun 23 '16 at 22:25
  • I think that entropy in popular perception, and unfortunately in many textbooks, is associated with disorder if not with chaos. These are similar notions, though chaos seems more extreme than disorder; still, I agree, your point is correct. From memory I think that it was Boltzmann himself who used a word similar in meaning to 'disorder' and so started the whole thing. Perhaps someone knows this quote. It is unfortunate, as it clouds what entropy is, which is well defined in classical and statistical thermodynamics, whereas 'order' and 'disorder' in common parlance are ill-defined and subjective. – porphyrin Jun 29 '16 at 18:33

3 Answers

11

I would say the connection between chaos and entropy is through ergodic theory, and the fundamental assumption of statistical mechanics that a system with a given energy is equally likely to be found in any 'microstate' with that energy.

Although chaos is a very general aspect of dynamical systems, Hamiltonian chaos (encountered in classical mechanics) is characterized by a paucity of conserved quantities, such as energy and total linear/angular momentum. The crucial fact is not that further conserved quantities are merely difficult to find, but that they do not exist. Because of this, the trajectories of a chaotic dynamical system will trace out a high-dimensional submanifold of phase space, rather than a simple one-dimensional curve. Each trajectory is locally one-dimensional, but if you looked at the set of all points in phase space traced out over all time, you would find a higher-dimensional space, of dimension $2D-N_C$, where $D$ is the number of degrees of freedom (so the phase space has dimension $2D$) and $N_C$ is the number of globally conserved quantities.
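To make the dimension count concrete (a standard example, nothing beyond the formula above): for $N$ particles in three dimensions the phase space has dimension $2D = 6N$. If the only global conserved quantities are the energy, the three components of total momentum, and the three components of total angular momentum, then $N_C = 7$, and a typical chaotic trajectory traces out a $(6N-7)$-dimensional submanifold, which for large $N$ is almost all of phase space.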

In most many-body systems, $N_C$ is finite, or at least subextensive (i.e. any 'hidden' conservation laws become insignificant in the scaling limit, as volume and particle number go to infinity while intrinsic parameters such as density and temperature are held constant). One takes total energy as the single most important conserved quantity, and the rest is textbook statistical mechanics.

Now, the precise type of non-linear behavior that would introduce ergodicity to the system is usually ignored in physics, because everything absorbs and emits radiation, which almost always causes systems to reach equilibrium. However, going a step deeper to consider the self-thermalization of non-linear wave equations historically led to the important discovery of the Fermi-Pasta-Ulam problem. Essentially, the discovery was that many nonlinear differential equations have hidden conservation laws that cause the heuristic argument described above to break down.
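Here is a minimal numerical sketch of the Fermi-Pasta-Ulam setup (my own toy illustration; all parameter values are arbitrary choices, not from the original study): energy placed in the lowest normal mode of a weakly nonlinear chain refuses to spread evenly over the other modes, the way the naive ergodic argument would predict.

```python
import numpy as np

# FPU "beta" chain: N unit masses between fixed walls, bonds with
# potential V(d) = d^2/2 + beta*d^4/4. Parameters are illustrative.
N, beta, dt = 32, 0.3, 0.05

def forces(x):
    xp = np.concatenate(([0.0], x, [0.0]))   # pad with the fixed walls
    d = np.diff(xp)                          # stretch of each of the N+1 bonds
    t = d + beta * d**3                      # tension in each bond
    return t[1:] - t[:-1]                    # net force on each moving mass

# Linear normal modes of the chain (discrete sine transform, DST-I).
k = np.arange(1, N + 1)
omega = 2 * np.sin(np.pi * k / (2 * (N + 1)))
S = np.sqrt(2 / (N + 1)) * np.sin(np.pi * np.outer(k, k) / (N + 1))

def mode_energies(x, v):
    Q, P = S @ x, S @ v                      # project onto the linear modes
    return 0.5 * (P**2 + (omega * Q)**2)

x, v = 4.0 * S[0], np.zeros(N)               # all energy in the lowest mode
a = forces(x)
for step in range(1, 400001):                # velocity-Verlet integration
    v += 0.5 * dt * a
    x += dt * v
    a = forces(x)
    v += 0.5 * dt * a
    if step % 80000 == 0:
        E = mode_energies(x, v)
        print(f"t={step*dt:8.0f}  share of modes 1-4: {E[:4]/E.sum()}")
# If the heuristic ergodic argument held, the energy would spread evenly
# over all 32 modes; instead it stays concentrated in (and recurs among)
# the lowest few modes for a very long time.
```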

TLDR
  • 3,158
  • 2
    You mean a chaotic system is not automatically ergodic because the attractor is not guaranteed to fill the entire phase space, right? – CuriousOne Jun 23 '16 at 21:24
  • 3
    Depends on what space you are considering ergodic behavior in. Conserved quantities certainly prevent a classical system from filling all of phase space. For dissipative systems, you need to consider the trajectory of the attractor itself, and (possibly) the trajectory of the system within the attractor. In my answer, however, I only addressed non-dissipative systems. – TLDR Jun 23 '16 at 21:28
  • I don't know if naive statistical mechanics allows for a whole lot of leeway, there, does it? Can one modify the ergodic hypothesis for strange attractors and still recover standard thermodynamics? – CuriousOne Jun 23 '16 at 21:33
  • Thank you so much for the link with ergodic theory, and the ergodic hypothesis. I understand much better where it comes from, now, the link between chaos and entropy. Still, if I were to discuss the concept with a "layman", so to speak, or young people, I would prefer to avoid using the word "chaos" at least at first. – Malkoun Jun 23 '16 at 21:35
  • @Malkoun: You're welcome. For discussing with laypeople, I think it depends a little on their background. Since a lot of people are familiar with statistics (and digital security), a maximal uncertainty justification can be effective I think. To CuriousOne: I've only heard of strange attractors in the context of driven-dissipative systems (but I would be very interested in hearing if you know of conservative examples). This would indicate to me that strange attractors could only appear in nonequilibrium stat mech. – TLDR Jun 23 '16 at 21:41
  • I am not aware that chaotic attractors require driven-dissipative systems under the KAM-theorem nor do I understand how a dissipative system (from a physical rather than mathematical perspective) can even have a strange attractor, as it is under the thumb of the fluctuation-dissipation theorem, which will automatically smear it out into a "well behaved" average. – CuriousOne Jun 23 '16 at 21:45
  • 1
    As far as I know only dissipative systems can have attractors. – QuantumBrick Jun 23 '16 at 23:10
  • I guess there might be a few definitions of attractor, and strange attractor. From wikipedia, I had the impression that an attractor needs a non-trivial basin (i.e. the basin of attraction is an open set, of which the attractor is a proper subset), while strange attractors are fractal. However, Goldstein might say that an attractor is any set closed under time evolution, and that a strange attractor is an attractor with dimension > 1. In physics, I would only expect fractal structure to be apparent in a scaling limit (e.g. overdamped with strong driving force & interactions, or dynamical RG). – TLDR Jun 23 '16 at 23:48
  • @fs137 I was under the impression that chaos was much, much more special than just ergodicity. Are you saying they're the same thing? – knzhou Jun 24 '16 at 00:17
  • Well, I certainly wasn't trying to say they are the same, just that chaos is one route to the type of ergodicity that is needed in order for statistical mechanics to be valid. Both ergodic and chaotic systems have distinct general definitions. As I understand it, a chaotic system can always be made ergodic with an appropriate choice of state space, but an ergodic system is not necessarily chaotic. – TLDR Jun 24 '16 at 03:54
  • @fs137: that a strange attractor is an attractor with dimension > 1 – I do not now about Goldstein, but that definition includes tori, which I am pretty certain most people would consider not strange. Usually, you would require the dimension to be non-integer for the attractor to be strange. – Wrzlprmft Jun 24 '16 at 07:38
  • I agree. After checking Goldstein again, I should correct my earlier comment: strange attractors are defined in the text as being fractal. – TLDR Jun 24 '16 at 15:26
  • I am sorry but I have put -1 for now. Not because the answer is completely wrong but because it mainly discusses Hamiltonian chaos without discussing issues like the KAM theorem and its relevance for the actually important stuff like ergodicity and, most of all, entropy. One could have easily mentioned the topological ($\epsilon$-)entropy, for instance, which is very much used in dynamical systems, and discussed how it is bounded from above by the equilibrium Gibbs/Boltzmann entropy. – gatsu Jul 01 '16 at 11:16
8

I'll have to disagree that those notions of entropy are disjoint. I'll try to explain my view.

In Statistical Mechanics entropy is defined in terms of the accessible region of phase space: it is the logarithm of this volume, times a constant. In deriving this formula from the number of accessible configurations, one postulates that all configurations are equally accessible. This postulate is justified by the Ergodic Hypothesis. Since you're a mathematician I think you're probably familiar with the term ergodic: a measure-preserving system is ergodic if it admits no invariant set of intermediate measure, so that time averages along almost every trajectory equal averages over phase space (in our case with respect to the Liouville measure, which is Lebesgue measure on phase space). Now, not every system is ergodic. Even so, estimates can be carried out which indicate that in a generic gas, with its huge number of particles, non-ergodicity would produce only an extremely small error in physical measurements (Landau does this in his Statistical Physics volume). Still, systems like spin glasses are canonical examples of non-ergodic systems where the usual Statistical Mechanics is not applicable.
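As a discrete counting analogue of "entropy = logarithm of the accessible volume" (a toy model of my own, with $k_B = 1$): take $N$ independent two-level subsystems with $n$ of them excited. The number of microstates is the binomial coefficient $C(N,n)$, and $S = \ln C(N,n)$.

```python
from math import lgamma, log

# Entropy of N two-level subsystems with n excitations: S = ln C(N, n).
# lgamma avoids overflow: ln C(N,n) = ln N! - ln n! - ln (N-n)!.
def log_microstates(N, n):
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

for N in (100, 10_000, 1_000_000):
    S = log_microstates(N, N // 2)          # equal-occupation macrostate
    print(f"N = {N:>9}: S/N = {S/N:.6f}   (ln 2 = {log(2):.6f})")
# S/N converges to ln 2: the overwhelming majority of microstates sit near
# the equal-occupation macrostate, which is why the maximum-entropy state
# is the equilibrium state.
```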

You see that the ergodic hypothesis is a key assumption in Statistical Mechanics. But what does chaos mean in Classical Mechanics? Loosely speaking, chaotic trajectories depend sensitively on initial conditions and wander over the whole accessible phase space. If you take a chaotic system which is not only ergodic but also strongly mixing, a typical trajectory will come arbitrarily close to each and every bit of phase space accessible to it, bounded only by the conservation laws (such as energy conservation).
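A crude numerical illustration of "covering the accessible phase space" (my own toy example, with arbitrary parameter values, not a derivation), using the Chirikov standard map $p' = p + K\sin\theta$, $\theta' = \theta + p'$ (both mod $2\pi$): coarse-grain the phase-space torus into cells and count how many a single trajectory visits.

```python
import numpy as np

def visited_fraction(K, steps=1_000_000, bins=100):
    theta, p = 1.0, 1.0                      # an arbitrary initial condition
    grid = np.zeros((bins, bins), dtype=bool)
    two_pi = 2 * np.pi
    for _ in range(steps):
        p = (p + K * np.sin(theta)) % two_pi
        theta = (theta + p) % two_pi
        # mark the coarse-grained cell this point falls into
        grid[int(theta / two_pi * bins) % bins, int(p / two_pi * bins) % bins] = True
    return grid.mean()                       # fraction of cells visited

for K in (0.5, 7.0):
    print(f"K = {K}: trajectory visits {visited_fraction(K):.1%} of the cells")
# Near-regular regime (K = 0.5): the orbit stays on a thin invariant set and
# visits only a small fraction of the cells. Strongly chaotic regime (K = 7):
# it visits essentially all of them.
```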

The conclusion is that assuming Statistical Mechanics is applicable amounts to assuming you cannot predict trajectories in phase space, either because there are too many initial conditions to know or because you cannot track each and every trajectory, and after an infinite time these trajectories will also cover the whole accessible phase space. This is intrinsically connected to the notion of chaos in Classical Mechanics.

In Thermodynamics I think no one really understood what entropy meant, so I can't elaborate on that. It only gets clear in Statistical Mechanics.

QuantumBrick
  • 3,993
  • Entropy is quite clearly defined in thermodynamics and it does have a very clear meaning that delineates reversible and non-reversible processes. The problem, I believe, may be that we often short-change physics students by teaching thermodynamics and statistical mechanics in a single class (at least that happened to me), whereas they are actually independent and equally important in their own right. The result is that students often replace poorly developed intuition for TD with half-baked intuition from SM. – CuriousOne Jun 23 '16 at 21:40
  • Yes, thank you QuantumBrick. Your comments, as well as those of fs137, are the kind of explanation I was looking for. – Malkoun Jun 23 '16 at 21:41
  • @CuriousOne That is true. In undergrad I had a course covering both TD and SM, but later on in my graduate studies I had three other courses dedicated exclusively to SM. My understanding of TD is probably shameful... To me it looks like phenomenology with poor maths, and only SM taught me to see things properly. But then again, I probably don't have a clue what TD is. – QuantumBrick Jun 23 '16 at 21:44
  • Sadly, I have come away with the insight that industrial process engineers worth their title know TD usually better than physicists... to them SM is useless. They have to understand heat and material flow on a fundamental level and TD is the proper tool for that... at least in my undergrad days the physics department basically just winged it and left most of us hanging with completely insufficient knowledge. I can't claim that I ever recovered from that. :-( – CuriousOne Jun 23 '16 at 21:48
  • @CuriousOne I haven't had a Physics education myself, but I can tell that teaching thermodynamics and statistical mechanics in the same course for undergrads is a bad idea. For a very long time, I thought that thermodynamics was kind of obscured by some old language, perhaps because it was developed a very long time ago. Something about the language they used. I think thermodynamics should be taught first, and then statistical mechanics. The first one develops the physical intuition, and the second one explains it with more mathematics and better stated assumptions (erg. hyp., etc) – Malkoun Jun 23 '16 at 21:49
  • @Malkoun This is what is usually done: TD then SM. Unfortunately, in my case, as soon as you know calculus and SM you never again return to TD. Perhaps one day I'll take my chance at filling that void. – QuantumBrick Jun 23 '16 at 22:13
  • @Malkoun: You got my vote... – CuriousOne Jun 23 '16 at 22:14
  • 2
    While we're having a rant, my physics education went along similar lines. The lecturer liked to stress the idea of entropy as 'missing information'; I sat there and thought We're talking about a physical quantity that relates energy and temperature. How does this make any sense at all... – Christoph Jun 23 '16 at 22:43
  • I took TD as an undergrad, liked it and did well but never really understood entropy. SM made it more clear in grad school, and I was amazed it was all so intuitively clear though not always easy to calculate. But to this day I find it difficult to understand some TD arguments. I am glad I'm not alone. I think you are all right, you have to go back to a better course in TD after the SM. – Bob Bee Jun 24 '16 at 03:10
1

The (physical) concept of entropy is predominantly applied to many-particle systems. We can regard such a system as a high-dimensional dynamical system, whose dynamical variables comprise the positions, momenta, and other variable properties of all particles. It can exhibit, in theory, three types of dynamical behaviours:

  1. A low-dimensional regular (i.e., non-chaotic) dynamics, i.e., a fixed-point, periodic or quasiperiodic¹ one. Such dynamics are possible for very low temperatures, e.g., a completely frozen system would correspond to a fixed-point dynamics and simple lattice vibrations would correspond to periodic dynamics. For higher temperatures, however, such dynamics do not correspond to what we observe in reality and simulations.

  2. A high-dimensional regular dynamics, i.e., a quasiperiodic¹ dynamics. Such a system could be described as the superposition of many independent periodic processes, each having a different, incommensurable frequency. While these processes need not each correspond to a single particle and may be rather hidden, there is no reason why they should not interact at all (for a high temperature and sufficiently many processes). Moreover, it can be argued that high-dimensional quasiperiodicity is practically indistinguishable from chaos.

  3. A high-dimensional chaotic dynamics.

So, it makes sense to say that a system that has some entropy (i.e., whose temperature is not close to absolute zero) also exhibits a chaotic dynamics on the microscopic level. But this does not mean that the two are the same. In a multi-particle system, it’s not the mere presence of entropy that we care about but how and when it increases. So, entropy and chaos are as much linked as entropy and temperature or, say, mass and momentum.

Note that this is not really about ergodicity. Complex systems with insurmountable energy barriers (consider spin glasses) can still be chaotic; and quasiperiodic dynamics can be ergodic (consider the example of a single particle moving on a square torus, which is ergodic and quasiperiodic if the components of its momentum are incommensurable, but periodic and not ergodic otherwise; see the sketch below).
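A minimal numerical sketch of that torus example (the sampling and grid parameters are my arbitrary choices): free motion on the unit square torus, $(x, y) = (v_x t, v_y t) \bmod 1$, coarse-grained into cells.

```python
import numpy as np

def visited_fraction(vx, vy, steps=400_000, dt=0.01, bins=50):
    t = np.arange(steps) * dt                     # sample times along the orbit
    x, y = (vx * t) % 1.0, (vy * t) % 1.0
    cells = set(zip((x * bins).astype(int) % bins,
                    (y * bins).astype(int) % bins))
    return len(cells) / bins**2                   # fraction of cells visited

print("vy/vx = sqrt(2):", visited_fraction(1.0, np.sqrt(2)))  # ~1: dense orbit
print("vy/vx = 2      :", visited_fraction(1.0, 2.0))         # small: closed orbit
# The irrational frequency ratio densely fills the torus (ergodic with
# respect to area) yet is not chaotic: nearby trajectories separate only
# linearly in time, never exponentially.
```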

Finally, note that the information-theoretic concept of entropy is used to characterise dynamical systems and to distinguish between chaos and regularity, but I assume this is not the reason for your question.
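For completeness, here is a sketch of one such information-theoretic measure (my own choice of illustration, not the only one): permutation entropy (Bandt & Pompe 2002), the Shannon entropy of the ordinal patterns of short windows of a time series, normalised to [0, 1]. It is low for regular signals and markedly higher for chaotic ones; the logistic-map test signals below are arbitrary examples.

```python
import numpy as np
from math import log, factorial

def permutation_entropy(x, m=4):
    counts = {}
    for i in range(len(x) - m + 1):
        pattern = tuple(np.argsort(x[i:i + m]))   # ordinal pattern of window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    # Shannon entropy of the pattern distribution, normalised by log(m!)
    return float(-(p * np.log(p)).sum() / log(factorial(m)))

def logistic(r, n=10_000, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

print("regular (r = 3.2):", permutation_entropy(logistic(3.2)))  # low
print("chaotic (r = 4.0):", permutation_entropy(logistic(4.0)))  # much higher
```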


¹ Quasiperiodic dynamics are dynamics which can be described as a superposition of at least two periodic dynamics but are not themselves periodic (which is why the frequencies of the sub-dynamics must not be commensurable, i.e., must not have a common multiple). While the trajectory of a periodic dynamics is a topological circle, the trajectory of a quasiperiodic dynamics densely fills a torus or hypertorus.

Wrzlprmft
  • 6,242
  • 1
    I did not understand everything in your answer (my background is not in Physics), but I have come to think that the definition of entropy itself, say in Statistical Mechanics, is along the lines of microscopic freedom, as Christoph mentioned above, and that the link with chaos is more like a theorem, which does not hold for all systems, but it seems to hold for a big nice special class of systems. This of course brings up many interesting questions (which may be the subject of another post I suppose). – Malkoun Jun 25 '16 at 05:56
  • I did not understand everything in your answer (my background is not in Physics) – Can you elaborate what you did not understand, so I can improve my explanation? — that the link with chaos is more like a theorem – I never saw it formulated as such and I do not think that it would have any useful applications. Chaos theory and statistical mechanics tend to look at vastly different systems and vastly different aspects of systems. Also, keep in mind that almost every real system features entropy and almost every real system features chaos. – Wrzlprmft Jun 25 '16 at 06:40
  • I think what you are arguing for, is that chaotic and ergodic are two different properties of a system, am I right in interpreting your argument? Anyway, I am just learning about all this terminology. I would like to ask you, or someone else, to possibly recommend something to read containing the definitions of all these dynamical systems words. Any suggestion for a nice dynamical systems book, or online notes, or paper? I would like to see the words: regular, chaotic and ergodic, all clearly defined please. – Malkoun Jun 25 '16 at 07:36
  • In your classification, you seem to rule out the existence of high-dimensional regular dynamics. Is that a well-known fact? – I do not dispute the existence of high-dimensional regular dynamics (in fact, I encounter such systems in my research); I rule out the existence of regular, realistic many-particle dynamics (with some temperature). This should be impossible to validate in experiment or model, as we cannot distinguish a high-dimensional quasiperiodic dynamics from a chaotic one. However, as quasiperiodic dynamics are quite special, I argue that they should not prevail in reality. – Wrzlprmft Jun 25 '16 at 07:41
  • Ok, I understand. I apologize for misquoting you (I deleted that previous comment). – Malkoun Jun 25 '16 at 07:44
  • @Malkoun: […] recommend something to read containing the definitions of all these dynamical systems words. – I just stumbled upon these notes, which defines almost all those terms. However, be careful about the fourth part (“and yet more”), which, while technically correct, does not issue appropriate warnings against conflating similarly named concepts (such as molecular chaos and chaos). Also, the notion that all chaotic systems are ergodic is wrong, e.g., there are very simple chaotic systems that are weak ergodicity breaking. – Wrzlprmft Jun 25 '16 at 08:06