19

I have a lot of confusion around the concept of Entropy.

If we know a system's entire microstate, the number of ways it can exist in that microstate is 1. So, if we fully define our system, it will have zero entropy (by the multiplicity definition, and by the Gibbs entropy as well).

What do we mean exactly when we say the universe tends to increase in Entropy? Assuming only classical physics, the universe's state at any time can be fully defined. There is only one way the universe can be in that state, so its entropy should always be zero.

Perhaps entropy can only be defined in terms of a macrostate. In this case, does maximizing entropy tell us how the microstate evolves given a certain macrostate? Or does maximizing entropy tell us how the macrostate evolves in time (or both)?

Thanks so much to anyone who might be able to help me understand this!

  • These may help: https://physics.stackexchange.com/q/583603/226902 and https://physics.stackexchange.com/q/450977/226902 – Quillo Mar 01 '23 at 07:39
  • 1
    Overall, entropy reflects the number of ways a system can transition between $N$ microstates. The fact that a system exists in a particular microstate does not say anything about its entropy, just as the fact that you experience a particular emotion at the moment does not say anything about your character or traits. – Agnius Vasiliauskas Mar 01 '23 at 13:54
  • 1
    Personally, I found that entropy made a lot more sense to me after I had studied it in the context of measure theory and Kolmogorov's axioms. That purely mathematical approach generalizes away from just being about the physics, but it also helps me make sense of the physics. – Galen Mar 01 '23 at 20:45
  • Where do you get 'if we fully define our system, it will have zero entropy' when we all know that in practical terms, that can't be true? – Robbie Goodwin Mar 01 '23 at 21:38
  • 1
    Entropy is statistical. An analogy is a lottery: Your odds of winning are millions to 1 against, but the actual winner's odds of having won are 100%. – Barmar Mar 02 '23 at 15:49
  • Being confused by Entropy feels like a rite of passage of undergraduate physics. Congrats! – akozi Mar 02 '23 at 20:57
  • The best way to understand "entropy" is to first learn what "information" means in this context: start by reading Hartley, R. V. L., "Transmission of Information". – Joako Mar 04 '23 at 03:18
  • There exist many entropies, often confused: https://physics.stackexchange.com/a/739917/247642 – Roger V. Mar 15 '23 at 13:07

11 Answers

23

When you know everything about a system, it stops being a statistical system, and you can just calculate its exact time evolution. In this time evolution there is no increasing entropy. Statistical mechanics becomes useful only if you have an ensemble of possibilities, i.e. a macrostate.

Note that in a (classical) many-particle system, even if you have some information about every particle, as long as you don't know their state with infinite precision, that information will quickly become useless due to chaotic behaviour. Only macroscopic variables are conserved well enough to predict them for a longer time - this is what you usually call the macrostate.

w123
  • 744
  • This is a useful observation, but it doesn't by itself answer OP's question of what we mean when we say the universe tends to increase in entropy. – HelloGoodbye Mar 02 '23 at 17:09
15

In an isolated system (this specification is essential), Entropy is the logarithm of the number of microstates ($\Omega$) corresponding to a fixed macrostate. For simple fluid systems, the macrostate is characterized by energy, volume, and number of moles.

There is no operative way to assign Entropy to a single microstate.

When we say the universe tends to increase in Entropy, we mean that if we have an isolated system (the thermodynamical universe) at equilibrium and with some internal constraint (for example, a wall confining a gas in half of the system) and we release that constraint, the new (unconstrained) macrostate will correspond to a larger set of microstates. When the system reaches the new equilibrium, its Entropy will be larger than the original. Notice that Entropy in Thermodynamics, as well as in Statistical Mechanics, is an equilibrium property.

Therefore, entropy can be defined only in terms of a macrostate. However, maximizing entropy does not give us a detailed description of how the microstate evolves given a certain macrostate. We can only say that typical microstates for the constrained equilibrium system usually become improbable for the same system when it reaches equilibrium after removing the constraint.
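To make this counting concrete, here is a minimal numerical sketch (added for illustration, not part of the original argument; the cell discretization of the box and the use of distinguishable, non-interacting particles are simplifying assumptions). Confining $N$ particles to half of the available cells and then releasing the wall multiplies the number of microstates by $2^N$, so the entropy grows by $N k_B \ln 2$:

```python
import math

k_B = 1.380649e-23  # J/K

def entropy(num_cells, num_particles):
    # S = k_B ln(Omega) with Omega = num_cells**num_particles
    # (distinguishable, non-interacting particles; a toy counting model)
    return k_B * num_particles * math.log(num_cells)

N = 6.022e23        # one mole of gas particles
M = 10**6           # arbitrary number of cells discretizing the full volume

S_constrained = entropy(M // 2, N)   # wall confines the gas to half the cells
S_released    = entropy(M, N)        # wall (internal constraint) removed

print(S_released - S_constrained)    # ~ N k_B ln 2 ≈ 5.76 J/K, as for free expansion
```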

  • Thank you for this answer. I especially appreciate your discussion about probable microstates before/after a constraint. – Matthew Smith Mar 01 '23 at 19:05
9

You are onto an important and subtle question that many physicists have spent a lot of time debating.

Entropy as "ignorance"

According to one view [1], the level of entropy is a measure of "how much you don't know about the state of the system." This means that the same bowl of water might have a different entropy for two different people. Let's imagine we are looking at an ideal gas. I know the pressure, volume, and temperature, but nothing more. There are many possible microstates that could give me those same pressure, volume, and temperature measurements, and I don't know which of them the gas is in. So for me the possibilities are many, and the entropy is high.

Now a supercomputer looks at the same gas, but it has already made measurements with some fancy tech and knows the positions and velocities of all the particles exactly. There is only one state compatible with its measurements, so it assigns an entropy of zero.
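To see this observer dependence numerically, here is a small sketch (my own addition) using the Gibbs/Shannon form $S = -k_B \sum_i p_i \ln p_i$; the number of compatible microstates $W$ is a made-up illustration value:

```python
import numpy as np

def gibbs_entropy(p, k_B=1.0):
    """S = -k_B * sum_i p_i ln p_i, skipping zero-probability terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

W = 10**6   # hypothetical number of microstates compatible with (P, V, T)

# Me: only P, V, T known, so uniform ignorance over all W compatible microstates.
S_me = gibbs_entropy(np.full(W, 1.0 / W))   # equals ln W (in units of k_B)

# Supercomputer: exact microstate known, so all probability on a single state.
exact = np.zeros(W)
exact[0] = 1.0
S_computer = gibbs_entropy(exact)           # exactly 0

print(S_me, np.log(W), S_computer)
```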

Entropy of an Isolated System

The low-level interactions in physics are reversible. A consequence of this is that in an isolated system we can't have physics that evolves state $A$ into state $B$ ($A \rightarrow B$) and also evolves state $C$ into state $B$ ($C \rightarrow B$), because if this could happen, then, when we reverse things, how would $B$ know whether it is supposed to turn back into $A$ or $C$? The "isolated" caveat is important because we can have physics that does something like $\{A, E \rightarrow B, L$ and $C, E \rightarrow B, M\}$, meaning that both $A$ and $C$ can turn into $B$, but in doing so they change some other object, initially in state $E$, to either $L$ or $M$, and that object "remembers" where we started.

Let's say we have done enough measurements to know that the system is either in state $A$ or $C$, each with 50% probability. If both $A$ and $C$ led to the same final state when "physics happens", then we could eventually know for certain that the system was in state $B$. But this cannot happen, because the reversibility of physics forbids it (in an isolated system).

Do Measurements reduce entropy?

If we have a measurement device initially in state $X$ (meaning "not used yet") then we can use it to make a measurement of a system. The system is either in state $A$ or $C$ and we don't know which (assigning 50% probability to each). The ideal measuring device might do something like $\{ A, X \rightarrow A, Y $ and $C, X \rightarrow C, Z\}$, so that the device gives us a $Y$ if the system was in $A$, and a $Z$ if it was in $C$. So we now know more about the system, and (to us) it has a lower entropy. Yes indeed. However, our measuring device started out in a nice state $X$, ready to be used. And it turns out that re-setting our measurement device so it is ready to be used again on a different system (or the same one again) will create entropy. (We can't have $\{Y \rightarrow X$ and $ Z \rightarrow X \}$, remember, so it can't be reset in isolation.) In this case it's as if our measurement apparatus contained a small amount of negative entropy that we moved into the system.

Entropy of the Universe

Whose entropy of the universe? Well, ours is the only one we know about. We are ignorant of so much (the velocities of so many gas particles), and I have tried to motivate vaguely why learning about the velocities of those gas particles requires us to scramble up information we already had. In terms of pure information bulk we cannot reduce our ignorance - but it turns out (perhaps unsurprisingly) that increasing one's ignorance by just making a big mess is quite easy.

[1] https://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf

Dast
  • 1,796
  • 1
    Subjective interpretation of entropy is problematic if you want to connect with classical thermodynamics, since entropy can be measured (by quasistatically heating up from 0 to the temperature, measuring the heat flow and integrating $dQ/T$). So it won't give you what is traditionally the entropy state function. – lalala Mar 02 '23 at 10:36
  • 1
    It's important to remember that, when we measure entropy in comfortable units, the differences caused by varying the subjective definition of a macrostate are around 10^-23 , which is usually inconsequential. – Matt Timmermans Mar 02 '23 at 13:53
  • @lalala It's certainly plausible that the subjective interpretation has flaws in that regard, and what you are saying sounds similar to a conversation I once had but didn't fully understand. If you know anywhere (maybe a wiki page or paper) where these problems are laid out, please let me know. – Dast Mar 03 '23 at 10:07
  • 1
    @lalala No, it is not problematic. In every thermodynamic experiment, you have to make a choice of (macroscopic) variables you can observe and manipulate. There is not "the" entropy, there are many different entropy functions, depending on the set of variables you consider. So to one physical system you can associate various different thermodynamical systems with different entropy functions. See basically any paper of Jaynes. And one more thing: Entropy is not a mechanical property; it depends on what you mean with "measure". Do we measure temperature? Well, no, in a certain sense: [...] – Tobias Fünke Mar 03 '23 at 10:34
  • you measure some mechanical quantity (say length contraction or expansion) which you relate, in your theory, with the thermodynamical concept of temperature. Similarly for entropy. – Tobias Fünke Mar 03 '23 at 10:37
8

The entropy definition is as follows:

$$S = k \ln \Omega (x_{1}, \ldots, x_n)$$

Where $\Omega$ is the number of microstates that correspond to a system with some macroscopic parameters (the ones you are actually interested in, like energy, volume, etc.) $x_{1}, \ldots, x_n$.

It is not the "current microstate" or however you're viewing it in the question.
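As a toy illustration of this formula (a sketch with assumed numbers, not tied to any particular physical system): take $N$ two-level spins and let the only macroscopic parameter be the number of "up" spins, so $\Omega$ is a binomial coefficient:

```python
from math import comb, log

def entropy_spins(N, n_up, k=1.0):
    # Macrostate: (N, n_up).  Omega = C(N, n_up) counts the spin
    # configurations (microstates) that realize this macrostate.
    return k * log(comb(N, n_up))

N = 100
for n_up in (0, 10, 25, 50):
    print(n_up, entropy_spins(N, n_up))
# n_up = 0 has Omega = 1 (a fully specified microstate, so S = 0);
# n_up = 50 is the macrostate realized by the largest number of microstates.
```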

Sgg8
  • 513
6

"the number of ways it can exist in that microstate is 1" This is not the definition. The classical (mechanical) definition is rather based on the number of microstates that would result in the same macrostate. The fact that you know the actual microstate does not change the fact that if you looked purely at the macrostate then you wouldn't know which exact microstate that underlies it (similarly to if you were being told only a Yatzi result, then you wouldn't in general be able to tell the exact dice-and-eyes configuration.)

Steeven
  • 50,707
  • 8
    This is pretty much a "shut up and calculate" answer. Most textbooks pretend there is some objective way to define the macrostate, and this is a useful fiction to calculate thermodynamical stuff, but it doesn't hold up when you take an information-theoretical perspective. – w123 Mar 01 '23 at 14:36
  • 2
    and what is a macrostate - it is based on what you can observe from the outside - yes? So if you can observe everything then every microstate is a macrostate – user253751 Mar 01 '23 at 15:57
  • 1
    @user253751 A macro state is "this bucket is filled with 1L of water at 20°C". A microstate is an exact (to infinite detail) description of the water molecules including impurities etc. - their orientation, position, velocity vectors, electromagnetic properties etc. – AnoE Mar 02 '23 at 13:31
  • 1
    @w123 I don't understand your comment at all. There absolutely is an objective way to define a macrostate - that's what all of school physics, and plenty of real-world applied physics is doing all the time (when shooting a rocket to the moon, knowing the macro-state (for some definition of what that means) is enough and absolutely possible; it is not theoretical). – AnoE Mar 02 '23 at 13:33
  • 1
    @AnoE and why is that a macrostate? – user253751 Mar 02 '23 at 13:33
  • @user253751, per definition? – AnoE Mar 02 '23 at 13:34
  • @AnoE and why is it defined like that? – user253751 Mar 02 '23 at 13:34
  • @user253751, did you try googling? https://www.theory.physics.manchester.ac.uk/~judith/stat_therm/node55.html – AnoE Mar 02 '23 at 13:43
  • @AnoE it says it's because it's what we can observe. Which is what I originally said. – user253751 Mar 02 '23 at 13:45
  • 1
    Whatever you intended to achieve with this style of comments - you win. Congrats! – AnoE Mar 02 '23 at 13:46
6

Building off the other answer, if you think of an ideal gas with $PV = RT$ (for one mole), and you know the current macrostate ($P$, $V$, and $T$), you could then ask what is happening in the micro sense. Each particle of gas has a location and velocity at any moment; all you know is that the average over all of those locations and velocities gives the macroscopic behavior $PV = RT$. The entropy is related to the number of ways the system can be arranged so that this macro behavior is observed.
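As a rough sketch of "many microstates, one macrostate" (all specific numbers below are made-up illustration values): several independently drawn sets of particle velocities at the same temperature give essentially the same pressure through the kinetic-theory relation $PV = \frac{2}{3}E_{\text{kin}}$:

```python
import numpy as np

rng = np.random.default_rng(0)

k_B = 1.380649e-23   # J/K
T   = 300.0          # K
m   = 6.6e-27        # kg (roughly a helium atom)
N   = 100_000        # particles in the toy box
V   = 1e-3           # m^3

sigma = np.sqrt(k_B * T / m)      # std of each velocity component at temperature T

for i in range(3):                # three different microstates, same macrostate
    v = rng.normal(0.0, sigma, size=(N, 3))
    E_kin = 0.5 * m * np.sum(v**2)
    P = (2.0 / 3.0) * E_kin / V   # kinetic-theory pressure
    print(f"microstate {i}: P = {P:.5e} Pa  (N k_B T / V = {N * k_B * T / V:.5e} Pa)")
```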

5

The whole point is that the microstate is not known and it cannot be known. All we know is the macrostate. Entropy is the log of the number of microstates that are possible for the given macrostate. It is as if we are trying to guess the microstate by looking at the system from the outside: the more the microstates inside, the harder it is to guess.

Themis
  • 5,843
5

There is more than one definition of Entropy. We kept on adding definitions because we kept on running into things that behaved like Entropy.

It might help to think of Entropy as information. What information does it take to describe a system? How much information?

"Entropy always increases" is a side effect of "information is not destroyed". As your system evolves, the amount of information it takes to describe it could be the information required to describe the initial state, plus the information required to describe how it evolved.

But, you say, evolution is defined by its initial state!

But this isn't true; in a quantum world, your initial observations of the state are not sufficient to predict your future observations. What's more, any series of observations isn't sufficient to fully predict future observations. And even if you just want distributions, every observation will impact the distributions of what future observations will look like; you need to remember how you interacted with the system in order to know what state the system is in. And every interaction adds to the amount of information you need to store in order to fully describe its state.

Backing up a second, and without quantum mechanics, presume you know a lot about the world, but not everything. There are error bars on your ability to describe the universe.

So you take your universe's states and you divide it up into "given my level of observation, this is what I can distinguish between" pieces.

Next, I'm going to require that you don't have a near infinite number of possible states. Instead I'm going to require that you have the same number of states at each power of (say) 10 level of precision.

So if your ability to see the universe is accurate to within 1/10^30 meters, I'm asking for an equal number of states of precision 10^30, 10^29, 10^28, 10^27, 10^26 etc.

The lower precision buckets are going to be far larger than the higher precision buckets. But you are free to arrange states in any bucket.

If we count the number of microstates in the large buckets, it will be much larger than the number of microstates in the smaller buckets.

"Entropy always grows" could be saying "you can't arrange the states so that the universe doesn't move into the bigger buckets over time, and once it does it won't go back into the smaller buckets with near certainty".

Higher entropy states have insanely more microstates than lower entropy states. If we divide the universe in any halfway sensible way into macrostates, the universe will evolve from low-entropy macrostates to high-entropy macrostates, because staying within low-entropy macrostates requires us to pick the microstates within the macrostates with insane precision. There isn't enough room to fit the microstates we could evolve into inside the low-entropy macrostates.

The only way around this is to create something akin to 1 macrostate per microstate. Then we can no longer measure entropy, because everything has 0 entropy.

You can get mathematical about this. Our ability to describe the universe with mathematics is an example of an insane ability to compress the states of the universe. And, from computational complexity theory (another entropy), perfect compression is impossible: you cannot take a system that has 10^30 states and describe it in a way that can fully specify every state in less than log_2 10^30 bits.

As mathematical descriptions of the universe in its current state provide a highly accurate compressed description of what the future universe will look like, that means we are in a low entropy universe.

There are going to be far, far more universe states in which any attempt to build a mathematical model to predict future behavior means that the (description of the universe) + (mathematical model) to predict (future universe state) will be larger than the information we get about the future universe state.

This includes counting. Counting is an amazing compression; 1 sheep, 2 sheep. By saying we have 27 sheep, I'm describing sheep once, and then using log(number of sheep) bits of information to say we have a bunch of stuff like that one thing.

If sheep is a meaningful term, then saying "this is a sheep, with the following differences" is less information than fully describing the sheep.

Universes with 2 of some structure or arrangement of things that can be described this way -- in which counting has meaning -- are far fewer in number than ones without such structure.

If we have 10^30 subatomic particles (roughly a human) and we need 10^10 bits to describe each of their locations, that is 10^40 bits to describe the matter of a human.

Fully describing what a human is (and isn't) might require 10^40 bits of information as well. But once you have done so, it might only take 10^30 bits to describe "this particular human".

If there are two humans in the universe, then you can say "10^40 bits for a human, then 2 times 10^30 bits for each of the humans".

This is efficient compression. And universes that can be compressed this way - by any algorithm - are in the minority.
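A back-of-the-envelope sketch of that compression arithmetic (all bit counts below are invented placeholders, only meant to show where the saving comes from):

```python
from math import ceil, log2

bits_full_description = 10**6   # invented: bits to describe one sheep from scratch
bits_per_difference   = 10**3   # invented: bits for "this sheep differs by ..."
n_sheep = 27

naive      = n_sheep * bits_full_description
compressed = (bits_full_description            # describe the "sheep" template once
              + ceil(log2(n_sheep))            # ~5 bits to say "27 of them"
              + n_sheep * bits_per_difference) # small per-sheep corrections

print(naive, compressed)   # the shared template makes the description far shorter
```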

"Entropy always increases" can be converted to "for any model of physics, the universe will evolve into a state where the model of physics is less useful".

Yakk
  • 4,292
  • Thank you for such a thoughtful answer. I'm realizing entropy has far more to it than my intro textbook (Schroeder) would have me believe. I also think "If we divide the universe in any halfway sensible way into macrostates, the universe will evolve from low-entropy macrostates to high-entropy macrostates" is a very elegant way of putting it! – Matthew Smith Mar 01 '23 at 19:09
  • @MatthewSmith It can also help to realize that entropy is a logarithmic scale. Each point of entropy multiplies the number of states of one previous point. And when we work with entropy, we end up adding huge numbers. This corresponds to insanely huge multiplications of the number of states described. A macrostate with 10,000 more entropy has 1 followed by 3000 zeros times more states in it than one with a mere 10,000 less entropy. This is really, really big. And 10,000 units of entropy is a tiny physical event, like a grain of sand cooling off slightly. – Yakk Mar 03 '23 at 15:27
1

One perspective that really helped me understand entropy is connecting the dots between the thermodynamic and statistical mechanical definitions.

See Enrico Fermi's book Thermodynamics for a good explanation of Clausius' theorem and the definition of entropy:

$$S(A)=\int_O^A \frac{dQ}{T} \tag{1}$$

Where $O$ is an arbitrary state which we can assign $S(O)=0$. Taking a differential of $(1)$ gives $dQ=TdS$.

Pathria's Statistical Mechanics develops Boltzmann's classic equation for entropy:

$$S=k_B\ln\Omega \tag{2}$$

This can be established via two assumptions:

  1. Holding $N$ and $V$ constant, $\Omega \equiv \Omega(E)$
  2. For two systems in thermal contact (with combined multiplicity $\Omega_0 = \Omega_1 \Omega_2$), equilibrium occurs when $\frac{\partial \Omega_0}{\partial E_1} = \frac{\partial \Omega_0}{\partial E_2} =0$

With these assumptions in place we establish some idea of temperature:

$$\frac{\partial \ln \Omega_1}{\partial E_1} = \frac{\partial \ln \Omega_2}{\partial E_2} = \text{constant} \tag{3}$$

Why temperature? It's the intrinsic thermodynamic state variable that we know is the same for two systems in thermal contact, after they equilibrate.

Then it's just a matter of definitions: the constant in $(3)$ is identified as

$$\beta=\frac{1}{k_B T}$$

There are other arguments as to why $\beta=\frac{1}{k_B T}$ which I won't derive here - but it is derivable when considering an ideal gas and the canonical ensemble.

It does not matter whether we are referring to system 1 or system 2; the same equation applies in equilibrium, so using the total average energy is still valid:

$$\frac{\partial \ln \Omega}{\partial U} = \frac{1}{k_B T}$$

At this point it's just a matter of plugging in the definition of $(2)$ to see that it is valid:

$$\frac{1}{k_B}\frac{\partial S}{\partial U} = \frac{1}{k_B T}$$

Via the 1st law, and with constant $N, V$,

$$TdS=dU=dQ \tag{4}$$

Hopefully $(4)$ convinces you of the legitimacy of these definitions of entropy. There are more arguments regarding entropy in the domain of thermodynamics and they are all self consistent.
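If you want to check the chain $(2)$-$(3)$ numerically, here is a small sketch (my addition, in units where $k_B = 1$ and the level spacing is 1) for $N$ two-level systems: the microcanonical $\partial \ln\Omega/\partial E$ matches the $\beta$ inferred from the canonical occupation of the same macrostate:

```python
from math import lgamma, log

N = 10**6                 # two-level "atoms"; Omega(n) = C(N, n), E = n * eps, eps = 1

def ln_omega(n):
    # ln C(N, n) via log-gamma, to avoid astronomically large integers
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

n = 200_000                                   # macrostate: 20% of atoms excited
beta_micro = ln_omega(n + 1) - ln_omega(n)    # finite-difference d(ln Omega)/dE
beta_canon = log((N - n) / n)                 # canonical: n/N = 1/(exp(beta) + 1)

print(beta_micro, beta_canon)                 # both ≈ ln 4 ≈ 1.386
```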

michael b
  • 782
1

You are alluding to the following definition of entropy:

The entropy of an isolated system (in equilibrium) with a given macrostate is (proportional to the logarithm of) the number of microstates (or their volume in phase space) that are consistent with this macrostate.

This is indeed a very useful and fruitful definition. However, it has at least one problem: it makes entropy subjective, i.e. observer-dependent. The reason is that it's not entirely clear what a macrostate is. We think of a macrostate as a specification of all degrees of freedom a system has that can be sensed/measured at the macroscopic level, e.g. at coarse length and time scales. But what if a different observer can sense a different set of degrees of freedom? Should they assign a different value for the entropy? For example, an alien who can easily sense and track $10^{23}$ particles individually would identify a microstate with a macrostate and would calculate an entropy of 0 (and their system would rarely be in equilibrium...).

(This issue may not be so terrible, because we can use integrals of motion, etc., as the set of macroscopic observables...)

(Other relevant flaws of this definition are that it applies directly only to equilibrium states, and that it is not clear how to apply it to complex systems like an animal body, whose macrostate is itself very complicated...)

Instead, we can use a different definition:

The entropy of an isolated system (in equilibrium) is (proportional to the logarithm of) the number of microstates, or their volume in phase space, that the system visits at coarse time scales.

This definition has issues as well (e.g. what are coarse time scales? How do we measure the phase-space volume of a trajectory of states?). However, it solves the subjectivity issue mentioned above. It doesn't require that you define a macrostate or do any macroscopic-scale physics.

This definition agrees with the earlier one assuming the ergodic hypothesis (which I won't go into).

So: if you know the exact microstate of the universe at a given point in time, and you want to calculate the entropy, you have two options:

  1. Figure out what the macrostate is, and use the first definition of entropy. But you run into a problem because it's not clear how to specify a macrostate here - it's observer dependent (and why should we even talk about macrostates when we can handle microstates???). This is the issue your question is addressing.

  2. Figure out how much volume in phase space the universe goes through in some reasonable amount of time, and use the second definition of entropy. The microstate of the universe rapidly changes each moment in time, and you need to track (or calculate) how many unique microstates it goes through. With this definition, you don't have the issue you mentioned in your question.

Lior
  • 3,309
0

In Statistical Physics, entropy $S$ is defined as a measure of our ignorance of the precise state of the system. Denote by $\wp_i$ the probability that the system is in microstate $i$. Entropy is defined as $$S[\wp]=-k_B\sum_i\wp_i\ln\wp_i$$ Note that other definitions exist, but they usually lead to a loss of extensivity.

Why does entropy increase in time? In classical physics, a microstate is defined as a given cell of phase space. The hypervolume of a cell is finite ($h_0^N$ for $N$ particles). This is necessary because, at some point, we will need to count numbers of microstates. However, the consequence is that two points $(\vec r_1,\ldots,\vec r_N;\vec p_1,\ldots,\vec p_N)$ of the same cell are considered to correspond to the same microstate, even though they do not correspond to exactly the same mechanical state. Some information is therefore lost in this discretization of phase space. If we knew precisely the mechanical state $(\vec r_1,\ldots,\vec r_N;\vec p_1,\ldots,\vec p_N)$ of the system at time $t=0$, we could in principle integrate the laws of mechanics (Hamilton's equations) and get the trajectory at any time $t$. We do not know the initial mechanical state but only the initial microstate, i.e. the initial cell of phase space. The points of this cell lead to different trajectories that tend to get farther and farther from each other with time (because of collisions between particles). Starting from a given cell of phase space, you have a non-zero probability to find the system in a number of different cells that increases with time. Your ignorance of the system, i.e. entropy, increases with time.

At very large times, you have completely lost all information about the initial microstate. The system can be found in any cell of the accessible domain of phase space. Your ignorance is maximal. If the system is isolated, its energy $E$ is fixed. Maximizing $S[\wp_i]$ with the constraint $\sum_i\wp_i=1$ leads to $$\wp_i={\rm Cst}$$ If the number of cells whose energy $E_i$ is equal to $E$ is $\Omega(E)$ (hopefully, we have discretized phase space so we can count them!), the constant is $1/\Omega(E)$. Plugging into the definition of statistical entropy, the usual expression is recovered: $$S(E)=k_B\ln\Omega(E)$$ If the system is at equilibrium with a thermal bath, its energy is not fixed but can fluctuate. However, the average energy $E=\sum_i E_i\wp_i$ is fixed and is expected to depend on the temperature of the bath. Maximizing $S[\wp_i]$ with the two constraints $\sum_i\wp_i=1$ and $\sum_i E_i\wp_i=E$ leads to the canonical distribution $$\wp_i={1\over{\cal Z}}e^{-\beta E_i}$$ where ${\cal Z}=\sum_i e^{-\beta E_i}$. Plugging into the definition of statistical entropy, we get the relation $$F=-k_BT\ln{\cal Z}=E-TS$$
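A quick numerical sanity check of the last relation (a sketch added here, in units $k_B = 1$, with an arbitrary made-up spectrum $E_i$): for the canonical distribution, $-T\ln{\cal Z}$ and $E - TS$ coincide:

```python
import numpy as np

T = 2.0
beta = 1.0 / T
E_levels = np.array([0.0, 0.5, 1.3, 2.0, 3.7])   # hypothetical energy levels

weights = np.exp(-beta * E_levels)
Z = weights.sum()
p = weights / Z                                   # canonical distribution

S = -np.sum(p * np.log(p))                        # statistical entropy
E = np.sum(p * E_levels)                          # average energy

print(-T * np.log(Z))                             # F = -T ln Z
print(E - T * S)                                  # identical, as claimed
```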

Christophe
  • 3,548