18

Is the entropy of every system zero at absolute zero?

Or is it just taken to be zero at absolute zero?

Are there systems that don't reach zero entropy even at absolute zero?

zeal
  • 275
  • 1
  • 1
    No, and here's a counter-example. The entropy of an ideal gas is $S=Nk_B \log(V/(N\lambda^3)+5/2)$, where $\lambda$ is the thermal de Broglie wavelength, which diverges as $T \to 0$ (it scales as $T^{-1/2}$). Therefore, if we set $T=0$, we still have $S=Nk_B \log(5/2)$, which is clearly not zero. – JamalS Apr 25 '14 at 08:55
  • 1
    @JamalS I don't think that equation holds for $T=0$ simply because I cannot think of a material that would still be an ideal gas (or a gas at all for that matter) at absolute zero. I could be wrong though – Jim Apr 25 '14 at 12:57
  • 1
    @JamalS The ideal gas approximation does not hold in the limit $T \rightarrow 0$. It is unphysical in that limit. – Andrew Steane Sep 13 '20 at 13:11
  • @AndrewSteane is there a counter example to the question you can think of that does hold under $T \to 0$? – JamalS Sep 13 '20 at 14:58

5 Answers

20

Not quite. Some systems can be in their ground state and still have nonzero entropy. For example, there may be several states a system can be in, all with the same minimum ground-state energy, and the entropy will therefore be $S=k_B \log N_G$, where $N_G$ is the number of degenerate (equal energy eigenvalue) distinguishable ground states the system can occupy and $k_B$ is the Boltzmann constant. If these degenerate ground states are not equiprobable, the entropy is $S = -k_B \sum\limits_j p_j \log p_j$, where $p_j$ is the probability of finding the system in its $j^{\text{th}}$ ground state. Likewise, an imperfect crystal will have entropy "frozen into" the deviations of the actual crystal from an unflawed version: a nonzero number of bits is needed to specify the deviation from the unflawed version of the crystal.
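As a minimal numerical sketch of the two entropy expressions above (writing the Gibbs form with its conventional minus sign), in Python:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def boltzmann_entropy(n_ground):
    """S = k_B ln(N_G) for N_G equiprobable degenerate ground states."""
    return k_B * math.log(n_ground)

def gibbs_entropy(probs):
    """S = -k_B sum_j p_j ln p_j for arbitrary ground-state probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A doubly degenerate ground state with equal probabilities
# reduces to the Boltzmann form k_B ln 2:
s_equal = gibbs_entropy([0.5, 0.5])
assert math.isclose(s_equal, boltzmann_entropy(2))

# Unequal probabilities give strictly less entropy:
s_biased = gibbs_entropy([0.9, 0.1])
assert s_biased < s_equal
```

The equiprobable case maximises the Gibbs expression, which is why $k_B \log N_G$ is the largest entropy a degenerate ground state can carry.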

The Nernst Heat Postulate, or Third Law of Thermodynamics, is sometimes rendered as "The entropy of a perfect crystal at absolute zero temperature is zero", and is clearly worded in classical terms. For a quantum system one uses the von Neumann entropy, which is essentially the $S = -k_B \sum\limits_j p_j \log p_j$ definition above and, because it can be nonzero at absolute zero temperature, the third law so stated is not really a useful concept for quantum systems.
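A hedged sketch of the von Neumann entropy, $S = -k_B\,\mathrm{Tr}(\rho \ln \rho)$, computed from the eigenvalues of an assumed density matrix $\rho$ using NumPy:

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def von_neumann_entropy(rho):
    """S = -k_B Tr(rho ln rho), evaluated via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 * ln 0 -> 0 by convention
    return -k_B * np.sum(evals * np.log(evals))

# A pure (non-degenerate) ground state carries zero entropy:
rho_pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])
assert np.isclose(von_neumann_entropy(rho_pure), 0.0)

# An equal mixture over two degenerate ground states gives S = k_B ln 2,
# nonzero even though the system is at its minimum energy:
rho_mixed = np.eye(2) / 2
assert np.isclose(von_neumann_entropy(rho_mixed), k_B * np.log(2))
```

The mixed-state example is exactly the situation the answer describes: the system is "at absolute zero" energetically, yet its entropy is $k_B \log 2$, not zero.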

A practical answer, though, is often that the third law is an excellent approximation: the quantities of entropy "frozen" into ground states are almost always utterly negligible compared with the amounts of entropy a system takes on when thermalising after the absorption of some quantity of heat pushes the system away from absolute zero.
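To put rough numbers on this, here is an illustrative comparison using Pauling's classic estimate of the residual entropy of ice, $R\ln(3/2)$ per mole (ice is actually one of the famous cases where frozen-in entropy is large enough to measure), against a typical literature value for the standard molar entropy of liquid water; both figures are indicative only:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

# Pauling's estimate of the residual (frozen-in) entropy of ice,
# from the proton-configuration degeneracy W = (3/2)^N per molecule:
s_residual = R * math.log(3 / 2)  # about 3.4 J/(mol K)

# For comparison, the standard molar entropy of liquid water at 298 K
# (illustrative literature value, not computed here):
s_thermal = 69.9  # J/(mol K)

print(f"residual entropy of ice: {s_residual:.2f} J/(mol K)")
print(f"fraction of thermal entropy: {s_residual / s_thermal:.1%}")
```

Even in this unusually large example, the frozen-in entropy is only a few percent of the entropy the thermalised system carries; in most systems the ratio is far smaller.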

  • It is pretty clear that (by definition) the entropy of a perfect crystal is zero at zero kelvin. Is this also true for fluids in general? – noir1993 Jun 05 '16 at 18:15
  • 1
    @noir1993 Actually I don't believe your first sentence is true in general (theoretically), unless of course "perfect" includes "having a complete state specification of all the atoms in the lattice". There could be degenerate ground states that don't affect the crystal geometry. Practically I can't think of an example of such a degeneracy, but there are a good many real solid state physicists (unlike me) who work with very low temperature condensates who may be able to shed light on a follow up question. For a fluid at absolute zero, well that's the same as an amorphous .... – Selene Routley Jun 05 '16 at 23:24
  • 1
    @noir1993 .... solid at absolute zero. There is always some amount of randomness in the atomic positions (by definition of "amorphous") that the specification of the macrostate $0{\rm K}$ doesn't encode. So you would need to know the additional information telling you the random positions: this further information is a nonzero entropy. – Selene Routley Jun 05 '16 at 23:27
  • Wikipedia offers a different formulation: "The entropy of a system approaches a constant value as its temperature approaches absolute zero." (https://en.wikipedia.org/wiki/Third_law_of_thermodynamics). I'm not sure what this means. Is it just the mathematical statement that the limit of the entropy as $T\rightarrow0$ exists (instead of diverging)? – a06e Mar 22 '19 at 19:00
4

Considering very recent debates on the subjects of temperature, entropy and the Gibbs paradox that I have been involved with, I came to realize that the third law is at best a good practical rule, but that's about it.

This is at least for two reasons:

  • Temperature is a pathological concept that leads to apparently contradictory conclusions when judged against the intuition of temperature we have 99% of the time. An example is the recent debate triggered by this paper on whether negative temperatures are real, which, according to the authors, do not make sense. The fact that negative temperatures are hotter than infinite ones is indeed puzzling, and the notions of warm and cold are shaken a little as well. However, none of this arises if, instead of $T$, we speak in terms of $\beta = 1/(k_B T)$, where heat always flows from a smaller beta (possibly negative) to a bigger one. The fact that absolute zero cannot be reached is also understandable, as it corresponds to an infinite $\beta$, and infinities cannot be reached. Hence, from this point of view, the Third law is already odd in that it makes a statement about a notion that leads to inconsistent conclusions if we stick to it too closely.

  • The goal of the Third law is to define an absolute entropy scale, but that cannot make sense in any practical case. I don't even need to enter the quantum realm to make my point. Any system with interactions will end up in a ground state at very, very low temperatures (or high $\beta$, I should say) and nothing will happen after that. Depending on your definition of a state, you might have some degeneracy in the ground state owing to some symmetries or something along those lines. Fair enough: you can simply define the relevant states as being those invariant under these symmetry groups, and you will get zero entropy for any system of your choice. The problem arises when you bring into contact two different systems which have nothing in common. When they were separated, you surely had to use a different definition of "microstate" for each of them to get zero entropy at low temperature. When you put them together, they may interact and yield a whole new set of symmetries in the ground state. The question is then: which definition of relevant microstates do you use so that the entropy of the whole is zero at $T=0$? I think you have to define yet another one for this combined system.
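The point about $\beta$ in the first bullet can be made concrete with a hypothetical two-level system; the energy gap below is an arbitrary assumed value, chosen only for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
eps = 1e-21         # assumed energy gap of a hypothetical two-level system, J

def beta_from_populations(p_ground, p_excited):
    """Two-level Boltzmann ratio p_e/p_g = exp(-beta*eps) => beta = ln(p_g/p_e)/eps."""
    return math.log(p_ground / p_excited) / eps

beta_normal = beta_from_populations(0.8, 0.2)    # ordinary population, beta > 0
beta_inverted = beta_from_populations(0.2, 0.8)  # population inversion, beta < 0

# Heat flows from the smaller beta to the larger one, so the inverted
# (negative-temperature) system is "hotter" than any positive-T system:
assert beta_inverted < 0 < beta_normal

# The corresponding temperatures T = 1/(k_B * beta):
T_inverted = 1 / (k_B * beta_inverted)
assert T_inverted < 0  # a "negative temperature", yet hotter than T > 0
```

In the $\beta$ picture, absolute zero is simply $\beta \to +\infty$, which makes its unreachability unsurprising.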

The bottom line is that when you have a thermodynamic system and you are interested in its phase behaviour, you need a reference point somewhere. If the system is always the same, this reference point can be taken to be the entropy at very low temperature, and every quantity will be defined with respect to this reference as long as the system itself is not changed. As a joke/example: the entropy of a tomato at $T=0$ has no reason to be zero just because the entropy of a potato at $T=0$ is zero, and vice versa.

gatsu
  • 7,202
  • 1
    I have not voted either up or down, but I wish to signal for other users that I think this discussion is mostly wrong. Entropy and internal energy are well-defined concepts in the thermodynamic limit, and consequently so is temperature (in equilibrium conditions). – Andrew Steane Sep 13 '20 at 13:18
  • @AndrewSteane Very kind of you to warn others. Feel free to downvote if you feel like it. Could you please state what is wrong in the discussion above? Have you been involved in the debate with Dunkel et al. by any chance? – gatsu Sep 14 '20 at 14:40
  • 1. The reference to negative temperature is irrelevant. 2. I think ground state degeneracy is the important point here, and it is not circumvented by picking a state which is invariant under a symmetry group. The issue is the dimensionality of the Hilbert space singled out by $H\psi = E_0\psi$. I think both tomato and potato have a non-degenerate ground state; consequently both tend to zero entropy as $T \rightarrow 0$ and this is no coincidence. – Andrew Steane Sep 14 '20 at 15:07
  • 1. Discussing negative temperatures gives rise to different definitions of entropy and temperature, so it should be relevant to the equilibrium statistical mechanics understanding of the 3rd law. 2. The ground state degeneracy will correspond to an equivalence class of wave functions set by the symmetries of the problem. Mix potatoes and tomatoes and get the corresponding entropy as $T$ tends to zero. Do you get zero? – gatsu Sep 14 '20 at 16:51
  • 1. Negative $T$ comes up for degrees of freedom where energy is upper-bounded, but all systems have access to kinetic energy with no upper bound; consequently negative temperature is never a true equilibrium, at best metastable. 2. I think you must be asserting that a mixture of potatoes and tomatoes has a degenerate ground state. Perhaps that might happen, perhaps not. If not, then the entropy is zero. – Andrew Steane Sep 14 '20 at 17:00