0

The entropy of an ideal gas is known to be $$S(V,T)=S_{0}+nR\log \left|\frac{V}{V_0}\right| +C_v\log\left| \frac{T}{T_0}\right|.$$

Now let us take a cylinder of volume $V=V_1+V_2$, separated by an impermeable partition, with $n_1$ moles of one ideal gas on one side and $n_2$ moles of another on the other. The gases are different but have the same temperature and pressure. On removing the partition the gases diffuse into each other. Since the inter-diffusion of ideal gases is like a Joule expansion of each gas into the full volume at constant temperature, the resulting entropy increase is $$\Delta S_{12} = \Delta S_1+\Delta S_2 =n_1R\log \left|\frac{V_1+V_2}{V_1}\right|+n_2R\log \left|\frac{V_1+V_2}{V_2}\right|.$$ Note that $\Delta S_{12} >0$ always.
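As a quick numerical check of this formula (a minimal sketch; the function name and SI units are my own choices, not part of the question), the mixing entropy can be evaluated directly:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def mixing_entropy(n1, V1, n2, V2):
    """Entropy of mixing for two *different* ideal gases at equal T and p,
    treating the inter-diffusion as two independent Joule expansions."""
    V = V1 + V2
    return n1 * R * math.log(V / V1) + n2 * R * math.log(V / V2)

# Equal amounts in equal volumes: Delta S = 2 n R ln 2 ≈ 11.53 J/K
print(mixing_entropy(1.0, 1.0, 1.0, 1.0))
```

The result is strictly positive for any positive amounts and volumes, as noted above, since each logarithm's argument exceeds 1.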

Now assume that the partition separates two samples of the same gas. Since the two parts are already, by definition, in thermodynamic equilibrium, removing the partition causes no inter-diffusion. Consequently the entropy increase is zero, $\Delta S^* =0$.

What is the reason for this discrepancy, and how can it be resolved in classical thermodynamics?

dgamma
  • 135
TomS
  • 883
  • 5
    This question is not clear. There is no $N!$ in thermodynamics. It appears in statistical mechanics based on the concept of microstates. Moreover, there is no Gibbs paradox in thermodynamics. – GiorgioP-DoomsdayClockIsAt-90 Aug 19 '23 at 10:09
  • Exactly. That’s why I am asking for a resolution in the context of thermodynamics. – TomS Aug 19 '23 at 10:21
  • In thermodynamics you (implicitly) treat all particles as indistinguishable. The $N!$ in statistical mechanics is introduced because otherwise you'd treat the particles as distinguishable. – Tobias Fünke Aug 19 '23 at 10:36
  • 1
    To allow a meaningful answer, you should explain what would be the Gibbs paradox in thermodynamics. I do not know it. – GiorgioP-DoomsdayClockIsAt-90 Aug 19 '23 at 11:45
  • Tobias, no, you cannot treat particles implicitly as indistinguishable in thermodynamics, you have to introduce this argument somehow explicitly. But you can’t do that w/o counting microstates - which is IMPOSSIBLE in thermodynamics b/c microstates do not exist in thermodynamics. I know where the factor is coming from in statistical mechanics, but that’s not my question. – TomS Aug 19 '23 at 12:20
  • @TomS you should use @ ... to notify another user. In any case: yes, you do treat particles (of the same species) in thermodynamics as indistinguishable! You are dealing only with macro variables and not with each particle individually; thermodynamics says, for example, that you have $N$ particles (of a certain type) in a volume $V$ at temperature $T$ - and not that this particle is here, the other there, or whatever. – Tobias Fünke Aug 19 '23 at 12:52
  • 1
    See e.g. the paper "Comment on 'The Gibbs paradox and the distinguishability of identical particles'", which discusses the Gibbs paradox and its relation to thermodynamics, in some sense. But the comment of @GiorgioP-DoomsdayClockIsAt-90 still holds. You should explain what you mean. As it stands, the question does not make much sense... – Tobias Fünke Aug 19 '23 at 12:56
  • Please explain what’s wrong with this question: “My question is whether it is possible to derive a correct formula for the entropy and therefore resolve the Gibbs paradox using thermodynamics only”. Just forget what we know about statistical mechanics. – TomS Aug 19 '23 at 13:02
  • Anyway, I will check the paper. – TomS Aug 19 '23 at 13:04
  • 1
    Can you just give a reference for the Gibbs paradox in thermodynamics? What do you mean with correct formula? Do you understand (one of the usual) formulation(s) of the paradox? Namely that the entropy derived from stat. mech. does not give the same results as in thermodynamics?! – Tobias Fünke Aug 19 '23 at 13:12
  • Again, I don’t refer to statistical mechanics, so this is pointless. Gibbs himself talked about mixing of two different gases and calculated the entropy: “It is noticeable that the value of this expression does not depend upon the kinds of gas which are concerned, if the quantities are such as has been supposed, except that the gases which are mixed must be of different kinds. If we should bring into contact two masses of the same kind of gas, they would also mix, but there would be no increase of entropy.” (1876) – TomS Aug 19 '23 at 13:42
  • Another formulation is that you calculate the entropy for mixing of gases with atoms of mass $m_1$ and $m_2 = m_1 + \Delta m$ and look at the limit $\Delta m = 0$. – TomS Aug 19 '23 at 13:45
  • "The Gibbs paradox" isn't a real problem to be resolved. It's an alleged/pseudo paradox. Similarly to hydrostatic paradox. – Ján Lalinský Aug 19 '23 at 23:10
  • What Gibbs points out is that increase of entropy depends on what is mixing. If the same gas is mixing, meaning we can't separate the two sets again via macroscopic operations, there is no entropy increase. If the gases can be distinguished and separated again via some macroscopic operation, then there is entropy increase. – Ján Lalinský Aug 19 '23 at 23:11
  • 1
    Some people are bothered by the fact that multiplicity (and thus its logarithm = entropy not yet fixed into homogenous function of first order) increase is a discontinuous function of "distinguishability": it is zero if distinguishability is "zero" and positive big non-zero if distinguishability is not zero. Well, distinguishability is a pretty discrete concept, and theory can work with such discontinuous entropy behaviour, so that's why there is no real problem. – Ján Lalinský Aug 19 '23 at 23:16
  • @Qmechanic, thanks for clarification and improvement! – TomS Aug 24 '23 at 10:10

6 Answers

3

The back-and-forth in the comments is due to the fact that you are conflating the Gibbs paradox with the entropy of mixing - two distinct (albeit related) concepts. The former is specifically a quasi-paradox which arises in statistical mechanics, and so asking for its resolution in the context of thermodynamics does not make sense.

The answer to the question in the body of your post is that there simply is no discrepancy to resolve. Mixing two different gases is manifestly different from mixing two samples of the same gas; the former introduces entropy, and the latter does not. This is tied to the fact that the latter is reversible by simply re-introducing the partition, while the former is not.


One then might ask, what do we mean by different? Or rather, how different must two gases be in order for this "entropy of mixing" to take effect? For instance, I might say that I have two boxes of N$_2$ at the same temperature and pressure, so when I remove the partition between them, the entropy of the system stays the same. But you might counter with the fact that nitrogen has several stable isotopes - perhaps one box has slightly more $^{15}$N than the other. In the limiting case, perhaps one box contains pure $^{15}$N and the other pure $^{14}$N. At what point are they sufficiently different to conclude that an entropy increase has occurred?

The answer to that is an often overlooked subtlety in thermodynamics - namely that entropy is, to some extent, subjective. Let's say that Alice and Bob are physicists who study completely identical systems, but their experimental capabilities are different. Alice has a brand new dual-comb spectrometer which can precisely measure the ratio of $^{15}$N to $^{14}$N in her systems, but Bob's lab has not yet invested in that instrument. As a result, when Alice does her calculations, she treats $^{14}$N and $^{15}$N as different atoms, while Bob treats them as the same because he cannot experimentally resolve the difference with the tools available to him.

If Alice receives a box of $^{14}$N and a box of $^{15}$N at the same temperature and pressure, then she would be able to tell the difference between them. If she allowed the boxes to mix, she would say that the total system has increased in entropy as per the standard formula. The results of her experiments would be compatible with the predictions she obtains from her thermodynamic calculations.

If Bob received the same boxes, he would not be able to make the same determination. To him, they are the same gas, and when he allows them to mix, he would say that no entropy increase has occurred. And as long as none of his experiments are sensitive to the difference between $^{14}$N and $^{15}$N, the predictions he makes using thermodynamics would match his observations, too.

This all comes down to the question of how you define a "system" in thermodynamics - and the operational answer is that two systems are different if and only if you have the experimental means to distinguish between the two. The fact that thermodynamics is somewhat agnostic to fundamental questions of e.g. identity is part of why it is so powerful and general a framework.

J. Murray
  • 69,036
  • There is no conflation of concepts because Gibbs himself derived this paradox using his entropy of mixing, see page 166, eq 297, section "Consideration relating to the Increase of Entropy due to the Mixture of Gases by Diffusion," in Gibbs: Collected works, vol I. – hyportnex Aug 19 '23 at 19:06
  • 1
    +1. Yes, it always surprises me (again) that the "anthropomorphic" nature of entropy is not really well explained in many courses/books etc. Entropy is subjective in the sense that it is a function of the macro variables we choose to describe the system with - and these depend on knowledge/experimental control, which you've illustrated here very nicely. Regarding that, let me point to this post. See also Jaynes's article on the Gibbs paradox. – Tobias Fünke Aug 19 '23 at 19:07
  • 1
    @hyportnex My point is that the phrase Gibbs paradox is generally used to describe the fact that if you use statistical mechanics to compute the entropy of an ideal gas (as the log of the number of accessible microstates), then the result is non-extensive. If the OP is not interested in a discussion of statistical mechanics, then that nomenclature is inappropriate. – J. Murray Aug 19 '23 at 19:17
  • @hyportnex And for what it's worth, I am happy to be relaxed or flexible in terminology and what we do or do not refer to as the Gibbs paradox(es). However, there was a back-and-forth argument in the comments that I wanted to address. – J. Murray Aug 19 '23 at 19:21
  • I rewrote the original question exactly for that reason concerning those back-and-forth comments. But note that since I am neither a statistical nor a thermodynamical mechanic, if Gibbs, undoubtedly the first time, writes down a paradox, then it is his paradox as far as I am concerned; who am I to argue? I still cannot forgive Griffiths for giving the credit to Jefimenko for what Ignatowski discovered at least 50 years before and generations of antenna engineers knew since 1920-s. – hyportnex Aug 19 '23 at 19:34
  • "Mixing two different gases is manifestly different from mixing two samples of the same gas; the latter introduces entropy" – you probably wanted to write "the former introduces entropy". – Ján Lalinský Aug 19 '23 at 23:57
  • @JánLalinský Yes, thank you – J. Murray Aug 20 '23 at 14:12
  • That’s very helpful. Thanks for clarification. – TomS Aug 24 '23 at 06:56
1

Replace the impermeable partition with two partitions: one permeable to gas A (but not gas B) and one permeable to gas B (but not gas A). Each partition tends to move through the gas it’s permeable to, to allow expansion of the gas it isn’t permeable to. Collect this pressure–volume work separately. Assuming the partition movement is quasistatic, the process is reversible and the total entropy constant. Note that the gases cool as they adiabatically expand.

Now dissipate the stored energy in the container (e.g., by running a current through a resistor). The entropy has now increased, the gases are mixed, and the temperature has returned to its original value. The state is indistinguishable from the one in which all partitions were removed and the gases allowed to mix irreversibly.

In the case of a single gas, no partition can be produced that’s both permeable and impermeable to the gas, so the above operation isn’t possible. The gas remains at its original entropy. This may clarify the discrepancy.
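The bookkeeping of this two-partition argument can be checked numerically. For simplicity the sketch below runs the expansion isothermally (in contact with a reservoir at $T$, so no separate cooling step is needed) and uses illustrative values of my own choosing; it confirms that dissipating the collected work back into the container at $T$ reproduces exactly the mixing entropy from the question:

```python
import math

R = 8.314          # molar gas constant, J/(mol K)
T = 300.0          # K, arbitrary illustrative temperature
n1 = n2 = 1.0      # mol
V1 = V2 = 1.0      # arbitrary volume units (only ratios enter the logs)
V = V1 + V2

# Reversible isothermal work collected as each gas expands into the full
# volume V through the partition permeable only to the other gas:
W = n1 * R * T * math.log(V / V1) + n2 * R * T * math.log(V / V2)

# Dissipating that stored work back into the container at temperature T:
dS_dissipation = W / T

# Direct mixing-entropy formula from the question:
dS_mix = n1 * R * math.log(V / V1) + n2 * R * math.log(V / V2)

print(abs(dS_dissipation - dS_mix) < 1e-12)  # True: the two routes agree
```

For a single gas no such selective partition exists, $W = 0$, and the dissipation step never happens, which is exactly the $\Delta S^* = 0$ case.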

(An interesting aspect is what it means for two gases to be “different.” If I look only at transparency, for example, then nitrogen and oxygen look identical either separated or mixed. No tailored partition seems possible, and I calculate a constant entropy in all cases, due to my ignorance. Once I learn about atoms, I can in theory design a selective partition and perform the expansion described above. The same argument can be applied once I learn about isotopes of a single particular gas. If someday we learn of a new characteristic distinguishing two types of N-14, we could in theory develop a new type of partition, and we’d need to revise our entropy calculations. This has interesting implications for the objective/subjective nature of entropy. In a thermodynamics context alone, for the above thought experiment, “different” can mean only “separable by a selective partition.”)

1

You should clarify what you mean by "discrepancy" and what exactly you want to "resolve". Note that there are two distinct paradoxes which are both called "Gibbs paradox" in the literature. One concerns a spurious increase in entropy which, in statistical mechanics, is calculated when the partition between two gases of distinguishable particles is removed. The other concerns the discontinuous vanishing of $\Delta S$ as the two gas types go from similar to identical. The first of these two paradoxes is resolved in my paper "Demonstration and resolution of the Gibbs paradox of the first kind", Eur. J. Phys. 35 (2014) 015023 (freely available at arXiv).

HjP
  • 36
  • 2
  • Your entropy decrease (14) in "Demonstration of the GP1" does not contradict the 2nd law; it only shows that your definition of entropy, implying the entropy function $S_0(T,V,N)$ (10), violates the desideratum that splitting a closed system into two closed systems, assuming additivity of entropy, results in no entropy change. As far as I know, the 2nd law does not presuppose or imply such a desideratum; it is a convention. – Ján Lalinský Aug 20 '23 at 01:07
  • Regarding $\Delta S$ in adiabatic processes, 2nd law states only that single closed system undergoing a reversible change keeps constant $S$. Splitting the system into two closed systems is not covered. These are two different systems, can have two different entropies. 2nd law does not fix absolute value of entropy, or that it be additive when splitting a closed system into two closed systems, or combining closed systems into one closed system. – Ján Lalinský Aug 20 '23 at 01:07
  • The function $S_0$ is perfectly fine entropy regarding 2nd law of thermodynamics and its implications; it obeys $dS=dQ/T$ during allowed reversible transformations. It just isn't a homogeneous function of first degree, so it's "ugly". – Ján Lalinský Aug 20 '23 at 01:11
  • @Ján Lalinský I'm not sure whether I correctly understand your objection. Are you objecting to the (statistical mechanical) entropy definition $-k \sum_{m} P(m) \ln P(m)$? If so: my paper is based on this entropy definition, and if you reject it, then my paper doesn't apply. – HjP Aug 20 '23 at 12:32
  • Or are you saying that the second law doesn't apply to the process of partitioning a gas volume, because we have one system before and two systems after the partitioning? If so: you have to consider the whole system, consisting of the two subsystems and the partition, as one single system. Or do you object to the additivity of entropy (that is, do you object that the entropy of the whole system is not the sum of the entropies of its subsystems)? In that case please read Section 3 of my paper, where I point out that additivity indeed doesn't hold for systems of distinguishable particles. – HjP Aug 20 '23 at 12:33
  • 2nd law holds universally for macroscopic systems that can exchange heat/work with outside, so also during such process where partition is inserted. But 2nd law does not require or imply that sum of entropies of the partitioned system has to be the same as entropy of the unpartitioned system. This is an additional requirement, motivation of which is, to the best of my knowledge, that it leads to nice property of the entropy function: $S(kU,kV,kN)=kS(U,V,N)$. This nice property is not required by 2nd law. – Ján Lalinský Aug 20 '23 at 13:24
  • I can put it differently: when partitioning decreases certain entropy function $S_0$, this can't be used to violate 2nd law in any way, in particular, it does not allow us to construct a cyclic machine that in one cycle achieves taking heat from a heat reservoir and converting all of it into work. The decrease of entropy due to creating two closed systems is due to a quirk of the entropy definition, not due to that creation process violating 2nd law. – Ján Lalinský Aug 20 '23 at 13:27