
Why is there an absolute entropy? For a non-discrete probability distribution, there is no absolute (differential) entropy: the entropy depends on the arbitrarily chosen parametrization of the distribution (e.g., Beta vs. Beta-prime). Put another way, instead of entropies we only have Kullback–Leibler (KL) divergences, a.k.a. relative entropies. So why isn't there a physics analogue of KL divergence? Just as we have relative velocity, which is well-defined even when absolute velocity is not, why don't we also have relative entropies with analogous properties? Instead of saying the absolute entropy of the universe increases, why don't we say that the relative entropy, given our prior belief about the universe, increases?

Alternatively, what is the relation between entropy and number of microstates when the physical system is continuous, and how do we "count microstates"?
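The parametrization-dependence claimed in the question is easy to check numerically. Here is a small sketch (my own example, not part of the original thread) using the question's Beta vs. Beta-prime pair: the differential entropy shifts under the change of variables $y = x/(1-x)$, while the KL divergence between two distributions is unchanged by it.

```python
# Numerical sketch (my own choice of parameters): differential entropy is
# not invariant under reparametrization, but KL divergence is.
import numpy as np
from scipy import stats
from scipy.integrate import quad

# X ~ Beta(a, b) on (0, 1); Y = X / (1 - X) is Beta-prime(a, b) on (0, inf).
p, q = stats.beta(2, 5), stats.beta(3, 3)
p2, q2 = stats.betaprime(2, 5), stats.betaprime(3, 3)

def kl(d1, d2, lo, hi):
    """KL divergence D(d1 || d2) = integral of p log(p/q), by quadrature."""
    return quad(lambda x: d1.pdf(x) * np.log(d1.pdf(x) / d2.pdf(x)), lo, hi)[0]

h_beta, h_betaprime = p.entropy(), p2.entropy()  # differential entropies
kl_beta = kl(p, q, 0, 1)
kl_betaprime = kl(p2, q2, 0, np.inf)

print("entropies:", h_beta, h_betaprime)         # differ by E[log|dy/dx|]
print("KL divergences:", kl_beta, kl_betaprime)  # agree: KL is invariant
```

The entropies disagree by exactly the expected log-Jacobian of the transformation, while the two KL numbers coincide to quadrature accuracy.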

Neil G
  • Entropy ultimately relates to the number of available microscopic states that share the same macroscopic description; each such state exists regardless of any arbitrary reparametrization of the functionals, and should be measurable by any observer. – lurscher May 15 '11 at 21:39
  • @lurscher, that's fine to measure for a discrete system, but how many configurations are there for a single particle in a volume? If we subdivide the volume into n pieces and then take the limit as n goes to infinity, doesn't the way we subdivide matter? (It would matter with continuous probability distributions.) – Neil G May 15 '11 at 21:51
  • As a friendly suggestion, I think your question should be posed at a more fundamental level, for example: what is the relation between entropy and the number of microstates when the physical system is continuous? That would make its context clearer to other readers. – lurscher May 15 '11 at 22:38
  • @Neil G: have you seen/studied the entropy of the ideal gas in a volume $V$? – Gerben May 15 '11 at 22:50
  • @lurscher: good idea. – Neil G May 15 '11 at 23:02
  • @Gerben: No, I haven't. Is there an online reference that I can look at? – Neil G May 15 '11 at 23:04
  • @Neil G: try the Sackur-Tetrode equation or here or most statistical mechanics books. – Ramashalanka May 15 '11 at 23:52
  • Entropy is simply defined that way; other definitions lead to other relationships. It is a matter of being natural and practical, I would say. – Vladimir Kalitvianski May 16 '11 at 12:52
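For the Sackur–Tetrode equation pointed to in the comments, here is a minimal numerical sketch (the choice of gas and conditions is mine, not from the thread): the entropy per atom of a monatomic ideal gas is $S/(Nk) = \ln(v/\lambda^3) + 5/2$, with $v = kT/p$ the volume per atom and $\lambda$ the thermal de Broglie wavelength. Planck's constant $h$ fixes the size of a phase-space cell, which is what makes the microstate count, and hence the entropy, absolute.

```python
# Sackur-Tetrode entropy for argon at 298.15 K and 1 bar (my own example).
import math

h = 6.62607015e-34            # Planck constant, J s
k = 1.380649e-23              # Boltzmann constant, J/K
NA = 6.02214076e23            # Avogadro constant, 1/mol
m = 39.948 * 1.66053907e-27   # mass of one argon atom, kg

T, P = 298.15, 1.0e5          # temperature (K) and pressure (Pa)

lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal wavelength, m
v = k * T / P                                  # volume per atom, m^3

S_per_atom = k * (math.log(v / lam**3) + 2.5)  # entropy per atom, J/K
S_molar = NA * S_per_atom                      # molar entropy, J/(mol K)

print(f"molar entropy of Ar: {S_molar:.1f} J/(mol K)")
```

The result comes out close to the measured standard molar entropy of argon, about 154.8 J/(mol K), which is why this formula is the standard textbook illustration of counting microstates in a continuous system.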

1 Answer


The measure isn't arbitrary. In classical mechanics, the symplectic structure of phase space defines the Liouville measure. In quantum mechanics, the Hilbert space norm plays this role.
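A quick numerical illustration of this point (my own, not part of the original answer): for a harmonic oscillator, the canonical transformation to action-angle variables, $q = \sqrt{2I/(m\omega)}\sin\theta$, $p = \sqrt{2Im\omega}\cos\theta$, has Jacobian determinant 1 everywhere, so the Liouville measure $dq\,dp$ equals $d\theta\,dI$ and phase-space volume does not depend on which canonical coordinates you use.

```python
# Check numerically that a canonical transformation preserves the
# Liouville measure: det of the Jacobian d(q,p)/d(theta,I) should be 1.
import numpy as np

m, w = 1.3, 0.7  # arbitrary mass and angular frequency

def transform(theta, I):
    q = np.sqrt(2 * I / (m * w)) * np.sin(theta)
    p = np.sqrt(2 * I * m * w) * np.cos(theta)
    return np.array([q, p])

def jacobian_det(theta, I, eps=1e-6):
    # central finite differences for the 2x2 Jacobian
    d_theta = (transform(theta + eps, I) - transform(theta - eps, I)) / (2 * eps)
    d_I = (transform(theta, I + eps) - transform(theta, I - eps)) / (2 * eps)
    return d_theta[0] * d_I[1] - d_I[0] * d_theta[1]

rng = np.random.default_rng(0)
thetas = rng.uniform(0, 2 * np.pi, 5)
actions = rng.uniform(0.5, 3.0, 5)
dets = [jacobian_det(t, i) for t, i in zip(thetas, actions)]
print(dets)  # all close to 1.0
```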

Sap
  • This is correct, but I think it should be expanded: the entropy becomes a definite number because quantum states at finite volume/energy are discrete, so they can be counted. In infinite volume the entropy becomes extensive, but without the classical fine-graining ambiguity. – Ron Maimon Aug 14 '11 at 19:43
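The counting this comment describes can be sketched directly (the numbers are my own illustration): for a particle in a 3-D box, the discrete states below energy $E$ are triples of positive integers with $n_x^2+n_y^2+n_z^2 \le R^2$, where $R^2 = 8mL^2E/h^2$, and their count approaches the classical phase-space volume divided by $h^3$ in the semiclassical limit.

```python
# Count discrete particle-in-a-box states below an energy cutoff and
# compare with the classical phase-space volume / h^3 (one octant of a
# sphere of radius R in quantum-number space).
import math

R = 40.0  # dimensionless cutoff radius; larger R = more semiclassical

count = sum(
    1
    for nx in range(1, int(R) + 1)
    for ny in range(1, int(R) + 1)
    for nz in range(1, int(R) + 1)
    if nx**2 + ny**2 + nz**2 <= R**2
)

classical = (math.pi / 6) * R**3  # (1/8)(4/3) pi R^3

print(count, classical, count / classical)
```

The ratio tends to 1 as R grows, showing how the quantum spectrum supplies the unambiguous "cell size" that classical fine-graining lacks.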