2

The Boltzmann entropy is defined as the logarithm of the phase space volume $\Omega(E)$. Is there a reference, book, or paper which shows where this definition comes from and how it is related to the phase space volume?

Srishti M
  • 291
  • related questions by OP: http://physics.stackexchange.com/questions/94730/boltzmann-entropy-and-phase-space-volume http://physics.stackexchange.com/questions/94567/statistical-entropy-and-information-theory – N. Virgo Jan 22 '14 at 05:34

2 Answers

3

The phase space volume itself is clearly not equal to its own logarithm, $\Omega\neq\ln\Omega$. The entropy is the logarithm and the logarithm is relevant here because it is additive, $$\ln(\Omega_1 \Omega_2) = \ln \Omega_1 +\ln \Omega_2,$$ for an argument that is the product of two factors, and the volume of phase space of 2 independent subsystems is indeed a product $\Omega_1\Omega_2$ (think about areas of rectangles or other Cartesian products of sets.)
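
A trivial numerical check of this additivity (the two volumes below are arbitrary made-up numbers, in whatever units you like):

```python
import numpy as np

# Hypothetical phase-space volumes of two independent subsystems
# (arbitrary illustrative numbers, not from any real system).
omega_1 = 1.7e12
omega_2 = 4.2e9

# The combined phase space is a Cartesian product, so the volumes multiply,
# and the logarithm turns that product into a sum:
total = np.log(omega_1 * omega_2)
parts = np.log(omega_1) + np.log(omega_2)
assert np.isclose(total, parts)
print(total, parts)  # both ~50.3
```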

A presentation for "true beginners" or lay audiences could have omitted the logarithm because the author may have considered logarithms indigestible for his or her audience.

In the thermodynamic limit (large number of particles, $N\to\infty$), the log of the volume may also be approximated by the logarithm of the surface of the region in the phase space, and in many other ways.
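
A quick numerical sketch of this thermodynamic-limit claim, using the volume of an $n$-dimensional ball as a stand-in for the phase-space region $H \le E$ (the radius, shell thickness, and dimensions below are my arbitrary choices):

```python
from math import exp, lgamma, log, log1p, pi

def log_ball_volume(n, R=1.0):
    # log of the volume of an n-dimensional ball of radius R
    return 0.5 * n * log(pi) - lgamma(0.5 * n + 1) + n * log(R)

def log_shell_volume(n, R=1.0, eps=1e-6):
    # thin shell between radii R*(1 - eps) and R:
    # V_shell = V_ball(R) * (1 - (1 - eps)**n), computed stably in logs
    return log_ball_volume(n, R) + log1p(-exp(n * log1p(-eps)))

for n in (3, 100, 10**6):
    lv, ls = log_ball_volume(n), log_shell_volume(n)
    print(n, lv, ls)  # the gap shrinks as n grows, while |log V| grows like n log n
```

For large $n$, essentially all of the volume sits in a thin shell near the surface, so the two logarithms agree up to corrections negligible compared to the entropy itself.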

In quantum mechanics, the phase space is effectively divided into "elementary cells" whose volume is $(2\pi\hbar)^N$ for some $N$. The quantum entropy is the logarithm of the number of these phase space cells. This is equivalent to choosing the right "unit" of volume and to eliminating the ambiguous additive shift in the definition of the entropy.
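
As a sketch of this counting (the phase-space volume and $N$ below are invented for illustration; in a real problem both come from the system at hand):

```python
import numpy as np

hbar = 1.054571817e-34  # reduced Planck constant, J*s

N = 3                    # hypothetical number of (q, p) pairs
phase_volume = 1e-90     # hypothetical phase-space volume, in (J*s)**N

cells = phase_volume / (2 * np.pi * hbar) ** N  # number of elementary cells
S = np.log(cells)                               # dimensionless quantum entropy
print(cells, S)  # ~3.4e9 cells, S ~ 22
```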

There exist generalizations of the notion of "entropy" for which we don't expect additivity ($S_{AB}=S_A+S_B$) but a different relationship. Then the logarithm rule has to be revised, too.
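
One well-known example is the Tsallis entropy, $S_q = \frac{1-\sum_i p_i^q}{q-1}$ (in units with $k_B = 1$), for which independent subsystems compose according to the pseudo-additive rule $$ S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B), $$ which reduces to the ordinary additive rule, and hence to the ordinary logarithm, in the limit $q \to 1$.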

Any book on statistical physics (and/or modern enough book on thermodynamics) discusses these issues.

Luboš Motl
  • 179,018
  • Thank you for your insights. I do have a question, which is: how does omega relate to the phase space volume, and can I use this same expression to define Kolmogorov entropy = ln(omega)? What do we mean by microcanonical ensemble and independent subsystems? – Srishti M Jan 21 '14 at 18:27
  • Entropy = log of phase space volume: was this formula given by Boltzmann, and is that the origin of the name Boltzmann entropy? Or is it called the statistical definition of entropy? – Srishti M Jan 21 '14 at 18:41
  • Dear Srishti, $\Omega$ was my (and standard) symbol for the phase space volume. Kolmogorov entropy is a supremum over partitions, some advanced stuff, see https://en.wikipedia.org/wiki/Kolmogorov_entropy#Measure-theoretic_entropy - but the essence is always the same and you must surely first understand why the log appears here. – Luboš Motl Jan 22 '14 at 08:03
  • Microcanonical ensemble is the state of a macroscopic system in which all configurations with the total energy between $E-e_0$ and $E+e_0$ are possible and equally likely and all configurations with the energy outside the interval are impossible or prohibited. Independent subsystems A,B are two objects (described by their coordinates, momenta, or other information - degrees of freedom) that don't interact and whose probabilities of everything may be calculated just as the product $P(A=A_0)P(B=B_0)$ etc. – Luboš Motl Jan 22 '14 at 08:05
  • Yes, Boltzmann invented the way to write the entropy using statistical physics, so Boltzmann and statistical entropy are the same thing. Everything else called the entropy that counts things in the phase space or information or anything like that is just a minor generalization of Boltzmann's precious ideas. – Luboš Motl Jan 22 '14 at 08:06
  • Motl: Thank you for your response. The thing is, I need to cite a reference which says that the total number of arrangements of the particles, which is $\Omega$, is the phase space volume. Since I am not from a physics background, I do not know where exactly it is stated that $\Omega$ is the phase space volume – Srishti M Jan 22 '14 at 17:25
  • The phase space is defined as the space of all possible arrangements or configurations of a physical system (all possible values of positions and momenta of particles, if this is the information used to describe a physical system, for example), and the measure (volume) is the right way to "count" continuous sets, so the volume in the phase space and the counting of possibilities are clearly the same thing. All these things were known to Boltzmann. In quantum mechanics, the number effectively becomes finite/discrete because the phase space is divided into cells that can't be further divided. – Luboš Motl Jan 23 '14 at 06:32
1

I am not very sure about this, but here is an attempt.

Well, there is a connection you can try to make with the Shannon entropy! By the postulate of equal a priori probabilities for microstates, you can see that the probability of each state is
$$ p = \frac{1}{\Omega} $$

In information theory, given a set of events $ \{X_1,...,X_n\} $ with probabilities $ \{P_1,...,P_n\} $, the Shannon information for the $i^{th}$ event is defined by $$ I_i = -\log_2 P_i $$

From the definition you can see that the smaller the probability of an event, the greater its information content. This is the motivation for the definition.
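
A tiny numerical illustration of this inverse relationship (the probabilities are arbitrary):

```python
import numpy as np

# Surprisal I = -log2(p): the rarer the event, the more bits it carries.
for p in (0.5, 0.1, 0.01):
    print(p, -np.log2(p))  # 1.0, ~3.32, ~6.64 bits
```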

Now the Shannon entropy is defined as the average information over the given set of events.

For a probability distribution, the average of a quantity is defined by $$ \langle Q \rangle = \sum\limits_{i=1}^n P_i Q_i $$

So, the average information (or entropy) is given by

$$ S_{\text{Shannon}} = \langle I \rangle = -\sum\limits_{i=1}^n P_i \log_2 P_i $$
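
In code, a minimal sketch of this average (the two example distributions are my own):

```python
import numpy as np

def shannon_entropy(p):
    # average surprisal, -sum_i p_i log2 p_i, in bits
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is taken as 0 by the usual convention
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))  # 1 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is less surprising
```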

As an exercise, you can also verify that this average information is maximised when $$ P_i = \frac{1}{n} \quad \forall \; i = 1,\dots,n $$ which is the a priori principle saying that entropy is maximised. (Begin by setting $ \delta S = 0 $ subject to the constraint $ \sum_i P_i = 1 $.)
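
A brute-force spot check of this exercise (random Dirichlet samples stand in for "all" distributions, so this illustrates rather than proves the claim; $n$ and the sample count are arbitrary):

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
n = 4
best = np.log2(n)  # entropy of the uniform distribution, log2(n) = 2 bits
for _ in range(10_000):
    p = rng.dirichlet(np.ones(n))   # a random probability vector over n events
    assert H(p) <= best + 1e-9      # no sample ever beats the uniform case
print(best)
```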

The base-2 logarithm is a convenient choice in information theory.

However, we can translate this to the statistical-mechanics setting by switching to the natural logarithm and using the first equation, where $ \Omega $ is the total number of microstates (the volume of phase space divided by $ h $, the unit volume element of phase space):

$$ S_{\text{Boltzmann}} \propto \ln \Omega \implies S = k_B \ln\Omega $$

where $ k_B $ is the Boltzmann constant, the proportionality factor that fixes the units (a change of logarithm base, such as from base 2 to the natural log, only rescales this constant).
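
For a feel for the numbers (the microstate count below is made up; only $ k_B $ is real):

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

omega = 1e25                 # hypothetical number of microstates
S = k_B * np.log(omega)      # thermodynamic entropy
print(S)                     # ~7.9e-22 J/K
```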

[EDIT 1]

A detailed explanation is quite involved, but to give you a first look: a microstate is a particular set of values $(p,q)$ of the momentum and position in the phase space. One set $(p,q)$ describes one physical state of the system. Now, since the phase space is continuous, we can't count every point (that would give an infinite number of them). From quantum mechanics, we have

$$ \Delta x\,\Delta p \ge \hbar $$ From this we deduce that the smallest area element in phase space (i.e. the product $ \Delta x \, \Delta p $) is of order $h$ (Planck's constant). Having discretised the space, we can count the states by counting the number of smallest boxes (i.e. cells of area $h$) within the region allowed by the energy condition.

In proper mathematical language (for a 2D phase space):

$$ \Omega = \int_{H\le E} \frac{dp\, dq}{h} $$ where $E$ is the energy of the system and $H$ is the Hamiltonian, and the integration runs over the region of phase space enclosed by the constant-energy curve $H = E$.
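
A concrete sanity check of this cell-counting integral for a 1D harmonic oscillator, in natural units of my choosing ($m = \omega = \hbar = 1$, so $h = 2\pi$): there the region $H \le E$ is a disk of area $2\pi E$, and the exact answer is $\Omega = E/\hbar\omega = E$.

```python
import numpy as np

E = 5.0
h = 2 * np.pi  # with hbar = 1

# Monte Carlo estimate of the area of {p**2/2 + q**2/2 <= E}
# inside the bounding box [-L, L]^2, with L = sqrt(2E):
rng = np.random.default_rng(1)
L = np.sqrt(2 * E)
q, p = rng.uniform(-L, L, size=(2, 1_000_000))
area = (2 * L) ** 2 * np.mean(p**2 / 2 + q**2 / 2 <= E)

print(area / h)  # ~5.0, matching the exact Omega = E
```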

In simpler language: this connects to the information theory you are familiar with via $ n \rightarrow \Omega $, which also gives you the a priori principle of maximum entropy.

user35952
  • 2,965
  • Thank you for your insights. I am not too familiar with the concept of a microstate (I come from a communication background), so could you kindly explain "Ω is the total number of microstates (volume of phase space divided by h)", and whether an increase in entropy causes a decrease in volume? Also, is Boltzmann entropy related to Kolmogorov entropy? – Srishti M Jan 21 '14 at 18:21
  • @SrishtiM : Regarding your first question, have a look at the answer again; I have edited it. Secondly, about Kolmogorov entropy (with the definition I looked up on Wolfram), it seems it is similar to the Boltzmann entropy, just in a generalised context (meaning the phase space is not momentum-position space). Again, the space is divided into hypercubes so that the counting becomes finite. I am sorry I am not able to give a mathematical connection, though. – user35952 Jan 22 '14 at 05:36