
Suppose one knows nothing about the concept of entropy. How can we argue that the lack of information (ignorance) about the system typically increases as the temperature increases, using the canonical probability $p_i=e^{-\beta E_i}/Z$, where $Z$ is the canonical partition function? Assume that the system has a fixed volume and a fixed number of particles.

Here is the objective. If I can argue that lack of information typically increases with temperature, I can use that to argue that entropy typically increases with temperature by equating lack of information with entropy.

Qmechanic

5 Answers


Look at your Boltzmann distribution.

$$ p_i \propto e^{-\frac{E_i}{k_B T}} $$

At infinite temperature, $\beta \to 0$, so every Boltzmann factor tends to 1: all states have equal probability, and you have "minimal information" about which state the system is in.
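A quick numerical sketch of this limit (a made-up four-level spectrum, in units with $k_B = 1$): as $T$ grows, the probabilities flatten toward the uniform distribution and the Shannon entropy $-\sum_i p_i \log p_i$ climbs toward its maximum $\log 4$.

```python
# Canonical probabilities for a hypothetical four-level spectrum (k_B = 1),
# printed at increasing temperatures together with the Shannon entropy.
import numpy as np

energies = np.array([0.0, 1.0, 2.0, 3.0])    # made-up energy levels

for T in [0.1, 1.0, 10.0, 1000.0]:
    beta = 1.0 / T
    weights = np.exp(-beta * energies)
    p = weights / weights.sum()              # p_i = e^{-beta E_i} / Z
    H = -np.sum(p * np.log(p))               # missing information about the state
    print(f"T = {T:6.1f}   p = {np.round(p, 3)}   H = {H:.3f}")
```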


At absolute zero, assuming the ground state is a crystal, there is no information encoded in the state, so there is no lack of information about the state.

When we heat up the system, the amount of information encoded in the state (in terms of the positions and motion of all the atoms in the material) increases. But we are learning barely any of this information. Therefore, our lack of information increases, not because we're forgetting anything but because there's more information in the system to lack.

Where is this extra information coming from? From the way we heat up the system. Suppose we shine microwaves on it. We don't know which atoms the microwaves are interacting with, so we don't know the resulting motion of the atoms.

So entropy increases as temperature increases.

Peter Shor

Intuitively: at zero temperature there is no thermal energy so only one state is accessible - the ground state (the bottom of the energy landscape). So you don't lack any information on what configuration the system could be in - you know for sure that it's in the ground state. As you add in thermal energy, more and more states become energetically accessible, so there are more possible configurations the system could be in. So the amount of information in the system that you don't know (just from knowing the temperature) increases. Instead of hanging around the bottom of the energy landscape, the system could also be found higher up, because of its internal thermal energy.
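To make this concrete, take a hypothetical two-level system with energy gap $\Delta$:

$$ p_0 = \frac{1}{1+e^{-\beta\Delta}}, \qquad p_1 = \frac{e^{-\beta\Delta}}{1+e^{-\beta\Delta}}. $$

As $T \to 0$, $p_0 \to 1$: only the ground state is accessible, and nothing about the configuration is unknown. As $T \to \infty$, both probabilities tend to $1/2$, and the missing information reaches its maximum of $\log 2$.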

tparker

Let's imagine a system $\mathcal{S}$ coupled to a large heat bath $\mathcal{E}$, the environment.

As $\mathcal{S}$ interacts with $\mathcal{E}$, correlations between these subsystems build up. Information about the system's state leaks into $\mathcal{E}$ in the form of these correlations. To recover this information we would have to measure large chunks of the environment.

Since we are unable to do this, our uncertainty about the system's state increases. One way to quantify this would be to take a thermal state for the environment and a pure state (perfect information) for the system. The unitary time evolution of the full density matrix then leads to the emergence of a mixed state for the system once you trace (average) out the environment.

The rate at which this happens will depend on the bath correlation time, which in turn depends on the temperature.
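A minimal numerical sketch of this mechanism (a toy model only: a single environment qubit, with a CNOT standing in for a generic system-environment interaction): the system qubit starts pure, and tracing out the environment after the interaction leaves it in a mixed state with nonzero von Neumann entropy.

```python
# Toy decoherence model: a pure system qubit gets correlated with an
# environment qubit; tracing out the environment leaves a mixed state.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # system: pure |+> state
zero = np.array([1.0, 0.0])                # environment: |0>
psi = np.kron(plus, zero)                  # joint pure state

# CNOT (system = control) copies which-state information into the environment
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi = cnot @ psi

rho = np.outer(psi, psi).reshape(2, 2, 2, 2)
rho_sys = np.einsum('ikjk->ij', rho)       # partial trace over the environment

evals = np.linalg.eigvalsh(rho_sys)
evals = evals[evals > 1e-12]
print("reduced density matrix:\n", rho_sys)
print("von Neumann entropy:", -np.sum(evals * np.log(evals)))   # log 2
```

A realistic bath would replace the single qubit with many degrees of freedom in a thermal state, but the qualitative effect, purity lost into system-environment correlations, is the same.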


My second answer, which probably isn't what you're looking for, but just in case it is:

I just realized that you might be looking for a mathematically rigorous proof that the entropy of the thermal distribution: $$ H_\beta = - \sum_i p_i \log p_i \ \ \ \ \mathrm{where} \ \ \ \ p_i = e^{-\beta E_i}/Z $$ is increasing in temperature $T = \beta^{-1}$. Here's a sketch of one.

Lemma 1
For a given average energy $E_\mathrm{ave} = \sum_i p_i E_i$, the thermal distribution with average energy $E_\mathrm{ave}$ maximizes the entropy over all probability distributions with that average energy.

Proof sketch:
Use Lagrange multipliers.
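Spelled out (one standard way the computation goes), maximize the entropy subject to normalization and fixed average energy, with multipliers $\lambda$ and $\beta$:

$$ \frac{\partial}{\partial p_i}\Big[ -\sum_j p_j \log p_j - \lambda \sum_j p_j - \beta \sum_j p_j E_j \Big] = -\log p_i - 1 - \lambda - \beta E_i = 0, $$

so $p_i = e^{-1-\lambda}\, e^{-\beta E_i} \propto e^{-\beta E_i}$. The stationary point is exactly the thermal distribution, and concavity of the entropy makes it the maximum.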

Lemma 2
For the thermal distribution at any average energy $E_\mathrm{ave}$ with $0 < \beta < \infty$, you can find a probability distribution with slightly higher average energy $E_\mathrm{ave}+\epsilon$ and greater entropy.

Proof sketch:
Find two energies $E_j < E_k$ (so that $p_j > p_k$ in the thermal distribution) and move a small amount of probability mass from $p_j$ to $p_k$; this raises both the average energy and the entropy, as the calculation below shows.
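A short calculation confirming this (in the notation above): shift probability mass $\epsilon$ from level $j$ to level $k$ and differentiate at $\epsilon = 0$:

$$ \frac{dE_\mathrm{ave}}{d\epsilon} = E_k - E_j > 0, \qquad \frac{dH}{d\epsilon}\bigg|_{\epsilon=0} = \log\frac{p_j}{p_k} > 0, $$

where the second inequality holds because $E_j < E_k$ forces $p_j > p_k$ in the thermal distribution. A small enough shift therefore raises both quantities.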

Lemma 3
When we increase the temperature $\beta^{-1}$, we increase the average energy.

Proof sketch:
Show that for any two positive temperatures $\beta^{-1} < {\tilde{\beta}}^{-1}$, there is an energy $E_m$ such that in the thermal distributions, if $E_i < E_m$, then $p_i > \tilde{p}_i$ and if $E_i>E_m$, then $p_i < \tilde{p}_i$.

This can be shown by straightforward calculation.
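In a bit more detail: for $\tilde{\beta} < \beta$, the ratio of the two thermal distributions,

$$ \frac{\tilde{p}_i}{p_i} = \frac{Z}{\tilde{Z}}\, e^{(\beta - \tilde{\beta}) E_i}, $$

is strictly increasing in $E_i$, and since both distributions are normalized it must cross $1$ at some energy $E_m$. Every term of $\sum_i (\tilde{p}_i - p_i)(E_i - E_m)$ is then nonnegative, and because $\sum_i (\tilde{p}_i - p_i) = 0$ this sum equals $\tilde{E}_\mathrm{ave} - E_\mathrm{ave} \geq 0$.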

Now we can prove the theorem. Start with the thermal distribution at average energy $E_1$. By Lemma 2, we can find a distribution with slightly higher average energy $E_2 = E_1+\epsilon$ and greater entropy. But by Lemma 1, the thermal distribution at average energy $E_2$ has entropy at least as high as this distribution's. Thus, we have increased both the average energy and the entropy of the thermal distribution. And since by Lemma 3 the average energy is increasing in temperature, the thermal distribution at average energy $E_2$ also has a higher temperature.
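As a numerical sanity check of the theorem (a sketch only, using an arbitrary random spectrum and units with $k_B = 1$), one can verify that $H_\beta$ is non-decreasing along a temperature grid:

```python
# Check numerically that the entropy of the thermal distribution
# p_i = exp(-E_i/T)/Z is non-decreasing in T for a fixed spectrum (k_B = 1).
import numpy as np

rng = np.random.default_rng(0)
energies = np.sort(rng.uniform(0.0, 5.0, size=12))   # hypothetical spectrum

def thermal_entropy(T, energies):
    w = np.exp(-(energies - energies.min()) / T)     # shift for stability
    p = w / w.sum()
    return -np.sum(p * np.log(p))                    # H = -sum p_i log p_i

temps = np.linspace(0.05, 20.0, 400)
H = np.array([thermal_entropy(T, energies) for T in temps])
print("entropy non-decreasing in T:", bool(np.all(np.diff(H) >= -1e-12)))
```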

Peter Shor