
This is one of the questions from the series "I learnt Stat-Mech but have no idea what I am doing": my super big post is here!

In the microcanonical ensemble, many books prefer to consider an isolated system with some fixed energy and then divide it into two subsystems (system 1 and system 2) separated by a diathermal wall, i.e. a wall through which they can exchange energy as heat.

  1. If the energy of the isolated system is $E$, its volume is $V$ and its number of particles is $N$, then the entropy $S$ is defined as $S(E,V,N) = k_{B}\ln{\Omega(E,V,N)}$, where $k_{B}$ is Boltzmann's constant and ${\Omega}$ is the number of microstates compatible with those macroscopic parameters. For which of the three systems is entropy defined this way?

I am pretty sure that entropy is defined in this manner for the two subsystems separated by the diathermal wall, but I am very unsure about the isolated system. On physical grounds I would assume that the definition applies to the isolated system as well, on the principle that no system is privileged over another. On the other hand, if I assume this is true it seems to lead me to a contradiction: for example, how can the 2nd Law hold then?
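
For concreteness, here is the counting I have in mind. If the two subsystems interact weakly, a microstate of the whole system is just a pair of subsystem microstates, so the total count is a sum over the ways the energy can split:

$$\Omega(E) = \sum_{E_1} \Omega_1(E_1)\,\Omega_2(E - E_1).$$

A minimal numerical sanity check of this identity, using a toy model of non-interacting two-level spins (the model and the numbers are my own illustration, not from any particular book):

```python
from math import comb

# Toy model: subsystem i consists of N_i two-level spins, each contributing
# energy 0 or 1, so Omega_i(E_i) = comb(N_i, E_i).
N1, N2, E = 4, 6, 5

# Composite count: sum over all ways to split the total energy E.
omega_total = sum(comb(N1, E1) * comb(N2, E - E1) for E1 in range(E + 1))

# Direct count for the isolated composite: choose which E of the
# N1 + N2 spins are excited.
assert omega_total == comb(N1 + N2, E)  # both give 252
print(omega_total)
```

So formally $S = k_{B}\ln\Omega$ seems perfectly well defined for the isolated composite as well, which is part of what confuses me.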

  2. Are temperature, pressure, chemical potential, etc., which are defined as derivatives of the entropy with respect to energy, volume, particle number and so on, also defined out of equilibrium?
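
To be explicit, by these I mean the standard equilibrium relations

$$\frac{1}{T}=\left(\frac{\partial S}{\partial E}\right)_{V,N},\qquad \frac{p}{T}=\left(\frac{\partial S}{\partial V}\right)_{E,N},\qquad \frac{\mu}{T}=-\left(\frac{\partial S}{\partial N}\right)_{E,V}.$$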

From my high school studies I am pretty sure (which means I don't know) that pressure and temperature should be defined even while the subsystems interact and are not yet in equilibrium. But if that is true, both the average energy and the entropy should be defined out of equilibrium. I have no problems with the average energy, just as it has no problems with me. But what about the entropy? I understand why, in equilibrium, we use the formula connecting entropy to the number of microstates (assuming one macrostate dominates). But what about out of equilibrium? I don't even know what energy the subsystems have at each instant of time.

  3. In the canonical ensemble the energy distribution of a subsystem is not a delta distribution, so we talk about average energies. But then how is entropy defined?

Do I take into account only the microstates whose energy equals the average energy? But then different energy distributions could have the same entropy, which is non-intuitive to me. On the other hand, if entropy is some other function, what is it, and why is it the way it is? To me it would make sense to define the entropy of the system as something like an average of the entropies of the individual energy macrostates, because then I can calculate it and the energy distribution matters. But then how is it connected to the average energy? Because I surely have to take a derivative with respect to it.
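
The closest formula I can think of (I am not sure it is the intended definition) is the Gibbs entropy, which averages $-k_{B}\ln p_i$ over microstates rather than over energy macrostates:

$$S = -k_{B}\sum_i p_i \ln p_i,$$

which reduces to $k_{B}\ln\Omega$ when all $\Omega$ accessible microstates are equally probable, $p_i = 1/\Omega$. Here is a minimal numerical check (my own toy example, with $k_{B}=1$) that this agrees with the thermodynamic $S = (U - F)/T$, $F = -T\ln Z$, in the canonical ensemble:

```python
import math

# Toy example with k_B = 1: a three-level system with energies 0, 1, 2
# in contact with a heat bath at temperature T.
T = 0.7
energies = [0.0, 1.0, 2.0]

Z = sum(math.exp(-E / T) for E in energies)        # partition function
p = [math.exp(-E / T) / Z for E in energies]       # Boltzmann probabilities
U = sum(pi * E for pi, E in zip(p, energies))      # average energy

S_gibbs = -sum(pi * math.log(pi) for pi in p)      # -sum p ln p
S_thermo = U / T + math.log(Z)                     # (U - F)/T with F = -T ln Z

assert math.isclose(S_gibbs, S_thermo)
print(S_gibbs, S_thermo)
```

If that is the right definition, the connection to the average energy comes in through $T$ and $Z$, but I would still like to understand why it is the right generalization.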

  • Related, but not a clear answer to your first question about isolated systems; please just treat it as background that might be helpful: http://physics.stackexchange.com/questions/119387/why-can-the-entropy-of-an-isolated-system-increase –  Jan 17 '17 at 23:21

0 Answers