First, the system is copied $\mathscr{N}$ times; let $n_j$ be the number of copies in microstate $j$. Then:
$$
\begin{align}
\sum_{j}^{state} n_j =& \mathscr{N} \tag{1} \\
\sum_{j}^{state} n_j E_j = & \mathscr{E} \tag{2}
\end{align}
$$
Maximizing the entropy subject to these constraints gives:
$$
n_j = e^{-\alpha - \beta E_j} \tag{3}
$$
where $\alpha$ and $\beta$ are constants from the Lagrange multiplier method.
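As a sanity check on this Lagrange-multiplier result, one can maximize the (Stirling-approximated) entropy numerically under constraints (1) and (2) and verify that the optimal $n_j$ depend exponentially on $E_j$, i.e. $\ln n_j$ is linear in $E_j$. A minimal sketch; the energy levels and totals are made-up illustrative values:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example: three energy levels and fixed totals (illustrative values)
E = np.array([0.0, 1.0, 2.0])
N_total = 100.0      # total number of system copies, constraint (1)
E_total = 80.0       # total energy, constraint (2)

def neg_entropy(n):
    # Negative of S = -sum_j n_j (ln n_j - 1), the Stirling approximation
    # of ln(N!/prod_j n_j!) up to a constant (since sum_j n_j is fixed).
    return np.sum(n * (np.log(n) - 1.0))

cons = [
    {"type": "eq", "fun": lambda n: np.sum(n) - N_total},       # eq. (1)
    {"type": "eq", "fun": lambda n: np.sum(n * E) - E_total},   # eq. (2)
]

res = minimize(neg_entropy, x0=np.full(3, N_total / 3),
               bounds=[(1e-9, None)] * 3, constraints=cons)
n = res.x

# At the maximum, ln n_j = -alpha - beta*E_j, so with equally spaced levels
# the successive ratios n_{j+1}/n_j should all equal e^{-beta}.
ratios = n[1:] / n[:-1]
print(n, ratios)
```

The numerical optimum reproduces the Boltzmann-like form (3): both ratios agree, which is exactly the statement that $n_j = e^{-\alpha-\beta E_j}$.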
For an arbitrary microstate $i$ with the same energy $E_j$, it is then obvious that $n_i = n_j$; that is, the two microstates are equally probable.
So I have not used the equal a priori probability hypothesis up to this point, yet it has apparently been proved that microstates with the same energy are equally probable, regardless of whether they belong to the same phase trajectory.
- Sorry, I do not understand your question. Is (3) supposed to be derived from (1) and (2) alone, with no notion of entropy needed? Moreover, what exactly do you want to know? – Quillo Sep 19 '22 at 08:05
- @Quillo Yes, eq. (3) is derived entirely from eqs. (1) and (2) and the principle of maximum entropy (which has nothing to do with the assumption of equal probability), without using the assumption of equal probability. My question should be obvious: I derived equal probability without relying on the assumption of equal probability, so equal probability is no longer a hypothesis. – Zhao Dazhuang Sep 19 '22 at 10:20
- Maximum entropy [which is necessary to obtain (3)] gives equal probabilities and justifies the microcanonical ensemble: E. T. Jaynes, "Information Theory and Statistical Mechanics", Phys. Rev. 106, no. 4, pp. 620–630, 1957. http://compbio.biosci.uq.edu.au/mediawiki/upload/b/b3/Jaynes_PhysRev1957-1.pdf – Quillo Sep 19 '22 at 10:32
- Try also taking a look at this question: Dispensing with the "a priori equal probability" postulate – Quillo Sep 19 '22 at 10:34
- @Quillo It is not strictly necessary, though. One can similarly start from the postulate of equal a priori probabilities for the overall system and then make some approximations in tracing out the bath, roughly speaking. – Tobias Fünke Sep 19 '22 at 13:23
- @Quillo You mean that maximum entropy depends on equal probabilities, right? – Zhao Dazhuang Sep 20 '22 at 00:37
- Maximum entropy is MORE fundamental than equal probabilities and is the justification of equal probabilities; see the paper by Jaynes: the most "unbiased" distribution of energies is the one that leaves the maximum possible residual uncertainty about the problem. The result is the microcanonical ensemble. – Quillo Sep 20 '22 at 18:51
- @JasonFunderberker That's the "old" story; after Jaynes, the modern interpretation is different. First comes maximum entropy, then equal probabilities. – Quillo Sep 20 '22 at 18:53
- @Quillo I know, right. But there is no need for maximum entropy / an information-theoretic approach at all (although I favor this approach). – Tobias Fünke Sep 20 '22 at 18:53
- @Quillo Then eqs. (1), (2) and the principle of maximum entropy do not depend on the principle of equal probability. I have deduced eq. (3) from those three, and eq. (3) proves equal probability. How can this be explained? – Zhao Dazhuang Sep 21 '22 at 02:20
- @Quillo Or do I understand you to mean that the assumption of equal probability has been proven correct by Jaynes via the principle of maximum entropy? – Zhao Dazhuang Sep 21 '22 at 02:38
- There is no 'proof'. In MaxEnt the case of equal probabilities is a special case, namely the case where you maximize the entropy only under the constraint that $\{p_i\}$ is a probability distribution, i.e. $\max S[\{p_i\}]$ under the constraints $\sum\limits_i p_i = 1$ and $0 \leq p_i \leq 1$ yields $p_i = \text{const}$. – Tobias Fünke Sep 21 '22 at 07:40
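This special case can be illustrated numerically: with only the normalization constraint, no distribution has higher entropy than the uniform one, whose entropy is $\ln n$. A small sketch; the dimension and sample count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    # Gibbs/Shannon entropy S = -sum_i p_i ln p_i
    return -np.sum(p * np.log(p))

n = 5
uniform = np.full(n, 1.0 / n)
S_uniform = entropy(uniform)          # equals ln(n)

# Sample many random normalized distributions; the uniform one
# should beat all of them in entropy.
samples = rng.random((1000, n))
samples /= samples.sum(axis=1, keepdims=True)
S_max_sampled = max(entropy(p) for p in samples)
print(S_uniform, S_max_sampled)
```

Adding further constraints, such as a fixed mean energy, singles out non-uniform maximizers instead; the canonical distribution (3) is exactly that case.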
- @JasonFunderberker Sorry, I'm a beginner and I'm trying to understand what you mean. Do you mean the principle of maximum entropy is possible only under equal probability? But didn't the other commenter above say that the principle of maximum entropy is more fundamental than equal probability? – Zhao Dazhuang Sep 21 '22 at 07:56
- No, that is not what I mean. Please read again. The case of equal probabilities is a special case of the MaxEnt principle. – Tobias Fünke Sep 21 '22 at 07:59
- @JasonFunderberker On second reading, I understand you to mean that the entropy can be maximized only when the microstates are all equally probable; if they are not, the entropy has not reached its maximum value. But the principle of maximum entropy always holds. – Zhao Dazhuang Sep 21 '22 at 08:05
- Sorry, I don't understand. Perhaps it would be best if you read the linked paper by Jaynes; he explains it much better than I do. If you still have a question then, open a new one; the comment section is not really suited for this. – Tobias Fünke Sep 21 '22 at 09:17