I saw this curve on a website, but I don't understand it. What is the lower point of entropy, and what are the points A, B, and C? Thanks!
-
What are you referring to by "lower point of entropy"? – auden Jul 17 '16 at 00:10
-
I am referring to the lower point of entropy between A and B; someone suggested that it was the moment of the Big Bang, when the entropy was very small. – karim Jul 17 '16 at 00:39
-
Okay, I will update my answer to address that; otherwise, I think my answer is complete. – auden Jul 17 '16 at 00:40
-
Please do, and thank you so much; your way of explaining is very good. – karim Jul 17 '16 at 00:44
-
Sorry, do you know if the person who suggested the Big Bang/low point correlation was citing a website, a book, or something? I can find absolutely no information on it. – auden Jul 17 '16 at 00:58
-
No, he didn't :( – karim Jul 17 '16 at 01:02
-
Okay, I updated my answer... I found a source that seems to support that, yes, the lower point is the "Big Bang" or an equivalent. – auden Jul 17 '16 at 01:03
-
Yes it is, but think about it: if the lower point is the very moment of the Big Bang, then time grows from both the left and the right. Is that possible? link – karim Jul 17 '16 at 01:10
-
So are you asking how the past can be both to the left and to the right? – auden Jul 17 '16 at 01:20
-
I am asking if it is possible for time to exist before the Big Bang. – karim Jul 17 '16 at 08:08
-
I see. Well, that depends on who you ask. In the graph, Boltzmann was essentially assuming our universe is in a "mother universe", but since there are problems with his idea, scientists aren't anywhere near sure. Some scientists invoke multiverses, with some universes existing before ours, to give an anthropic explanation of the fine-tuning of the universe, but there isn't any evidence for multiverses. In our universe, at least, it is impossible for time to exist before the Big Bang... there might have been something, just not time. – auden Jul 17 '16 at 12:41
-
I updated my answer to include the information in my comment. – auden Jul 17 '16 at 12:43
-
You could at least have provided a link to the website you looked at - possibly this one? http://s33light.org/post/25967508330. As it stands, the question has no context at all. – sammy gerbil Jul 18 '16 at 11:51
-
@sammygerbil, the website I looked at was linked in the first sentence of the last paragraph. Interesting website that you found. – auden Jul 18 '16 at 12:36
2 Answers
For reference, the diagram is below.
First, I'm just going to give a quick explanation of entropy. Before Boltzmann, people knew about entropy, but they didn't explain it correctly; namely, they thought of it as a measure of the uselessness of an arrangement of gas. As an example, they thought that if you had a box and all of the gas molecules were on one side of that box, that was low entropy, because you could extract useful work by letting the gas expand into the other side. However, if the gas was spread uniformly throughout the box, that was high entropy, because no further work could be extracted from it without adding energy.
Boltzmann, on the other hand, thought (correctly) of entropy as the number of ways a system could be rearranged without anyone noticing. To put this succinctly: let's say Bob and Joe share an apartment, and Joe comes to Bob, obviously upset, saying, "We've been ransacked!" Bob thinks this is nonsense, and as evidence points to his own room: "There's a couple of t-shirts on the floor, a couple of crushed soda cans, the sheets in a tangle. Nothing's different!" But Joe says, "No, no, come to my room! See, the Shakespeare plays are out of alphabetical order, my music collection is all messed up, and the bed sheets are unmade instead of made! Ransacked!"
Now, think about Bob's room. There are probably a ton of different ways you could rearrange it without Bob noticing, right? You could throw an extra shirt onto the ground, you could leave three shirts inside out instead of two, you could shift one a little bit to the left. You could move the sheets on the bed back a few centimeters. The number of different ways you could leave the room changed without Bob noticing is the entropy. There are very few - maybe even zero - ways to rearrange Joe's room without him noticing, so his room is low entropy. But since there are many ways you could rearrange Bob's room so that it is different but he will not notice, it is high entropy.
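To make "number of ways" precise (the formula below is Boltzmann's standard entropy formula, which the analogy is gesturing at; it does not appear on the graph itself): if $W$ is the number of microscopic arrangements that leave the room looking the same, the entropy is

$$S = k_B \ln W,$$

where $k_B$ is Boltzmann's constant. Bob's room has a huge $W$, hence a large $S$; Joe's room has $W$ close to 1, so $S$ is close to zero.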
To translate this to real life, let's say you have a tea kettle, and it's giving off a bunch of steam. Pass your hand through the steam molecules, and it still looks the same, right? The system is high entropy. Now, stack some wooden blocks. Knock them over. You notice, right? The system is low entropy. Going back to the box analogy, there are far fewer ways for the atoms of gas to arrange themselves on one side of the box than there are for them to spread out throughout the box. If the atoms are all on one side of the box, it is low entropy. If they are spread throughout, it is high entropy.
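As a toy illustration of that counting (my own sketch, not part of the original answer; it assumes the only thing that matters is which side of the box each molecule is on):

```python
# A toy count of the "ways" W for N gas molecules in a two-sided box,
# where the only thing we track is which side each molecule is on.
from math import comb, log

N = 100  # toy number of molecules

for k in (0, 25, 50):  # molecules on the left: all on one side -> evenly spread
    W = comb(N, k)     # number of distinct arrangements with k molecules on the left
    print(f"{k:3d} on the left: W = {float(W):.3e}, S/k_B = ln W = {log(W):.1f}")
```

With just 100 molecules, the evenly spread arrangement already has about $10^{29}$ times more arrangements than the all-on-one-side arrangement; for a realistic $10^{23}$ molecules the disparity is astronomically larger.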
With this new understanding of entropy, Boltzmann was able to derive the Second Law of Thermodynamics in a statistical sense. Put simply, there are far more ways for a system to be high entropy than low entropy, so it is no wonder that systems naturally increase in entropy but do not naturally decrease in entropy. This led to a rather unexpected consequence: his definition explained why entropy tended to increase, but not why entropy was so low in the first place. This was now a problem for cosmologists - why did the early universe have such low entropy? Boltzmann solved this problem, but first, it is important to point out that his definition of entropy only holds statistically. Going back, once again, to the box: it is not certain that the molecules of gas will spread out through the box. There is a very low probability that the random motions of the molecules will bring them all to one side of the box (so low that you would expect to wait far longer than the age of the observable universe to see it happen).
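To put a number on "very low probability" (a back-of-envelope estimate of mine, not from the original answer): if each of $N$ molecules is independently equally likely to be on either side, the chance of catching all of them on one particular side at a given instant is

$$P = \left(\frac{1}{2}\right)^{N} \approx 10^{-1.8\times10^{23}} \quad \text{for } N \approx 6\times10^{23} \text{ (one mole of gas)}.$$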
Boltzmann decided to solve the puzzle of the early universe's low entropy by taking advantage of this statistics-only limitation. Instead of the box of gas, think of the entire universe (in a box, if you like). Imagine that it is in thermal equilibrium - in other words, the highest state of entropy possible. Since it's the highest state of entropy possible, the entropy cannot increase, so it will stay steady... except for fluctuations. We can calculate how likely fluctuations are. As you might expect, larger fluctuations are exponentially less likely than small ones, but every type of fluctuation will eventually happen. In other words, maybe our universe is in a state of fluctuation away from its normal equilibrium. The low entropy of the early universe, according to this idea, is a "statistical accident". Okay, now for the graph: we think that we live at either point A or point B. The two are utterly indistinguishable - people living at A consider the direction to the left the "past", and people living at B consider the direction to the right the "past".
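A standard way to quantify "exponentially less likely" (this is the textbook fluctuation formula, not something written on the graph): the probability of a spontaneous fluctuation that dips the entropy an amount $\Delta S$ below its maximum scales as

$$P \propto e^{-\Delta S / k_B},$$

so a dip twice as deep in entropy is not merely twice as rare but exponentially rarer.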
Okay, now it gets a little more complicated. Boltzmann then used anthropic reasoning to explain why we find ourselves in one of the fluctuation regions as opposed to the vast, vast majority of the universe's time, which is spent in thermal equilibrium (anthropic reasoning is based on the anthropic principle, which you can learn more about here). The problem with this is that you might as well say, "That's just the way it is." Instead, we have to ask: okay, if this is true, what experimental results should we see? What properties should we expect to measure? People have done this, and, well, let's just say there are problems.
The most basic problem (and the one that's relevant to the diagram) is called the Boltzmann Brain. The fluctuations we are talking about, the low-entropy fluctuations, are very rare (the lower the entropy goes, the rarer they are). Points like C on the diagram are much more common than points like A or B. So if we explain the low entropy of the early universe with the anthropic principle, we should expect to find ourselves in the minimum possible entropy fluctuation that allows for observers. That minimum fluctuation is a Boltzmann Brain... i.e., a fluctuation that produces a single conscious brain with just enough sensory input to look around and recognize that it exists before going out of existence. These fluctuations are rare, but they are much, much less rare than the type of fluctuation we are in.
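Schematically (my paraphrase of the argument, using the fluctuation formula above): a lone brain requires a far smaller entropy dip than a whole low-entropy universe, so

$$\frac{P(\text{brain-sized dip})}{P(\text{universe-sized dip})} \sim e^{(\Delta S_{\text{universe}} - \Delta S_{\text{brain}})/k_B} \gg 1,$$

which means that, on this reasoning, lone brains should vastly outnumber observers like us - and that is exactly the problem.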
So...why are we in the type of fluctuation we are in? We have no idea. Honestly, none.
Note: I'm including this paragraph for completeness. Feel free to skip it, as it is not related to your graph. A physicist named Josiah Willard Gibbs (1839-1903; Boltzmann lived from 1844 to 1906) developed a different formulation of entropy. You can learn more about his ideas here and here. I'll be expanding this section soon with an explanation.
Now, for your final question. The low point of entropy between A and B is, as far as I can tell, supposed to be the Big Bang (or the "beginning of our universe" equivalent). As per my comment, no one is really sure whether or not there was time before the Big Bang - it depends on who you ask. In the graph, Boltzmann was essentially assuming our universe sits inside a "mother universe", but since there are problems with his idea, scientists aren't anywhere near sure. Some scientists invoke multiverses, with some universes existing before ours, to explain the fine-tuning of our universe, but there isn't any evidence for multiverses. In our universe, at least, it is impossible for time to exist before the Big Bang... there might have been something, just not time.
Hope this helps!
Resources
The graph first appeared in a paper by Boltzmann that I am still trying to find. He wrote several papers about (or with mention of) his "H-curve", and I have found three online, for free. One is called Ueber die sogenannte H-Curve, or On the so-called H-curve. The copy of the paper I have linked to is in German; I could not find an English translation. For reference, this paper was written in 1897. The second is called On Certain Questions of the Theory of Gases; the section relevant to the H-curve is on the second page of the pdf, in the second column, in the paragraph whose first sentence reads "Let us now take a given rigid vessel..."; the pdf is here.
The third is called On Zermelo's Paper "On the Mechanical Explanation of Irreversible Processes". This one is in English; however, the section on his H-curve is very limited. It starts on the seventh page of the pdf, below Figure 1 in the Appendix. Here is the pdf. This paper was a response to a paper by Ernst Zermelo called On the Mechanical Explanation of Irreversible Processes; this paper can be read here. There were two more papers between Boltzmann and Zermelo in 1896 that I can't find for free online.
You can find more information on this website. There is also a very interesting book called Ludwig Boltzmann: The Man Who Trusted Atoms by Carlo Cercignani, which I found a pdf of here. The book has many quotations from Boltzmann's writings; the relevant section is on page 148 (the beginning of Section 6.4, The So-Called H-Curve). A very interesting summary of the Zermelo-Boltzmann papers with extensive quotations is in Ernst Zermelo - Collected Works, Vol. 2; the link to the Google book sample is available here. The relevant section, called "The Zermelo-Boltzmann Controversy", starts on page 203. A paper that may be of use (as it is an overview of Boltzmann's work), called Rereading Ludwig Boltzmann, can be found here. One last paper: Boltzmann's H-theorem, its limitations, and the birth of (fully) statistical mechanics by Harvey Brown and Wayne Myrvold can be found here.

-
Fantastic effort! However, I have (at least) one quibble: "No one is going to notice they have been ransacked if their room was messy to begin with. However, if it was an orderly room, you will certainly notice if you were ransacked." Someone can notice a change in a messy room: it is a matter of prior knowledge of what was where, not "orderliness". Of course a "messy" room probably belongs to a person with a "messy" mind who would not notice the difference. But that is a human fallibility, not a physical one. – sammy gerbil Jul 18 '16 at 11:38
-
@sammygerbil, fair point. I could argue, however, that if you've got a couple of shirts scattered around and I picked them up and rescattered them (especially if it's a really messy room) you won't notice. I can see where you're coming from, though. – auden Jul 18 '16 at 12:34
-
It works at the other extreme also: had the plays of Shakespeare been re-arranged into chronological order instead of alphabetical, this is equally "orderly", but it would still be noticed. The point I am making is that I do not think your succinct analogy has captured the essence of Boltzmann's criterion. (But will anybody else notice? ;) – sammy gerbil Jul 18 '16 at 14:04
-
@sammygerbil, it's an analogy. It's not perfect. Good points, though. =) – auden Jul 18 '16 at 14:05
-
An analogy should not merely entertain but crucially illustrate the central principle which it is intended to illustrate. In this case, that entropy is "the number of ways a system could be rearranged without anyone noticing." Your analogy contains nothing about "the number of ways" in which the ransacking could be done without either Bob or Joe noticing. – sammy gerbil Jul 18 '16 at 14:26
-
@sammygerbil, that's a good point, and I updated my answer to address that. – auden Jul 18 '16 at 14:31
I'm not sure what you mean by the "Lower Point"; there is no such label in your diagram. The entropy axis is, I believe, linear, but it is not necessarily bounded by zero entropy at the lower horizontal edge. In other words, you should read the curve as depicting a system that spends most of its time at maximum entropy and experiences random deviations from that maximum. The time segment shown is selected from an infinite stretch of time because it contains a large (random) deviation from the high-entropy state.

The points labeled A, B, and C correspond to the same entropy value (the same number of accessible microstates), but they are surrounded by three different gradients: negative slope, positive slope, and a zero-slope peak. Boltzmann suggested that the direction of time might be determined by the slope surrounding our state (in which case increasing time would run left to right for B, right to left for A, and neither for C).

Because, in the statistical theory of entropy, deviations from maximum entropy are exponentially less likely the larger they are, finding our universe (our state) at either A or B is almost infinitely less likely than finding ourselves at C. What the diagram is meant to suggest at C is that the order we perceive around us is far more likely to be due to randomness than to cause and effect (i.e., dynamically derived from an earlier state). A related concept is the Boltzmann brain.
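If it helps to see the shape of the curve emerge, here is a minimal simulation sketch (my own toy model, not Boltzmann's actual calculation), using the Ehrenfest two-urn model as a stand-in for the gas: molecules hop at random between two halves of a box, and the entropy of the macrostate hovers near its maximum with rare, short-lived dips, qualitatively like the H-curve.

```python
# Ehrenfest two-urn model: a toy system whose entropy, tracked over time,
# stays near its maximum and shows rare random dips (an H-curve in miniature).
import random
from math import comb, log

N = 50           # toy number of molecules
k = N // 2       # start at equilibrium: half on the left side
steps = 200_000
min_S, min_t = float("inf"), 0

for t in range(steps):
    # pick a molecule uniformly at random; it hops to the other side
    if random.random() < k / N:
        k -= 1
    else:
        k += 1
    S = log(comb(N, k))  # entropy (in units of k_B) of the macrostate "k on the left"
    if S < min_S:
        min_S, min_t = S, t

print(f"maximum possible S/k_B = {log(comb(N, N // 2)):.2f}")
print(f"deepest fluctuation seen: S/k_B = {min_S:.2f} at step {min_t}")
```

Points like A and B in the diagram sit on the slopes on either side of such a dip; the deeper the dip, the exponentially longer you typically have to wait to see one.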
Update: Sorry, I should have mentioned that the diagram is from a paper Boltzmann wrote; I'm not sure which one. So the labels A, B, and C don't necessarily correspond to anything that the website/journalist is speaking about. They are points at which the macrostate (the entire observable universe) consists of the same set of microstates - although, to be fair, Boltzmann's main topic was ideal gas thermodynamics.