Questions tagged [entropy]

An important extensive property of all systems in thermodynamics, statistical mechanics, and information theory, quantifying their disorder (randomness), i.e., our lack of information about them. It characterizes the degree to which the energy of the system is not available to do useful work.

Entropy is defined to be extensive (that is, additive when two independent systems are considered together).

In statistical mechanics the entropy is identified with the Boltzmann constant times the logarithm of the number of microscopic states compatible with a given set of macroscopic properties.
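As an illustration of the two statements above, here is a minimal Python sketch (the toy distributions and function names are invented for this example) that evaluates the Boltzmann form $S = k_B \ln\Omega$ for $\Omega$ equally likely microstates and the Gibbs form $S = -k_B \sum_i p_i \ln p_i$ for a general distribution over microstates; the two agree when the distribution is uniform.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(n_microstates: int) -> float:
    """S = k_B * ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(n_microstates)

def gibbs_entropy(probabilities) -> float:
    """S = -k_B * sum_i p_i * ln(p_i) for a distribution over microstates."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Toy system with 4 microstates (hypothetical numbers, for illustration only).
uniform = [0.25, 0.25, 0.25, 0.25]   # maximal ignorance about the microstate
peaked  = [0.70, 0.10, 0.10, 0.10]   # partial knowledge of the microstate

print(boltzmann_entropy(4))    # k_B * ln 4
print(gibbs_entropy(uniform))  # same value: the uniform case reduces to Boltzmann
print(gibbs_entropy(peaked))   # smaller: less missing information, less entropy
```

The peaked distribution gives a lower value, consistent with reading entropy as missing information about which microstate the system is in.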

2943 questions
18
votes
5 answers

Is entropy of all systems zero at absolute zero?

Is the entropy of every system zero at absolute zero? Or is it just taken to be zero at absolute zero? Are there systems that don't reach zero entropy even at absolute zero?
zeal
  • 275
7
votes
2 answers

Does entropy decrease if time is reversed?

Entropy increases if we let Newton's equations work their magic. Since Newton's equations are time reversible, I would assume that in a closed isolated system, solving the differential equations and running time backwards would increase (and NOT…
6
votes
1 answer

Entropy as a state property

The usual "proof" that entropy is a state property goes like this: "Consider a system which undergoes a reversible process from state 1 to state 2 along path A, and let the cycle be completed along path B, which is also reversible. Since the cycle is…
Kelvin S
  • 1,135
5
votes
1 answer

Second Law of Thermodynamics: confusion over an example

By the second law of thermodynamics, you shouldn't be able to use any number of mirrors/lenses to focus sunlight onto an object and heat it past the surface temperature of the Sun (approximately 5800 K). In principle this makes sense to me, but I'm…
Spaderdabomb
  • 1,437
5
votes
3 answers

Can ice have a higher entropy than water?

I've learnt that entropy is a measure of randomness, and that solids have a more structured form and therefore less entropy. However, I saw a YouTube comment stating the following: a liquid NOT ALWAYS means higher entropy than a solid it…
DarkLightA
  • 1,422
4
votes
3 answers

How are possible microstates discerned in the Gibbs entropy formula?

In the Wikipedia entry on the Gibbs entropy formula, the following definition is given: "The macroscopic state of the system is defined by a distribution on the microstates that are accessible to a system in the course of its thermal fluctuations."…
Speldosa
  • 319
3
votes
1 answer

Why is there an absolute entropy?

Why is there an absolute entropy? Given any non-discrete probability distribution, we don't really have an absolute entropy because the entropy depends on the parametrization of the distribution (e.g. Beta vs. Beta-prime) which was arbitrarily…
Neil G
  • 364
3
votes
5 answers

Entropy - Gas Inside A Closed System Reaches Maximum Entropy

Suppose we fill a box with a certain amount of gas with a specific total energy and allow the gas to reach a maximum-entropy state. What happens next? Would the gas remain in a maximum-entropy state indefinitely? What would prevent the gas…
pZombie
  • 359
3
votes
0 answers

Have the possible non-extensive entropies been classified?

How many non-extensive entropies exist? Tsallis, Havrda-Charvát, Rényi, Kaniadakis, Sharma-Mittal, ... To be more precise, I am wondering if some classification like that of "finite" or Lie groups exists for uniparametric, biparametric or…
riemannium
  • 6,491
2
votes
2 answers

Decrease in entropy in a fluid flow

Let's imagine a section of a pipe through which a fluid, gas for example, flows. When there is no pressure gradient, there is no flow. However, that does not mean that the molecules are at rest. They are "Maxwell-Boltzmann" distributed but the net…
Amey Joshi
  • 2,235
2
votes
2 answers

Can entropy be explained in terms of cleaning/keeping your room clean?

I'm trying to relate the concept of entropy to keeping my room clean as suggested by my high school teacher ~1993... Comparing the two scenarios: Every day I come home and throw an empty can on the floor and every night before bed I pick it up and…
2
votes
1 answer

Explanation of definition of entropy

I am trying to understand the definition of entropy. The lecture notes I use define it as $$ S(E,V) = k_B \ln(\Gamma(E,V)) $$ with $$\Gamma(E) = \int_{E < H(p,q) < E + \Delta} d^{3N}p\ d^{3N}q\ \ \rho(p,q) $$ where $\rho$ is the distribution…
2
votes
1 answer

Why is the change in entropy greater for processes occurring at lower temperatures?

We have the thermodynamic definition of entropy, $\Delta S = q_{rev}/T$. If the heat transferred is the same for two processes at different temperatures, this implies that the same process occurring at the lower temperature would generate more entropy than if…
notorious
  • 233
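As a quick numeric illustration of the $\Delta S = q_{rev}/T$ definition in the excerpt above (the heat value and temperatures below are made up for the example), the same reversible heat transfer produces a larger entropy change when it happens at the lower temperature:

```python
# Same reversible heat transfer evaluated at two temperatures (made-up numbers).
q_rev = 1000.0  # J of heat transferred reversibly

for T in (300.0, 600.0):  # temperatures in kelvin
    delta_S = q_rev / T   # Delta S = q_rev / T
    print(f"T = {T:5.1f} K  ->  Delta S = {delta_S:.3f} J/K")

# Prints 3.333 J/K at 300 K but only 1.667 J/K at 600 K: the same heat
# transferred at the lower temperature gives the larger entropy change.
```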
2
votes
1 answer

Defining Orderly/Chaotic states in terms of entropy?

I'm trying to properly understand the meaning of entropy, and how the universe is moving from an orderly state to a chaotic one. If a glass of wine (for example) only has meaning to a human, what makes a shattered spilled glass any less orderly than…
machinemessiah
  • 435
1
vote
1 answer

Why does a temperature increase at fixed volume increase entropy?

I heard that this statement is correct. However, it seems odd to me. The number of possible microstates is still the same, so isn't the entropy constant?
DarkLightA
  • 1,422