2

We have the thermodynamic definition of entropy, $\Delta S = q_{rev}/T$. If the heat transferred is the same for two processes at different temperatures, this implies that the process occurring at the lower temperature generates more entropy than the same process occurring at the higher temperature. Why is this the case? Why isn't the change in entropy equal for both processes?
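For concreteness, plugging the same reversible heat into the definition at two temperatures (the values $q_{rev} = 300\ \mathrm{J}$, $T = 300\ \mathrm{K}$, and $T = 600\ \mathrm{K}$ are illustrative assumptions, not tied to any particular process):

$$\Delta S_{\text{cold}} = \frac{300\ \mathrm{J}}{300\ \mathrm{K}} = 1\ \mathrm{J/K}, \qquad \Delta S_{\text{hot}} = \frac{300\ \mathrm{J}}{600\ \mathrm{K}} = 0.5\ \mathrm{J/K}$$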

notorious

1 Answer

2

If you are looking for an intuitive explanation for this, as you said in the comments above, I'll try to give you one here.

Assume the heat transfer takes place in a high-temperature environment. The entropy of that environment is already high, since its particles are already strongly agitated and its energy is dispersed in a disordered way. Now imagine the same heat transfer in a low-temperature environment, where the entropy is initially lower. The impact of this heat transfer on the entropy is greater than that of the same transfer in the high-temperature environment, because the energy transferred into the low-temperature environment makes up a larger fraction of its overall energy.

Sorry if I am unclear; it seems logical to me, but it is quite hard to explain. Let me give you a simple example with numbers.

Say the high-temperature environment has an initial energy of 100 and the low-temperature environment an initial energy of 10 (in whatever units you like). If the energy transferred is 5 in the same units, then this heat transfer increases the energy of the high-energy environment by 5% but increases the energy of the low-energy environment by 50%. So the same heat transfer affects the low-temperature environment far more than the high-temperature one.
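Here is a rough sketch of that arithmetic. The energies (100 and 10) and the transfer (5) come from the example above; the temperatures of 400 K and 100 K are assumed values added only to connect the comparison back to $\Delta S = q_{rev}/T$:

```python
# Sketch of the comparison above. Energies (100, 10) and the transfer (5) are
# taken from the example; the temperatures (400 K, 100 K) are assumed purely
# for illustration.

q = 5.0                          # heat transferred, same in both cases

# Relative impact of the same transfer on each environment's energy content
E_hot, E_cold = 100.0, 10.0
print(f"hot environment:  +{q / E_hot:.0%} of its energy")    # +5%
print(f"cold environment: +{q / E_cold:.0%} of its energy")   # +50%

# The same comparison via the thermodynamic definition dS = q_rev / T
T_hot, T_cold = 400.0, 100.0     # assumed temperatures in kelvin
dS_hot, dS_cold = q / T_hot, q / T_cold
print(f"dS_hot = {dS_hot:.4f}, dS_cold = {dS_cold:.4f}")      # 0.0125 vs 0.0500
```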

Hope you see what I mean; ask me questions if anything is unclear.