I started learning thermodynamics mostly through independent study, and I basically built up my own definitions of terms that appeared to fit what was going on. They seemed to work, but my question is whether this is how they are actually supposed to be viewed.
Equipartition theorem. 'It is possible to show that, at equilibrium, molecules share an equal amount of energy between the multiple independent coordinates, or degrees of freedom, that fully describe their states, so long as the energy term is quadratic in each coordinate.' The simplest example is an ideal gas, whose energy is $\frac{1}{2}mv_x^2+\frac{1}{2}mv_y^2+\frac{1}{2}mv_z^2$ by Pythagoras' theorem. The $x,y,z$ terms are statistically the same because rotating the coordinates doesn't change the Pythagorean distance.
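As a sanity check on this statement, here is a small Monte Carlo sketch (my own construction, with illustrative numbers): each velocity component of a molecule at temperature $T$ is Gaussian with variance $kT/m$, so the average energy in each quadratic coordinate should come out to $\frac{1}{2}kT$, the same for all three axes.

```python
import math
import random

# Sketch: sample velocity components from the Maxwell-Boltzmann
# distribution (each component Gaussian with variance kT/m) and check
# that each quadratic degree of freedom carries the same average
# energy, (1/2) kT. The mass is illustrative (roughly a helium atom).
k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # temperature, K
m = 6.6e-27        # particle mass, kg (illustrative)
sigma = math.sqrt(k * T / m)   # std dev of each velocity component

random.seed(0)
N = 200_000
energies = [0.0, 0.0, 0.0]
for _ in range(N):
    for axis in range(3):
        v = random.gauss(0.0, sigma)
        energies[axis] += 0.5 * m * v * v

mean_energy = [e / N for e in energies]
target = 0.5 * k * T
for axis, e in zip("xyz", mean_energy):
    print(f"<1/2 m v_{axis}^2> = {e:.3e} J   (1/2 kT = {target:.3e} J)")
```

The three averages agree with $\frac{1}{2}kT$ to within sampling noise, and they stay equal under any rotation of the axes, as the Pythagorean argument suggests.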
'Temperature is the rate of energy transfer away from a point due to particular collisions of molecules (not the net transfer).' To assign a number to this, we calculate the work done on a molecule in a given collision. Collisions occur in one degree of freedom, and the velocity of approach is statistically the same as the velocity of separation. So, for instance, on average for an ideal gas,
$kT = 2 \times \frac{1}{2}mv_x^2$.
where $k$ is the Boltzmann constant. $\frac{1}{2}mv_x^2$ is roughly $E/a$, where $E$ is the energy of a particle and $a$ is its number of degrees of freedom. I have been able to apply calculus to this to deduce lots of useful facts, such as that the heat capacity of an ideal gas is $\frac{3}{2M_r}kN_A$, that at equilibrium $PV = NkT$, and that if little heat is exchanged between an equilibrium gas and its surroundings, $TV^{\frac{2}{a}} = \text{constant}$.
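For the last of these, a quick numerical check (a sketch with illustrative numbers, not a derivation): integrating $dE = -P\,dV$ for a slow compression with no heat exchange, using $E = \frac{a}{2}NkT$ and $PV = NkT$, the quantity $TV^{2/a}$ should stay constant.

```python
# Sketch: Euler-integrate dE = -P dV for a quasi-static adiabatic
# compression of an ideal gas with a degrees of freedom, using
# E = (a/2) N k T and P = N k T / V, then check that T * V**(2/a)
# is unchanged. All numbers are illustrative.
k = 1.380649e-23
N = 1.0e22        # number of molecules (illustrative)
a = 3             # degrees of freedom (monatomic gas)

T = 300.0         # K
V = 1.0e-3        # m^3
invariant_0 = T * V ** (2 / a)

steps = 100_000
dV = (0.5e-3 - V) / steps   # compress to half the volume
for _ in range(steps):
    P = N * k * T / V
    dE = -P * dV                    # work done on the gas
    T += dE / (0.5 * a * N * k)     # dE = (a/2) N k dT
    V += dV

invariant_1 = T * V ** (2 / a)
print(f"final T = {T:.1f} K, invariant ratio = {invariant_1 / invariant_0:.6f}")
```

Halving the volume of a monatomic gas ($a=3$) raises the temperature by a factor of $2^{2/3} \approx 1.587$, and the invariant ratio comes out as $1$ to within integration error.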
'Entropy is the average number of degrees of freedom that contain energy, for a given molecule in a system.' It is
$$s = \frac{a_{\text{mean}}}{n},$$
where $n$ is the number of moles. This can change with temperature: for example, increasing the temperature of oxygen gas will 'free up' coordinates in a greater proportion of molecules, causing $a_{\text{mean}}$ to increase. The $s$ of the universe is always increasing; much like water spreading through an ice cube tray, energy spreads through the existing empty coordinates.
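The spreading picture can be illustrated with a toy model (entirely my own construction, not a standard calculation): start with all the energy in one coordinate and let random pairwise 'collisions' repartition energy between coordinates. The number of coordinates holding appreciable energy grows, which is the ice-cube-tray intuition.

```python
import random

# Toy model: 100 coordinates, all energy initially in coordinate 0.
# Each "collision" picks two coordinates at random and redistributes
# their combined energy randomly between them. Energy spreads out.
random.seed(1)
n_coords = 100
energy = [0.0] * n_coords
energy[0] = 100.0

def occupied(es, threshold=0.01):
    """Count coordinates holding more than a token amount of energy."""
    return sum(1 for e in es if e > threshold)

before = occupied(energy)
for _ in range(10_000):
    i, j = random.sample(range(n_coords), 2)
    total = energy[i] + energy[j]
    split = random.random()            # random repartition of the pair's energy
    energy[i], energy[j] = split * total, (1 - split) * total
after = occupied(energy)
print("occupied coordinates before:", before, "after:", after)
```

The count goes from 1 to nearly all 100 coordinates and never spontaneously returns to 1, loosely mirroring the one-way spreading described above.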
'Enthalpy change is the amount of energy transferred (per mole) from the surroundings to the system, measured at constant temperature and pressure.' When it is negative, the energy goes into freeing up the coordinates of surrounding molecules, provided the 'universe' remains at roughly the same temperature. If the enthalpy change is zero and the entropy change of the system is positive, all the energy released in the reaction frees up coordinates within the system. We can therefore say $\Delta S_{\text{surroundings}} = -\frac{\Delta H}{T}$.
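A small numeric illustration of that last relation, using a hypothetical exothermic reaction (the numbers are made up for illustration):

```python
# For a hypothetical reaction releasing 100 kJ/mol at 298 K, the
# entropy gained by the surroundings is -dH/T.
dH = -100_000.0   # J/mol, hypothetical reaction enthalpy (exothermic)
T = 298.0         # K
dS_surr = -dH / T
print(f"dS_surroundings = {dS_surr:.1f} J/(mol K)")
```

A negative $\Delta H$ gives a positive $\Delta S_{\text{surroundings}}$: the released energy frees up coordinates outside the system.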
'Gibbs free energy (per mole) is proportional (negatively) to the amount of energy which has gone into freeing new coordinates in the 'universe' before and after an event at constant temperature and pressure.' Dimensionally it can be seen as $Ts - T(s_{\text{system}} + s_{\text{surroundings}}) = E_i - E_f$, where $E_i$ and $E_f$ are the initial and final heat energies of the world. This leads to $-T(\Delta s_{\text{system}} + \Delta s_{\text{surroundings}}) = \Delta G < 0$, so
$$\Delta H - T\Delta s_{\text{system}} = \Delta G < 0$$
for any feasible reaction. I'm not sure about this, because it suggests that energy has entered the closed system.
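As a worked check of the feasibility condition, here are approximate textbook values for melting ice ($\Delta H \approx +6010$ J/mol, $\Delta S_{\text{system}} \approx +22.0$ J/(mol K)); the sign of $\Delta H - T\Delta S_{\text{system}}$ flips near 273 K, as it should:

```python
# Feasibility check dG = dH - T * dS for melting ice, using
# approximate textbook values for the enthalpy and entropy of fusion
# of water. Melting should be feasible above ~273 K and not below.
dH = 6010.0      # J/mol, enthalpy of fusion of water (approximate)
dS = 22.0        # J/(mol K), entropy of fusion of water (approximate)

for T in (263.0, 273.0, 283.0):
    dG = dH - T * dS
    print(f"T = {T:.0f} K: dG = {dG:+.0f} J/mol, feasible = {dG < 0}")
```

At 263 K the change is positive (ice stays frozen), and at 283 K it is negative (ice melts), matching everyday experience.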
So, looking at some other questions, I found this. Scroll to OrangeSherbet's answer; it bears similarities with mine. I understand that there are probably more useful, intuitive and computable definitions of entropy, but is there something fundamentally wrong here?
– May 01 '18 at 22:07