As far as I can tell, temperature seems to be defined as something like the average kinetic energy per molecule, but not quite. It looks like it measures something proportional to this average kinetic energy, where the constant of proportionality depends on the number of independent degrees of freedom, by the equipartition principle.
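To pin down what I mean (assuming I have stated the equipartition theorem correctly), I would write the average energy per molecule as
$$\langle E \rangle = \frac{f}{2}\, k_B T,$$
where $f$ is the number of quadratic degrees of freedom and $k_B$ is Boltzmann's constant; the coefficient $\frac{f}{2} k_B$ is what I mean by the proportionality depending on the degrees of freedom.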
If I put a thermometer in some substance and let it reach equilibrium in order to measure the temperature, then I would naively expect the average kinetic energy per molecule in the thermometer to equal the average kinetic energy per molecule in the substance. But if the two materials have different numbers of degrees of freedom, that would mean their temperatures are actually different, which contradicts the idea that they are in thermal equilibrium.
So I conclude that my naive understanding is wrong, and that it is the average kinetic energy per degree of freedom that must reach equilibrium; an object with more degrees of freedom then ends up with more energy per molecule. I can maybe convince myself of this intuitively by imagining a diatomic gas and a monatomic gas interacting, as I sketch below.
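Writing that intuition out (under the usual assumption that rotational modes are active but vibrational ones are frozen out): at a common temperature $T$, each quadratic degree of freedom carries $\frac{1}{2} k_B T$ on average, so a monatomic molecule ($f = 3$, translation only) has
$$\langle E \rangle_{\text{mono}} = \tfrac{3}{2} k_B T,$$
while a diatomic molecule ($f = 5$, translation plus two rotations) has
$$\langle E \rangle_{\text{di}} = \tfrac{5}{2} k_B T.$$
The energy per degree of freedom equalizes, but the diatomic gas ends up with more energy per molecule.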
But then since the thermometer measures something that does reach equilibrium, I suppose it must be directly measuring the average kinetic energy per degree of freedom. Is that accurate?
And if it is, how does it do that? A mercury thermometer measures temperature by thermal expansion. This seems to suggest that thermal expansion is governed by a fixed set of degrees of freedom. I would guess it is the translational kinetic energy (more specifically, since the mercury column expands along only one direction, the translational kinetic energy along that direction). Is this an accurate description of how a thermometer works?
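For reference, the part of the mechanism I think I understand is the calibration step: over the working range, the mercury's volume change is approximately linear in temperature,
$$\Delta V \approx \beta V_0\, \Delta T,$$
with $\beta$ the volumetric expansion coefficient, and confining the mercury to a narrow tube converts that into a column length that can be marked off against fixed points. What I am unsure about is which microscopic degrees of freedom actually determine $\beta$.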