I think your description of temperature has intuitive merit - it is a true and important statement about temperature. However, I would not use it as a definition of temperature because, to a certain extent, that would trivialize the second law of thermodynamics.
The second law of thermodynamics says that we can't move heat from a body at lower temperature to a body at higher temperature without doing some work (the amount of work required can be quantified, but that is not critical here). If we define temperature by referencing which way heat spontaneously flows, then the second law loses some of its physical content.
To put it another way, if we use this proposed definition of temperature, the only way to tell which of two bodies is hotter is to put them in contact and see which way the heat flows. That's not really our goal, though. Our goal in thermodynamics is to predict which way heat will flow (and how much work we can extract from the heat flow) before we ever put the bodies in contact, and to do it based on some other property the body has. It would be better to have a concept of temperature independent of direction of heat flow.
There are two good alternative ways to think about temperature. You already mentioned the first. In classical thermodynamics, temperature is simply what a thermometer measures. If we take one mole of an ideal gas and let it come into equilibrium with a system, the system's temperature is $PV/R$, with $P$ the pressure of the gas, $V$ its volume, and $R$ a constant called the gas constant. Thus, temperature is a state variable, and a characteristic of systems in equilibrium.
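As a quick numerical illustration of reading temperature off a gas thermometer via $T = PV/nR$ (the specific pressure and volume below are made-up example values, not from the text):

```python
# Sketch: the temperature implied by an ideal-gas thermometer, T = PV/(nR).
# The numerical inputs below are illustrative, not from the text.

R = 8.314  # gas constant, J/(mol K)

def gas_temperature(pressure_pa, volume_m3, moles=1.0):
    """Temperature of an ideal gas with the given pressure and volume."""
    return pressure_pa * volume_m3 / (moles * R)

# One mole of gas at atmospheric pressure occupying about 24.5 liters:
T = gas_temperature(101_325, 0.0245)
print(round(T, 1))  # roughly room temperature, in kelvin
```

The point is that once the gas reaches equilibrium with the system, two mechanical measurements (pressure and volume) suffice to assign the system a temperature.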
This ideal-gas definition of temperature is more useful than one based on a particular mercury thermometer, because formulas such as the one for the efficiency of an ideal heat pump take a simple form when the gas thermometer defines the temperature scale. Measured against an ideal gas, mercury and other liquids turn out to have coefficients of expansion that vary slightly with temperature, so they aren't good references for precision measurements, though they can be calibrated to agree with an ideal gas thermometer. We must also remember that real gases only approximate an ideal gas, so even in this case we cannot get absolutely perfect temperature measurements.
The most widely-used scale of temperature in science is the Kelvin scale. It equates absolute zero with 0 K and the triple point of water with 273.16 K, then interpolates linearly between those two points using the ideal gas thermometer described above.
The other way to think about temperature is statistically. Suppose you have a bunch of gas molecules bouncing around in a room. At any given moment, some are going fast and others are going slowly. The molecules thus have a spread of kinetic energies.
We then wonder what the distribution of these kinetic energies is. Are they mostly clustered around one value, say all between $0.016$ and $0.020$ electron-volts? Or are they spread out widely from $0$ to $0.1$ electron-volts?
We could imagine the energy distribution having all sorts of different shapes. It could have five different humps, or look like a bell curve, or be flat with a gap missing, etc. It turns out none of these occur. The energy distribution for equilibrium systems comes out looking essentially the same every time: a decaying exponential $\propto e^{-\beta E}$. Every time you go up by a certain fixed amount of energy, the number of particles at that energy gets cut in half. (The particular fixed amount, $\ln(2)/\beta$, changes from system to system; what's universal is the property of getting regularly cut in half.) This is called the Boltzmann distribution.
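The halving property can be checked numerically. This is a minimal sketch, assuming we sample energies with probability density proportional to $e^{-\beta E}$ and then count particles in energy bins of width $\ln(2)/\beta$:

```python
# Sketch: checking the "cut in half" property of the Boltzmann distribution.
# Energies are sampled with density proportional to exp(-beta * E).
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0                       # inverse temperature, arbitrary units
half_step = np.log(2) / beta     # energy step over which counts should halve

energies = rng.exponential(scale=1.0 / beta, size=1_000_000)

# Count particles in consecutive energy bins of width half_step.
edges = np.arange(11) * half_step
counts, _ = np.histogram(energies, bins=edges)
ratios = counts[1:] / counts[:-1]
print(np.round(ratios[:4], 2))   # each bin holds about half the previous one
```

Each successive bin holds roughly half as many particles as the one before it, as the exponential form predicts.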
All you have to do is specify the number $\beta$ and you know everything about the distribution of energy. This is quite a remarkable fact, and not an obvious one. This distribution occurs because it maximizes entropy for a fixed energy.
In this distribution, the average energy is $1/\beta$, and we call this average energy the temperature. (If the energies do not have a Boltzmann distribution for some reason, we could still identify the mean energy as the temperature.) If $\beta$ is small, the temperature is high, and the energy distribution is widely spread and fairly flat; at high temperature, things are quite random. If $\beta$ is big, the temperature is low and energies are clustered near zero; low temperatures are more orderly. (An interesting side note: a perfectly flat energy distribution corresponds to infinite temperature, and if the high-energy states are statistically more likely than the low-energy states, the temperature is negative.)
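That the mean energy of the distribution $\propto e^{-\beta E}$ really is $1/\beta$ is easy to verify by simulation. A minimal sketch, with arbitrary values of $\beta$ chosen for illustration:

```python
# Sketch: the sample mean energy of a Boltzmann distribution approaches 1/beta.
import numpy as np

rng = np.random.default_rng(1)
means = {}
for beta in (0.5, 1.0, 4.0):
    energies = rng.exponential(scale=1.0 / beta, size=500_000)
    means[beta] = energies.mean()
    print(beta, round(means[beta], 3))  # close to 1/beta in each case
```

Small $\beta$ gives a large mean energy (high temperature, broad distribution); large $\beta$ gives a small mean energy (low temperature, energies clustered near zero).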
It turns out that this statistical temperature and the thermodynamic temperature defined earlier agree with one another, up to a scale factor called Boltzmann's constant. So we can think of temperature either way, and the definitions are equally good. Either definition adds content to the second law of thermodynamics by making temperature an independent concept, and the direction of the flow of heat a physical hypothesis that involves temperature.
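Concretely, with $k_B$ Boltzmann's constant ($k_B \approx 1.38 \times 10^{-23}$ J/K), the two notions are related by

$$T = \frac{1}{k_B \beta},$$

so specifying $\beta$ is the same as specifying the absolute temperature on the Kelvin scale.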
Finally, an important property of temperature is that it is well-defined. It is an observed fact that if systems A and B are in thermal equilibrium, and B and C are in thermal equilibrium, then A and C are in thermal equilibrium as well. This is called the zeroth law of thermodynamics.