TL;DR The problem with coldness exchange is not intrinsic to thermodynamics (in fact, thermodynamics secretly uses a similar idea), but rather that such a theory makes thermodynamics harder to connect to our other physical theories. Heat is a transfer of energy, and you lose a great deal of engineering utility if you ignore that.
For those with time: Thermodynamics is designed to allow you to predict two things: the flow of heat through systems, and the spontaneity of chemical reactions. With some adjustment of physical laws, coldness works just fine for the second, but not at all for the first.
In ordinary scenarios, temperature is (roughly) proportional to energy. When you heat up some air, the air molecules start moving faster, absorbing energy (this is the kinetic theory of gases). When you heat up a solid, more vibrations (phonons) get created. When you emit light, you create photons--a process that consumes energy. So if we were to describe temperature in terms of coldness, we would betray the underlying physical realities of many applications of thermodynamics. Heat is normally the movement of energy, and if you want to allow the (imperfect) interconversion of mechanical energy and heat, you have to respect that.
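To make the "temperature is (roughly) proportional to energy" point concrete, here is a small sketch using the kinetic-theory result that the average translational kinetic energy per molecule is $ \frac{3}{2}kT $ (the 300 K temperature and the choice of N$_2$ are just illustrative):

```python
import math

# Kinetic theory of gases: mean translational kinetic energy per
# molecule is (3/2) k T, so temperature directly tracks energy here.
k = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                            # roughly room temperature, K
m_N2 = 28.0134e-3 / 6.02214076e23    # mass of one N2 molecule, kg

E_avg = 1.5 * k * T                  # mean kinetic energy per molecule
v_rms = math.sqrt(3 * k * T / m_N2)  # root-mean-square molecular speed

print(f"mean KE per molecule: {E_avg:.3e} J")
print(f"rms speed of N2 at 300 K: {v_rms:.0f} m/s")  # about 517 m/s
```

Doubling $ T $ doubles the mean kinetic energy; a "coldness" variable would make this direct energy bookkeeping needlessly indirect.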
To phrase it in terms of duality: you want a theory similar to the hole model for semiconductors or the Dirac sea. But the hole model and the Dirac sea only work because there is some fixed number of carriers (of charge). There is no fixed number of "energy units" in any realistic application of thermodynamics. Particles can be created and destroyed in quantum field theory, energy is not conserved in GR, and there are certainly no truly isolated systems here on earth. Again, if you want to account for the ways in which your heat engines and biological systems pump energy through their environments, you need a proper sign convention.
You can get away with "coldness" for predicting chemical reactions. Reactions are spontaneous iff the total entropy change (of system plus surroundings) is positive; in practice you look up tabulated enthalpy and entropy changes for the reaction and compute the Gibbs free energy $ G = H - TS $, with spontaneity at $ \Delta G < 0 $. If, under a coldness convention, we counted heat released as positive and redefined $ G = H + TS $ (with spontaneity at $ \Delta G > 0 $), that system would work just fine.
With all that out of the way, I'll let you in on a little-known fact: statistical mechanics, the math behind thermodynamics, already uses coldness as the more fundamental concept. This will take the slightest pinch of calculus, but you can probably get by without it.
For some reason, the states of physical systems most likely to occur are those with maximal entropy. It is an open problem in mathematical physics exactly why this occurs, but when, by chance, you encounter a real-life system that is not maximizing its entropy, it will soon move to change that. (This is why we can predict chemical reactions in the way I described above.)
So consider two interacting systems, $ A $ and $ B $, given some fixed amount of energy to split. The energy can flow between the two systems, and we want to find the distribution that maximizes the entropy. If you know a little calculus, this should scream "Derivative", and, indeed, set $$ 0 = \frac{\partial S}{\partial E_A} = \frac{\partial (S_A + S_B)}{\partial E_A} = \frac{\partial S_A}{\partial E_A} + \frac{\partial S_B}{\partial E_A} $$ where $ S $ denotes entropy and $ E $ energy. This is an awkward formula, because $ \frac{\partial S_B}{\partial E_A} $ depends neither solely on $ A $ nor on $ B $. But there is some fixed $ E = E_A + E_B $, so $ \partial E_A + \partial E_B = 0 $, i.e. $ \partial E_A = -\partial E_B $. Substituting, $$ \frac{\partial S_A}{\partial E_A} = \frac{\partial S_B}{\partial E_B} $$ So in equilibrium, this quantity is equal across different parts of our system. This quantity is called thermodynamic beta, and we use it to define temperature: $$ T = \frac{1}{k\beta} = \frac{1}{k\frac{\partial S}{\partial E}} $$ where $ k $ is just a physical constant.
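The derivation above can be checked numerically. A minimal sketch, assuming a toy ideal-gas-like entropy $ S_i(E) = \frac{3}{2} N_i k \ln E $ (energy-independent constants dropped; the particle numbers and total energy are made up):

```python
import math

k = 1.0              # work in units where Boltzmann's constant is 1
N_A, N_B = 100, 300  # particle counts of the two systems (toy values)
E_total = 8.0        # fixed total energy to split between them

def S_A(E): return 1.5 * N_A * k * math.log(E)
def S_B(E): return 1.5 * N_B * k * math.log(E)

# Grid-search the split E_A that maximizes the total entropy S_A + S_B.
best_EA = max(
    (i / 10000 * E_total for i in range(1, 10000)),
    key=lambda EA: S_A(EA) + S_B(E_total - EA),
)

# At the maximum, the betas dS/dE = (3/2) N k / E of the two systems
# coincide, exactly as the derivative argument predicts.
beta_A = 1.5 * N_A * k / best_EA
beta_B = 1.5 * N_B * k / (E_total - best_EA)
print(best_EA, beta_A, beta_B)
```

For this model the maximum lands at $ E_A/E_B = N_A/N_B $, and the two values of $ \partial S/\partial E $ agree there.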
We don't go straight to $ T $ because $ \beta $ behaves nicely in (wacky) situations like a paramagnet: we want "negative temperatures" to actually be hotter than "positive" ones, a situation $ \beta $ facilitates. In particular, $ T \to +\infty $ means $ \beta \to 0^+ $, so continuing to lower $ \beta $ means taking $ \beta < 0 $. Going back to $ T $, we find $ T < 0 $ too--$ T $ has "passed through" $ \infty $ and wrapped around to the other side. Mathematically this is harmless (it is the same wraparound you get when you glue the ends of the real line into a circle), but it can be hard to wrap your head around--in particular, it breaks the usual ordering of temperatures, since a "hotter" system can have a smaller $ T $, no matter how you try to fix the math. For more complex scenarios, one uses Boltzmann factors, which also have a nice expression in terms of $ \beta $, namely $ e^{-E\beta} $.
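A short sketch of how $ \beta $ handles this smoothly, using Boltzmann populations $ p_i \propto e^{-\beta E_i} $ for a made-up three-level system:

```python
import math

energies = [0.0, 1.0, 2.0]  # toy three-level system

def populations(beta):
    """Boltzmann populations p_i proportional to exp(-beta * E_i)."""
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)            # partition function (normalization)
    return [w / Z for w in weights]

cold = populations(2.0)    # beta > 0: ground state dominates
mid = populations(1e-9)    # beta ~ 0 (T -> infinity): nearly uniform
hot = populations(-2.0)    # beta < 0: the *top* level dominates --
                           # hotter than any positive temperature
print(cold, mid, hot)
```

Sliding $ \beta $ continuously down through 0 inverts the populations with no singularity anywhere, whereas in terms of $ T $ the same sweep passes through $ \pm\infty $.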
To sum up, in statistical mechanics, the logically prior concept to temperature is thermodynamic beta, and, looking at how they are related, $ \beta $ is large iff $ T $ is small--so it is a sort of measure of "coldness." But this only works because statistical mechanics isn't about practical applications--it's about making thermodynamics mathematically rigorous.