Perhaps counter-intuitively, the radiative heat transfer equations predict that an internally heated plate with constant power input (e.g., electric resistance heating at constant power, or a constant light source) in a vacuum will reach some equilibrium temperature $T$, and that if you then bring a cold plate (i.e. one with no heat source) near it, without touching, the new equilibrium temperature of the hot plate will be higher than $T$!
Indeed, the closer the plates are together (i.e. the larger the view factor $F$ between them), the hotter the hot plate gets, up to a limit. Finally, when they touch, if thermal contact is good they both equalize to close to the initial equilibrium temperature.
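For concreteness, here is a minimal energy-balance sketch of where that prediction comes from, under simplifying assumptions of my own (black plates with $\varepsilon = 1$, surroundings at $\approx 0\,\mathrm{K}$, equal plate areas, the heated plate radiating only from the face looking at the passive plate, and the passive plate radiating from both faces). Here $q$ is the heater power per unit plate area, $T_0$ the hot plate's temperature alone, and $T_1$, $T_2$ the hot and passive plate temperatures with the pair in place:

$$\text{Hot plate alone:}\quad q = \sigma T_0^4 \;\Rightarrow\; T_0 = \left(\frac{q}{\sigma}\right)^{1/4}$$

$$\text{Passive plate balance:}\quad F\,\sigma T_1^4 = 2\,\sigma T_2^4 \;\Rightarrow\; T_2^4 = \tfrac{F}{2}\,T_1^4$$

$$\text{Heated plate balance:}\quad q = \sigma T_1^4 - F\,\sigma T_2^4 = \sigma T_1^4\left(1 - \tfrac{F^2}{2}\right) \;\Rightarrow\; T_1 = \frac{T_0}{\left(1 - F^2/2\right)^{1/4}}$$

So as $F \to 0$ (plates far apart) this recovers $T_1 = T_0$, while as $F \to 1$ (plates very close) it gives $T_1 \to 2^{1/4}\,T_0 \approx 1.19\,T_0$: the hot plate ends up hotter, which is the effect I'm asking about.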
This seems counter-intuitive, as though it should violate the second law of thermodynamics. Calculations and models are all well and good, but my question is: have any experiments been done that actually demonstrate this in action?
This paper reproduces the set-up, but they don't show the result of running just the hot plate alone for comparison. Plus, it isn't done in a vacuum, so it can't fully rule out the extra warming being due to suppressed convective losses rather than radiation.
It seems easy enough to try at home in a vacuum chamber, but I don't have experience rigging things like this up, so it'd be a bit of a learning curve.