Given that there is no change in relative kinetic energy attributable to the expansion of the universe, what accounts for the difference between the energy emitted and the energy absorbed (i.e. the redshift) for two very distant massive bodies whose relative velocity exists only because of the expansion of the universe and is therefore effectively zero?
Consider a photon emitted some time ago in a far-off galaxy and absorbed today on Earth. Assume also that there is no peculiar velocity involved, and that it was emitted by an energy transition in a hydrogen atom. The wavelength we would see, were we able to observe the photon, would be greater than that of a similar photon emitted on Earth, and we might put that redshift down to the space-stretching effect of the expansion of the universe between emission and absorption. All well and good in terms of observation, but not so good in terms of energy conservation. The photon energy, which depends only on its frequency, is lower than expected. But unless there is another energy transfer taking place, the energy emitted must equal the energy absorbed, right? How can this be explained?
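To put a number on the discrepancy, here is a minimal sketch (the redshift value is illustrative, not from any particular galaxy). Wavelengths stretch with the scale factor, so λ_obs = (1 + z) λ_emit, and since E = hc/λ the absorbed energy is E_obs = E_emit / (1 + z):

```python
# Energy "lost" by a Lyman-alpha photon to cosmological redshift.
# Assumptions: z = 2.0 is an arbitrary illustrative redshift.

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s

lambda_emit = 121.567e-9   # Lyman-alpha line of hydrogen, m
z = 2.0                    # illustrative cosmological redshift

E_emit = h * c / lambda_emit   # photon energy at emission
E_obs = E_emit / (1 + z)       # photon energy at absorption

print(f"emitted:  {E_emit:.3e} J")
print(f"absorbed: {E_obs:.3e} J")
print(f"deficit:  {E_emit - E_obs:.3e} J")
```

At z = 2 fully two thirds of the photon's emitted energy is unaccounted for, which is exactly the bookkeeping puzzle posed above.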