Time-reversal invariance is broken in the general-relativistic description of our Universe, which used to be smaller and hotter than it is now and which seems to have had a beginning at a definite point in the past. For this reason "total energy" is not a conserved quantity in general relativity, and it's possible to imagine an experiment where you might observe such non-conservation locally.
The "natural" scale of variation in such an experiment is set by the ratio of the time between iterations of the experiment, $\Delta t$, to the age of the Universe, $T\approx 10^{10}\rm\,years$. So two "identical" experiments repeated a decade apart, sensitive to the time evolution of the Universe in a simple and linear way, would see part-per-billion-level changes in their results.
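The arithmetic behind that estimate is a one-liner; here is a minimal sketch (the function name and round numbers are mine, not from the answer):

```python
# Fractional change a linearly-evolving observable would show between two
# runs of an experiment separated by dt_years, in a Universe of age
# universe_age_years.

def fractional_drift(dt_years, universe_age_years=1e10):
    """Fractional change expected if an observable scales linearly
    with the age of the Universe."""
    return dt_years / universe_age_years

print(fractional_drift(10))  # a decade apart: 1e-9, a part per billion
```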
But real part-per-billion sensitivity experiments are hard to compare that way. For instance, one part-per-billion asymmetry measurement I was involved in used optical detectors whose gain drifted, randomly and unavoidably, at a rate of roughly 1% per hour. In order to sustain the assumption that sequential measurements could be directly compared with the precision we wanted, we had to repeat our measurements separated by one millisecond or less. (Other clever tricks are involved, too --- there are many PhD theses on the subject.)
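The millisecond figure follows directly from the stated drift rate; a quick sketch of that rate argument (numbers from the answer, helper name mine):

```python
# A detector whose gain drifts at 1% per hour changes by only about three
# parts per billion over one millisecond, which is why measurements spaced
# a millisecond apart can still be compared at the ppb level.

DRIFT_PER_HOUR = 0.01  # 1% gain drift per hour, as quoted in the answer

def drift_over(dt_seconds):
    """Fractional gain drift accumulated over dt_seconds."""
    return DRIFT_PER_HOUR * dt_seconds / 3600.0

print(drift_over(1e-3))  # one millisecond: ~2.8e-9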
For example, we can currently measure the temperature associated with the cosmic microwave background to be about 2.73 K. You might think to wait a decade and measure it again, hoping to find 2.729 999 997 K, a part-per-billion difference. But instead you'd discover that there are local variations in the CMB temperature at the part-per-million level and larger. Now, if there had been radio astronomers ten million years ago, they might have measured a CMB temperature of 2.732 K, a part-per-thousand change. But if there were, we haven't found their publications yet.
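Where the part-per-thousand figure comes from: the CMB temperature scales as $1/a(t)$, so over a short lookback time $\Delta t$ the fractional shift is roughly $H_0\,\Delta t$. A hedged sketch, using round values of $H_0$ and $T_0$ that are my assumptions rather than figures from the answer:

```python
# First-order estimate of the CMB temperature a short lookback time ago.
# Valid only for dt << 1/H0, since T(t) is proportional to 1/a(t).

H0 = 2.27e-18            # Hubble constant, roughly 70 km/s/Mpc, in 1/s
T0 = 2.725               # present-day CMB temperature, in K
SECONDS_PER_YEAR = 3.156e7

def cmb_temp_ago(lookback_years):
    """Approximate CMB temperature lookback_years ago, to first order."""
    dt = lookback_years * SECONDS_PER_YEAR
    return T0 * (1.0 + H0 * dt)

print(cmb_temp_ago(1e7))  # ten million years ago: about 2.727 K
```

The shift of a couple of millikelvin over ten million years is the part-per-thousand effect described above.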
(My PhD advisor recently re-calibrated an apparatus last used about thirty years ago, which included a plutonium calibration source. The group was troubled by an apparent change of roughly 0.05% in the efficiency of their detector even though the calibration procedure should have done better. Finally someone realized that thirty years is a small but measurable fraction of the lifetime of their plutonium.)
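The size of that effect is easy to check. The answer doesn't name the isotope, so the half-life below is an assumption (Pu-239, 24,100 years); it lands at the same order of magnitude as the observed shift:

```python
import math

# Fraction of a plutonium sample that has decayed after t_years, assuming
# Pu-239 with a 24,100-year half-life (the isotope is my assumption).

HALF_LIFE_YEARS = 24_100.0

def fraction_decayed(t_years, half_life=HALF_LIFE_YEARS):
    """Fraction of the original nuclei that have decayed after t_years."""
    return 1.0 - math.exp(-math.log(2) * t_years / half_life)

print(f"{fraction_decayed(30):.2%}")  # ~0.09%, comparable to the ~0.05% seen
```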
Furthermore, I wouldn't expect an Earthbound measurement to be directly sensitive to the absolute age of the Universe. The closest thing that would couple to it would be the Hubble flow, but that's something that matters between galaxy clusters; those of us who are stuck here on Earth, or even within the Local Group, aren't affected by cosmological expansion.
There is tantalizing observational evidence that the fine-structure constant, and thus the strength of the electromagnetic interaction, may have evolved as the Universe aged or may be different, starting in the fifth decimal place or so, in different regions of space. Either change would mean small but predictable year-on-year shifts in the energies associated with certain atomic transitions. So far, searches for such effects in Earthbound experiments are consistent with no change.
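For a sense of the size of those predicted shifts: nonrelativistic atomic binding energies scale as $\alpha^2$ (the Rydberg energy is $\alpha^2 m_e c^2/2$), so to first order a fractional change in $\alpha$ appears doubled in a gross-structure transition energy. A rough sketch under that simple scaling assumption (real searches use transition-specific relativistic sensitivity coefficients):

```python
# First-order fractional shift of a gross-structure atomic transition
# energy, assuming the simple E ~ alpha**2 scaling of the Rydberg.

def fractional_energy_shift(dalpha_over_alpha):
    """Fractional transition-energy shift for a fractional change in
    the fine-structure constant, under E ~ alpha**2."""
    return 2.0 * dalpha_over_alpha

# A change "in the fifth decimal place" of alpha is ~1e-5 fractionally:
print(fractional_energy_shift(1e-5))  # ~2e-5 shift in the transition energy
```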