There are various questions one would have to answer in order to claim that there have been large changes in decay rates over geological time. Here is what I think might be the best experiment to test that claim.
Without using radiological evidence, one can deduce that the Earth is at least a billion years old by counting annual sedimentation layers and measuring the thicknesses of rock strata, cross-correlating between them by the presence of identical or near-identical fossil species. This is what Victorian geologists did, leading to the only case I know of where geology beat physics at deducing the truth. The physicists asserted that the world could not be much older than about 50 million years, because no energy source known at the time (chemical burning or gravitational contraction) could keep the Sun shining for longer than that. The geologists insisted on at least a billion years: if it wasn't chemistry, something else must be powering the Sun. They were right. The Sun shines by then-unknown nuclear fusion, not chemistry. BTW, it's "at least" because it is hard to find sedimentary rocks more than a billion years old, and such rocks do not contain helpful fossils. Tectonic activity has erased most evidence of Precambrian ages ... except for zircons, but I'm jumping ahead.
Now, jump forward to today, when we can do isotopic microanalysis of uranium and lead inside zircon (zirconium silicate) crystals. (Skip to the next paragraph if you already know about radio-dating zircons.) Zircon has several uniquely useful properties: an extremely high melting point; extreme hardness, greater than quartz; high density; omnipresence (zirconium in molten rock crystallizes into zircons as the melt cools, among the very first minerals to do so); and most importantly, a very tight crystal structure, which cannot accommodate most other elements as impurities at formation. The main exception is uranium. Lead, in particular, is excluded, so the only way lead can get into a zircon crystal is as uranium that decays into lead after the crystal has solidified from the melt. That uranium comes in two isotopes with different half-lives, and each decay chain ends in a different lead isotope: $^{238}$U ends at $^{206}$Pb, and $^{235}$U at $^{207}$Pb. By measuring the relative concentrations of the two lead and two uranium isotopes in a zircon, you can deduce the time since it formed using two different "clocks". These zircons are typically the size of grains of sand, so a rock sample will contain millions of independent "clocks", allowing good statistical analysis.
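To make the two-clock arithmetic concrete, here is a minimal Python sketch. The decay constants are the standard present-day values (Jaffey et al., 1971); the Pb/U ratios are invented numbers for a hypothetical grain, chosen to come out roughly concordant:

```python
import math

# Present-day decay constants (per year), the standard values used in
# U-Pb geochronology (Jaffey et al., 1971).
LAMBDA_238 = 1.55125e-10   # 238U -> 206Pb, half-life ~4.468 Gyr
LAMBDA_235 = 9.8485e-10    # 235U -> 207Pb, half-life ~0.704 Gyr

def u_pb_age(pb_over_u, lam):
    """Age from a measured radiogenic Pb*/U atom ratio: Pb*/U = e^(lam*t) - 1."""
    return math.log(1.0 + pb_over_u) / lam

# Invented ratios for one hypothetical zircon grain (illustration only)
t_206_238 = u_pb_age(0.30, LAMBDA_238)   # 206Pb/238U clock
t_207_235 = u_pb_age(4.25, LAMBDA_235)   # 207Pb/235U clock

print(f"206Pb/238U age: {t_206_238 / 1e9:.2f} Gyr")   # ~1.69 Gyr
print(f"207Pb/235U age: {t_207_235 / 1e9:.2f} Gyr")   # ~1.68 Gyr
# When the two ages agree like this, the grain is called "concordant":
# both clocks tell the same story.
```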
So, let's find some zircons in an igneous intrusion into a sedimentary rock whose age we know, roughly, by Victorian geology. It's best if the igneous rock is one which formed at great depth and temperature, where any pre-existing zircons would have dissolved back into the melt. The presence of high-pressure metastable minerals such as diamond or coesite would let us deduce this, and the fact that all the zircons have the same lead-to-uranium ratios would confirm the deduction. Otherwise one would expect to find a mix of young and older zircons. Choose the youngest population, which would have crystallized at the time of the intrusion, rather than having been recycled by tectonic activity from an older time. (In some cases that older time is the primaeval solidification of the Earth's crust, which gives one of our best estimates of the planet's age, but that's not relevant here.)
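For picking out that youngest population from a pile of grain ages, here is one simple-minded sketch. The clustering rule and tolerance are invented for illustration; real geochronology weighs per-grain uncertainties and uses proper mixture modelling:

```python
import statistics

def youngest_cluster(ages_myr, rel_tol=0.02):
    """Pick the youngest coherent population of grain ages.

    Walk the ages in ascending order, keeping each grain that stays within
    rel_tol of the running mean; the first large jump marks the start of
    older, inherited grains. A toy heuristic, not a published algorithm.
    """
    ages = sorted(ages_myr)
    cluster = [ages[0]]
    for age in ages[1:]:
        if age <= statistics.mean(cluster) * (1 + rel_tol):
            cluster.append(age)
        else:
            break  # everything beyond this is older, recycled material
    spread = statistics.stdev(cluster) if len(cluster) > 1 else 0.0
    return statistics.mean(cluster), spread, len(cluster)

# Hypothetical grain ages (Myr): a magmatic population near 410 Myr
# plus a few grains recycled from much older crust.
grains = [408, 409, 410, 410, 411, 412, 413, 980, 1850, 2700]
mean, spread, n = youngest_cluster(grains)
print(f"intrusion age ~ {mean:.0f} +/- {spread:.0f} Myr from {n} grains")
```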
Now, compare the age deduced from radioactive decay to the less accurate age from Victorian geology. If the rate of radioactive decay has changed greatly over geological deep time, there will be a disagreement between these two estimated ages. Furthermore, the disagreement will differ for intrusions of different ages (as judged by Victorian geology), but will be consistent for intrusions of similar age in different locations.
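As a toy illustration of the signature to look for, suppose $^{238}$U decayed $k$ times faster than today before some cut-off epoch, and ask what a constant-rate age calculation would then report. The factor $k$, the epoch, and the step shape are all invented for illustration:

```python
import math

LAMBDA_238 = 1.55125e-10   # per year, present-day value

def apparent_age(true_age, k=2.0, t_fast=1.0e9):
    """Radiometric age a zircon would show if 238U decayed k times faster
    than today at all times earlier than t_fast years ago (toy model)."""
    # Total decay "exposure": present rate for the last t_fast years,
    # k times that rate before then.
    exposure = LAMBDA_238 * (min(true_age, t_fast)
                             + k * max(0.0, true_age - t_fast))
    pb_over_u = math.exp(exposure) - 1.0           # accumulated Pb*/U
    return math.log(1.0 + pb_over_u) / LAMBDA_238  # read back at constant rate

for true_age in (0.5e9, 1.0e9, 2.0e9, 3.0e9):
    t_app = apparent_age(true_age)
    print(f"true {true_age / 1e9:.1f} Gyr -> radiometric {t_app / 1e9:.1f} Gyr"
          f" (discrepancy {(t_app - true_age) / 1e9:+.1f} Gyr)")
```

Young intrusions come out unaffected while old ones are increasingly overestimated: exactly the age-dependent, location-independent discrepancy described above.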
Look for locations where a sedimentary rock cut by the intrusion is covered by a younger sedimentary rock that is not, so that the age of the intrusion can be bracketed between the ages of the two sedimentary strata. The closer the ages of the two strata, the better.
I do not know whether this has been done (I'd certainly hope it has). Any serious proponent of time-varying radioactive decay needs to research this. If nobody has looked, get out in the field, find those discrepancies, and publish. It might lead to a Nobel prize if they are right. The onus is certainly on the proponents to do this, because otherwise Occam's razor applies to their theory.
Back to the physics. If the observation above fails to uncover strong evidence that radioactive decay rates vary with time, I'd ask proponents another question. It is this: how come the $^{238}$U and $^{235}$U "clocks" in zircons always agree? Radioactive decay (alpha decay, here) is basically quantum tunnelling across a potential barrier, and the half-life depends exponentially on the height of that barrier. Any proposed time variation would mean that the barrier height varied in deep time, yet in such a way that the relative decay rates of $^{235}$U and $^{238}$U never changed. That is a big ask of any such theory, given the exponential sensitivity to changes.
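To put a number on that sensitivity, here is a toy Gamow-model sketch in Python. It keeps only the tunnelling exponent $2\pi\eta = 2\pi Z_1 Z_2 \alpha/(v/c)$ with $v/c = \sqrt{2Q/m_\alpha c^2}$, ignores all prefactors, and perturbs the alpha-decay energy $Q$ of both isotopes by the same fraction, as a crude stand-in for "the barrier changed":

```python
import math

# Toy Gamow-tunnelling model: lambda is proportional to exp(-2*pi*eta),
# with all prefactors ignored.
ALPHA = 1 / 137.036        # fine-structure constant
M_ALPHA = 3727.4           # alpha-particle rest energy, MeV
Z_DAUGHTER = 90            # thorium daughter, same for both U isotopes

def gamow_exponent(q_mev):
    """2*pi*eta, the exponent controlling the tunnelling probability."""
    v_over_c = math.sqrt(2.0 * q_mev / M_ALPHA)
    return 2.0 * math.pi * 2 * Z_DAUGHTER * ALPHA / v_over_c

# Present-day alpha-decay energies Q (MeV)
q_values = {"U-238": 4.27, "U-235": 4.68}

for frac in (0.0, 0.01, 0.05):   # fractional increase in Q ("barrier change")
    boost = {name: math.exp(gamow_exponent(q) - gamow_exponent(q * (1 + frac)))
             for name, q in q_values.items()}
    drift = boost["U-235"] / boost["U-238"]   # how far the two clocks diverge
    print(f"dQ/Q = {frac:4.0%}: lambda(238) x{boost['U-238']:7.3g}, "
          f"lambda(235) x{boost['U-235']:7.3g}, clock ratio drifts x{drift:.2f}")
```

In this toy model a mere 5% barrier change multiplies both decay constants by factors of roughly 50 to 60, yet already drags the two clocks about 17% out of step with each other. The vastly larger rate changes needed to compress billions of apparent years would desynchronize them by large factors, and that is simply not what zircons show.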