We have that concrete proof.
"The radioactive decay rates of nuclides used in radiometric dating have not been observed to vary since their rates became directly measurable, at least within limits of accuracy. This is despite experiments that attempt to change decay rates (Emery 1972). Extreme pressure can cause electron-capture decay rates to increase slightly (less than 0.2 percent), but the change is small enough that it has no detectable effect on dates.
Supernovae are known to produce a large quantity of radioactive isotopes (Nomoto et al. 1997a, 1997b; Thielemann et al. 1998). These isotopes produce gamma rays with frequencies and fading rates that are predictable according to present decay rates. These predictions hold for supernova SN1987A, which is 169,000 light-years away (Knödlseder 2000). Therefore, radioactive decay rates were not significantly different 169,000 years ago. Present decay rates are likewise consistent with observations of the gamma rays and fading rates of supernova SN1991T, which is sixty million light-years away (Prantzos 1999), and with fading rate observations of supernovae billions of light-years away (Perlmutter et al. 1998).
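The "predictable fading rates" mentioned above are just the exponential decay law: if the decay constant were different in the past, the light curve would fade at a different rate than we observe. A minimal sketch (using Co-56, the isotope that powered SN1987A's late-time light curve; its half-life of roughly 77 days is an approximate figure used here for illustration):

```python
import math

def activity(a0, t_days, half_life_days):
    """Remaining activity after t_days, from the exponential decay law
    A(t) = A0 * exp(-ln(2) * t / t_half)."""
    return a0 * math.exp(-math.log(2) * t_days / half_life_days)

# Approximate Co-56 half-life in days (illustrative value).
CO56_HALF_LIFE = 77.3

a0 = 1.0
# After one half-life, half the activity (and gamma-ray brightness) remains;
# after two half-lives, a quarter remains. A supernova light curve that
# follows this schedule implies the decay constant then equaled today's.
one = activity(a0, CO56_HALF_LIFE, CO56_HALF_LIFE)
two = activity(a0, 2 * CO56_HALF_LIFE, CO56_HALF_LIFE)
print(one, two)
```

Because SN1987A's light reaches us from 169,000 light-years away, matching its observed fading against this curve tests the decay constant as it was 169,000 years ago.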
The Oklo reactor was the site of a natural nuclear reaction 1,800 million years ago. The fine-structure constant affects neutron capture rates, which can be measured from the reactor's products. These measurements show no detectable change in the fine-structure constant or in neutron capture rates over almost two billion years (Fujii et al. 2000; Shlyakhter 1976)."
http://www.talkorigins.org/indexcc/CF/CF210.html
To change the decay rates, you would have to change the fundamental laws of physics.