Quite the opposite, in fact. First, it is very hard to change radioactive decay rates significantly under conditions that could occur naturally on Earth (except possibly for Be-7, which is not used for radiometric dating). Second, consider two types of decay: beta decay and electron capture. Anything that speeds up beta decay significantly, such as a high level of ionization, should slow down electron capture, and vice versa, while probably having little or no effect on alpha decay.
So, suppose the true age of the Earth is 6,000 years and the measured value of 4 billion years is an environmentally induced deviation from that true age; the measurement is then a factor of 4 × 10^9 / 6,000 ≈ 6.7 × 10^5 too high. If we further assume that, where one particular kind of decay gives a value that many times too high, another gives a value equally many times too low, we ought to observe apparent ages of the Earth in the region of 6,000 / (6.7 × 10^5) ≈ 0.009 years.
While these are ridiculously naïve values with no scientific meaning whatsoever, perhaps they serve to illustrate just how far observed physics would have to deviate in order to agree with a young Earth: our observations would have to be out by a factor of about 10^5.82.
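For anyone who wants to check the arithmetic, here is a minimal Python sketch of the calculation above. The 4-billion-year and 6,000-year figures are just the assumptions from the post, not measurements of anything:

```python
import math

# Back-of-the-envelope figures assumed in the post
measured_age_yr = 4e9   # conventional radiometric age of the Earth
claimed_age_yr = 6e3    # young-earth figure

# Factor by which the radiometric age exceeds the young-earth age
factor = measured_age_yr / claimed_age_yr          # ~6.7e5
print(f"deviation factor: {factor:.3g} = 10^{math.log10(factor):.2f}")

# If one decay mode overshot by this factor, a mode deviating equally far
# in the opposite direction would give an apparent age of
low_age_yr = claimed_age_yr / factor               # ~0.009 years
print(f"apparent low-end age: {low_age_yr:.3g} years (~{low_age_yr * 365:.1f} days)")
```

Running it prints a deviation factor of about 6.7 × 10^5 (10^5.82) and an apparent low-end age of about 0.009 years, roughly three days.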