How do you know that method works?
Over a limited length of time, it can be checked against completely independent methods, like ice core dating.
Over a much longer period of time, it can be checked against other radiometric methods: if radiometric dating didn't work, there would be no reason for different radiometric methods to agree with one another, but they do.
Have you or others been here long enough to experience that the isotopes of atoms break down at a constant rate?
It doesn't take that long if you have a lot of atoms. The point is that over a given length of time (the half-life), half the atoms will decay. If you have billions of atoms, it takes only a tiny fraction of that time before a measurable number of them have decayed. It's therefore possible to measure the radioactivity of a source over, say, a week, and show that the rate of decay is constant.
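To make that concrete, here's a rough back-of-the-envelope sketch in Python. The half-life is the published value for uranium-238; the one-gram sample is just an illustration I picked:

```python
import math

AVOGADRO = 6.022e23        # atoms per mole
SECONDS_PER_YEAR = 3.156e7

half_life_years = 4.468e9  # published half-life of uranium-238
grams = 1.0                # hypothetical one-gram sample
molar_mass = 238.0         # g/mol for U-238

n_atoms = grams / molar_mass * AVOGADRO
decay_const = math.log(2) / (half_life_years * SECONDS_PER_YEAR)  # per second

activity = decay_const * n_atoms            # decays per second
decays_per_week = activity * 7 * 24 * 3600

print(f"{n_atoms:.2e} atoms; activity = {activity:,.0f} decays/s")
print(f"decays in one week: {decays_per_week:.2e}")
# Roughly 2.5e21 atoms, ~12,000 decays per second, ~7.5e9 decays in a week:
# billions of events to count, even though the half-life is 4.5 billion years.
```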
Scientists, of course, haven't just done this for a week; they've done it for all different kinds of radioactive materials, for a combined total of ages and ages.
If scientists have done tests, how do you know the rate doesn't change at a rate that we cannot experience? Like, if the rate doubles in 1000 years, then doubles in 500 years?
Well, firstly, if the rates of some or all radioactive isotopes had been different ages ago, we would expect wild disagreement between the various methods of radiometric dating; there should be no correlation at all. First imagine that the rates of all isotopes changed in proportion: then every method would shift by the same amount and agree on the same wrong age, and those wrong ages would collide with independent checks like ice cores and the supernova below. But a proportional change is implausible anyway, because different isotopes decay by different mechanisms; a real change would hit them unevenly, so some methods would give older dates, some newer. And if you suppose that different isotopes' rates changed differently in just the right way, that might work for one rock sample, but then all of the other rock samples would just go crazy.
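If you want to see the arithmetic, here's a toy Python calculation. All the numbers (the rock's age, the speed-up factors, the "fast" interval) are made up purely for illustration; only the two half-lives are published values:

```python
import math

def apparent_age(fraction_remaining, half_life):
    """Age inferred from the parent fraction, ASSUMING the rate was always constant."""
    lam = math.log(2) / half_life
    return -math.log(fraction_remaining) / lam

true_age = 1.0e9        # years; hypothetical rock
fast_interval = 1.0e8   # years during which decay supposedly ran faster

# Made-up speed-up factors: the change hits the two isotopes differently.
scenarios = [("K-40", 1.25e9, 5.0), ("U-238", 4.468e9, 2.0)]

for name, half_life, boost in scenarios:
    lam = math.log(2) / half_life
    # Total decay accumulated over the uneven history:
    exponent = lam * (boost * fast_interval + (true_age - fast_interval))
    remaining = math.exp(-exponent)
    print(f"{name}: apparent age = {apparent_age(remaining, half_life):.2e} yr")

# Output: K-40 says ~1.4e9 yr, U-238 says ~1.1e9 yr; the two methods no
# longer agree on the same rock. (If both boosts were identical, the methods
# would agree on the same *wrong* age instead, which the independent checks
# above would catch.)
```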
Secondly, to assume something changed without evidence is a big no-no. You don't think that the laws of physics might change tomorrow, do you? Why not? Because they've always been the same. Well, decay rates have always been the same for as long as we can tell, so we can be pretty certain they've been the same even longer.
Thirdly, about 30 years ago scientists observed a supernova called SN1987A. We received gamma rays from the explosion which carried the signature of cobalt decay. I won't bore you with the details, but from the way the gamma rays we received changed over time, scientists calculated that they were seeing the same decay rate as we observe here on Earth. The supernova actually exploded 168,000 years ago (its light took that long to reach us), so we know that radiometric dating is good for at least that long.
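For a feel of how that inference works, here's a sketch with synthetic data. The half-life is the laboratory value for cobalt-56; the brightness samples are invented, since the real analysis used actual telescope data and is far more involved:

```python
import math

CO56_HALF_LIFE_DAYS = 77.2  # laboratory half-life of cobalt-56

# Synthetic brightness samples from a supernova's radioactive tail,
# generated here purely for illustration:
days = [200, 250, 300, 350, 400]
luminosity = [math.exp(-math.log(2) * t / CO56_HALF_LIFE_DAYS) for t in days]

# Recover the half-life from the slope of log-luminosity versus time:
n = len(days)
mean_t = sum(days) / n
mean_log = sum(math.log(lum) for lum in luminosity) / n
slope = sum((t - mean_t) * (math.log(lum) - mean_log)
            for t, lum in zip(days, luminosity)) / sum((t - mean_t) ** 2 for t in days)
print(f"inferred half-life: {-math.log(2) / slope:.1f} days")  # ~77.2
```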
Finally, radioactive decay produces heat. If you speed up the rate of decay, you get more heat; simple. (This, by the way, is why a nuclear power plant works.) What happens if you squash all the decay that we think occurred over billions of years into 10,000 years? Well, you get roast Adam and poached Eve. (In fact, according to Meert, it would be enough energy to melt the entire earth.)
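The scale of the problem is easy to estimate. A back-of-the-envelope sketch, where 20 TW is a commonly cited rough estimate of Earth's present-day radiogenic heat and every figure is order-of-magnitude only:

```python
# Squeeze ~4.5 billion years of decay into 10,000 years and look at the heat.
radiogenic_power_now = 20e12   # watts; rough present-day radiogenic output
earth_age_years = 4.5e9
compressed_years = 1.0e4

speedup = earth_age_years / compressed_years        # ~450,000x
compressed_power = radiogenic_power_now * speedup   # ~9e18 W

sunlight_absorbed = 1.2e17  # watts; rough absorbed solar power, whole Earth

print(f"decay sped up by a factor of {speedup:,.0f}")
print(f"heat output: {compressed_power:.1e} W, about "
      f"{compressed_power / sunlight_absorbed:.0f}x all the sunlight Earth absorbs")
```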
Conclusion? If decay rates had been different previously, we'd A) not be able to get radiometric dates at all, B) be extremely confused, C) be even more confused, and D) not be here at all.