I hear that radiometric dating is based on assumptions and therefore can't be counted as correct. But there are good assumptions and bad assumptions. For example: ice is cold, the sun is hot, therefore there is no ice in the center of the sun. Scientists don't have a magical ship to take them there and check, but that's still what is taught in the textbooks.
The half-lives of radioactive isotopes have always been observed to be constant, so it is assumed that they have always been constant.
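To make "constant half-life" concrete, here's a minimal sketch of the decay law in Python; the uranium-238 half-life of roughly 4.47 billion years is the only number assumed:

```python
# N(t) = N0 * (1/2)**(t / half_life): the fraction of parent atoms
# remaining depends only on elapsed time, not on how many you
# started with -- that is what a constant half-life means.
def fraction_remaining(t, half_life):
    return 0.5 ** (t / half_life)

# U-238 has a half-life of about 4.47 billion years:
print(fraction_remaining(4.47e9, 4.47e9))      # 0.5  -- one half-life
print(fraction_remaining(2 * 4.47e9, 4.47e9))  # 0.25 -- two half-lives
```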
With rock dating, the parent isotope is measured in ratio to the daughter isotope. This gives an estimate of how long the rock has existed. To avoid error, the ratio is compared across different parts of the rock and across surrounding rocks to make sure the sample wasn't contaminated.
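Here's a minimal sketch of that age calculation in Python, assuming a closed system with no daughter isotope present at formation; the rubidium-87 -> strontium-87 pair and the 1:50 ratio are just illustrative numbers:

```python
import math

def age_from_ratio(daughter_to_parent, half_life):
    # P(t) = P0 * (1/2)**(t / half_life) and D(t) = P0 - P(t),
    # so D/P = 2**(t / half_life) - 1, which solves to:
    return half_life * math.log2(1 + daughter_to_parent)

# Hypothetical sample using the Rb-87 -> Sr-87 pair
# (half-life about 48.8 billion years): one daughter atom
# measured for every 50 parent atoms.
print(age_from_ratio(1 / 50, 48.8e9))  # ~1.39 billion years
```

The "no initial daughter, closed system" assumptions baked into this formula are exactly what the cross-checks against other parts of the rock are meant to test.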
These are just a few of the assumptions that radiometric dating makes. I'm just wondering something...
What specific assumption does radiometric dating make that is so bad?