Why should we believe that the same institutions that still can't figure out where the Earth's oceans came from (among countless other features and phenomena) have also irrefutably demonstrated that the Earth is 4.5 billion years old?
1. Nothing is "irrefutable", but that said, radiometric dating shows consistent ages across a variety of independent methods (see the sketch after this list).
2. There is almost no rational universe in which a YOUNG earth could have the FEATURES we see on earth.
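Here's a minimal sketch of that consistency check, in Python. The half-lives are standard values; the daughter/parent ratios are hypothetical numbers chosen for illustration (real work uses measured ratios and isochron methods). The point is simply that independent decay systems with wildly different half-lives converge on the same age:

```python
import math

# Age from a daughter/parent ratio: t = ln(1 + D/P) / lambda,
# where lambda = ln(2) / half_life. Assumes no initial daughter and a
# closed system; isochron methods relax both assumptions. The K-Ar
# branching ratio is ignored here for simplicity.
def age_years(daughter_per_parent, half_life_years):
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_per_parent) / decay_constant

# Hypothetical D/P ratios, chosen so each system reads ~4.5 Gyr.
systems = {
    "U-238 -> Pb-206 (half-life 4.47 Gyr)": (1.010, 4.47e9),
    "K-40  -> Ar-40  (half-life 1.25 Gyr)": (11.19, 1.25e9),
    "Rb-87 -> Sr-87  (half-life 48.8 Gyr)": (0.0660, 48.8e9),
}

for name, (ratio, half_life) in systems.items():
    print(f"{name}: {age_years(ratio, half_life) / 1e9:.2f} Gyr")
```

If decay rates had drifted over time, there is no reason three unrelated decay systems spanning half-lives from about 1 to about 50 billion years should keep agreeing with one another, yet in practice they do.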
Only recently has the geologic community rejected uniformitarianism as an unjustified limitation on scientific inquiry into Earth's history, as it constrains past geologic rates and conditions to those of the present.
Do you mean 'recently' in a geologic sense? Because uniformitarianism was first proposed well over 200 years ago.
Are you trying to tell us that you don't know much geology? Yes, uniformitarianism does hold sway most of the time, but there ARE known examples of catastrophes (like the Channeled Scablands in Washington State, or various volcanic eruptions). Geologists infer catastrophe where the evidence demands it and gradual processes where it doesn't.
We know a LOT about the structures we see in rocks because we see them forming in soft sediments in real time today. We see how ripple marks are made at the edges of modern oceans, and we see ripple marks in the rock record. Why assume there is some difference in how the rock-based ones got there? We know how long it takes for clay-sized particles to settle in calm water, so why assume that shales represent anything but long stretches of calm depositional environments?
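To put a rough number on that settling time, here is a back-of-the-envelope Stokes'-law sketch in Python. The particle size, water depth, and fluid properties are illustrative assumptions, not measurements from any particular formation:

```python
# Stokes' law terminal settling velocity for a small sphere:
#   v = (2/9) * (rho_particle - rho_fluid) * g * r^2 / viscosity
# Valid at the very low Reynolds numbers typical of clay-sized grains.
g = 9.81                # m/s^2
rho_particle = 2650.0   # kg/m^3, typical clay/quartz grain density (assumed)
rho_fluid = 1000.0      # kg/m^3, fresh water
viscosity = 1.0e-3      # Pa*s, water at ~20 C
radius = 1.0e-6         # m, a 2-micron-diameter clay particle (assumed)

v = (2.0 / 9.0) * (rho_particle - rho_fluid) * g * radius**2 / viscosity
depth = 10.0            # m, an assumed still-water column

print(f"settling velocity: {v:.2e} m/s")                  # ~3.6e-6 m/s
print(f"time to fall {depth:.0f} m: {depth / v / 86400:.0f} days")  # ~32 days
```

Roughly a month for a single clay particle to fall through ten meters of calm water; a shale hundreds of meters thick, built from countless such settling events, is hard to square with a brief, violent depositional episode.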
So why can't today's evolutionists and old-earth believers accept the possibility that radioactive decay rates may have been different in the past as well? Why must the radioactive uniformitarian assumption be exempt from questioning? Why is this particular uniformitarian assumption regarded as a sanctified truth of the universe?
We know of vanishingly few things that can alter the rate of radioactive decay. The last report I heard of that even marginally looked like a change in decay rate was an article (not yet peer reviewed at the time, as I recall) from about a decade ago, claiming a neutrino-flux alteration of the decay rate of a rare silicon isotope. But the claimed effect tracked the seasonal variation in solar neutrino flux, so it evened out over the course of a year, if I recall correctly. I know of no other means of altering the rate of radioactive decay on earth. I could be wrong, but without any evidence for such a change, why hypothesize it? Unless one has some INVESTED REASON for a young earth?
And if the rate had been DRAMATICALLY different, the heat released would have fried all the life on the surface of the earth, given the amount of decay we see recorded in the rocks today.
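The scale of that heating problem is easy to estimate, at least crudely. A Python sketch, assuming a commonly cited ~20 TW for Earth's present radiogenic heat output and a 6,000-year young-earth timescale (both round illustrative figures), and simply time-averaging the decay budget:

```python
# If ~4.5 Gyr worth of radioactive decays were squeezed into ~6,000
# years, the time-averaged radiogenic power scales up by the ratio of
# the timescales. This ignores the exponential decline of decay heat,
# which only makes the early pulse worse.
present_radiogenic_watts = 20e12   # ~20 TW, assumed round figure
old_timescale_yr = 4.5e9
young_timescale_yr = 6.0e3

speedup = old_timescale_yr / young_timescale_yr
accelerated_watts = present_radiogenic_watts * speedup

print(f"speedup factor: {speedup:,.0f}x")                 # 750,000x
print(f"implied heat output: {accelerated_watts:.1e} W")  # ~1.5e19 W
```

Roughly 1.5e19 W is about a hundred times the total sunlight Earth intercepts (~1.7e17 W). That is sterilizing heat, exactly as stated above.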
(Substantive responses only, please. If all you have is emotional hand-waving or a snide, flippant comment, then please just move along.)
Here's the biggest question for YOU: the structures we see on earth indicate massive spans of time (e.g., the Green River varves) and point to an OLD EARTH, far older than any Young Earth timeline; radiometric dating ALSO shows an old earth, on timescales that agree; and we have no reason to assume an altered rate of decay of radioactive materials. So why propose one?
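The varve arithmetic alone makes the point. A tiny Python sketch, where the layer count is an assumed round figure of the sort commonly cited for the Green River Formation, not a measurement:

```python
# Varve couplets (one light/dark pair per year) read directly as years.
varve_couplets = 6_000_000      # assumed illustrative count
young_earth_budget_yr = 6_000   # a typical young-earth timescale

print(f"at 1 couplet/year: {varve_couplets:,} years of deposition")
print(f"rate needed on a young-earth timeline: "
      f"{varve_couplets / young_earth_budget_yr:,.0f} couplets per year")
```

A thousand couplets per year, every year, is not what we observe forming in any modern lake; one per year is.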
The ONLY reason is to justify a YOUNG EARTH, and the ONLY reason the Earth has to be young is to satisfy a small subset of Christian theologians/believers.
That is not a sufficiently compelling reason to toss all we know about physics, chemistry and radioactive decay.