I think I'll just drop it. I'm not going to spend an inordinately long time wondering (and asking) how they dated anything (let alone some temple, for Pete's sake) beyond 6100 years.
I'm already confused, and have been for 54 years.
How on earth material that expanded out of a pixel 13.73 billion years ago shows an age of just 11,000 years is beyond me.
I'm sure the atoms in that temple are much, much older than your 11,000 years.
Do you mind if I try to enlighten you anyway?
(And if you do, I'm still doing it for the lurkers, and because I enjoy explaining things)
Radioactive isotopes have been decaying ever since they came into existence.
(That may itself be much shorter than the age of matter, because the heavier elements like carbon, potassium or uranium didn't form when matter - or even atoms - first appeared. Heavier elements need something like the nuclear fusion going on in stars to make them from lighter ones. The heaviest need very large stars or even supernovae.)
However, radiometric dating methods
don't measure the age of the isotopes themselves. They measure the time since a certain rock containing some radioisotope came into being
as an object, or (in the case of [sup]14[/sup]C), the time since an organism died. How so?
Let's take
potassium-40 as the first example. [sup]40[/sup]K decays to [sup]40[/sup]Ar, and it can be used to date when a rock crystallised from magma. It doesn't matter how long [sup]40[/sup]K had been in existence, or decaying, before that happened, because crystals only include K and not Ar in their structure (in this case, argon, being an inert gas, simply diffuses away while the rock is molten).
Therefore, when the rock finally solidifies, it's as if all the previous history of [sup]40[/sup]K hadn't happened. Our new rock contains only [sup]40[/sup]K, no argon. Time = 0. But now the rock is solid, so when some radioactive potassium decays, the argon can no longer escape (even though it still doesn't want to be part of the crystal structure). It accumulates. And knowing the half-life of [sup]40[/sup]K, the ratio of [sup]40[/sup]K to argon can give us a fairly good estimate of how much time has passed since the rock was formed. (Of course the accuracy of the estimate depends on the time scale involved - [sup]40[/sup]K has a half-life of about 1.25 billion years, so it's pretty useless for very young rocks.)
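To put some numbers on it, here's a quick Python sketch of the underlying math (the function name is mine, and it uses a deliberate simplification: it pretends every decayed [sup]40[/sup]K atom became [sup]40[/sup]Ar, whereas real K-Ar dating corrects for the roughly 89% of decays that go to [sup]40[/sup]Ca instead):

```python
import math

def decay_age(parent, daughter, half_life):
    """Time elapsed, given the remaining parent amount and the
    accumulated daughter amount (same units for both).

    Simplification: assumes every decayed parent atom became the
    measured daughter. Real K-Ar dating must correct for the ~89%
    of 40K decays that yield 40Ca rather than 40Ar.
    """
    decay_constant = math.log(2) / half_life
    # Originally there were (parent + daughter) parent atoms, and
    # N(t) = N0 * exp(-lambda * t), so:
    return math.log(1 + daughter / parent) / decay_constant

# A rock holding equal amounts of 40K and radiogenic 40Ar would be
# exactly one half-life old under this simplification:
age = decay_age(parent=1.0, daughter=1.0, half_life=1.25e9)  # years
print(round(age / 1e9, 2))  # -> 1.25
```

The point the code makes is the same one as the text: only the parent/daughter ratio *inside the solid rock* matters, not anything the potassium did before crystallisation.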
Carbon dating is a slightly different matter.
In the case of [sup]14[/sup]C, the isotope is constantly produced in the upper atmosphere by high-energy cosmic rays (which blast atoms apart; the resulting free neutrons then hit nitrogen-14 nuclei, knocking out a proton and leaving [sup]14[/sup]C behind). And as any good carbon does, carbon-14 reacts with oxygen and diffuses throughout the atmosphere as carbon dioxide.
While an organism lives, it keeps taking up [sup]14[/sup]C - plants directly from CO[sub]2[/sub], animals indirectly through eating plants (or plant-eaters). So while something lives, the ratio of different carbon isotopes in its body is roughly the same as it is in the atmosphere. Once the organism dies, it rather obviously stops eating or photosynthesising, so nothing replaces the decayed carbon-14. Then, either by measuring the amount of radioactivity per unit mass of carbon or by measuring the ratio of [sup]14[/sup]C to stable carbon isotopes directly, you can find out how much time has passed since death. Carbon dating doesn't work on anything older than a few tens of thousands of years, since carbon-14 has a half-life of only 5730 years. After a (geologically speaking) very short time, virtually all of it is gone from any dead organic stuff.
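The same half-life arithmetic in Python form (the function name is mine; this is a minimal sketch that ignores real-world calibration against atmospheric [sup]14[/sup]C fluctuations):

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def radiocarbon_age(fraction_remaining):
    """Years since death, given the sample's 14C level measured as a
    fraction of the living (atmospheric) level.

    Sketch only: real labs calibrate against records of past
    atmospheric 14C levels, which were not perfectly constant.
    """
    return -HALF_LIFE_C14 / math.log(2) * math.log(fraction_remaining)

# A sample retaining 25% of the atmospheric 14C ratio died two
# half-lives ago:
print(round(radiocarbon_age(0.25)))  # -> 11460

# And why it fails on very old material: after 50,000 years only
# about 0.2% of the original 14C is left, far too little to measure
# reliably.
print(round(0.5 ** (50000 / HALF_LIFE_C14), 4))
```

Notice that a reading of 25% already gives about 11,460 years - which is roughly the scale of the date that started this whole thread.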
(BTW, the Wikipedia page on radiocarbon dating is a good source of further information.)
As you can hopefully see,
it all has nothing to do with the age of the universe or matter, because (for a number of different reasons) radioisotopes in the objects dated were "set" to t = 0 at some well-defined point in the past. The history of the isotopes before that time is completely irrelevant to dating.