What are they? What lengths of time are they accurate to?
There is no "radiation" in the rocks. You probably mean starting daughter isotopes. Isochron dating makes no assumptions about starting daughter isotopes and can be used to determine if there were any to begin with.How do scientists deside how much radiation was in the rocks to begin with, so they can decide percentage of rate of decay?
All evidence infers this, including examination of distant stars.How do they know decay is a constant rate over billions of years?
Only under extreme physical conditions is there any change in the decay rate.How do they know nothing has interfeared with the rate of decay over the billions of years?
2% accuracy still gets you into the tens of millions of years out, and I would like to know how they verify the acuracy?
What sounds sketchy? You've barely looked at any details.Sounds a bit sketchy to me.
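To make the isochron point concrete, here is a rough sketch in Python with made-up Rb-Sr numbers; only the 87Rb decay constant (about 1.42e-11 per year) is a real value, and the age, initial ratio, and sample ratios are invented for illustration. Minerals from the same rock start with the same 87Sr/86Sr but different 87Rb/86Sr, so fitting a straight line to the present-day ratios recovers the age from the slope and the initial 87Sr/86Sr from the intercept, without assuming the starting daughter content in advance.

    import math

    LAMBDA_RB87 = 1.42e-11   # decay constant of 87Rb, per year (half-life ~49 billion years)
    TRUE_AGE = 1.0e9         # hypothetical age of the rock, in years
    INITIAL_SR = 0.705       # hypothetical initial 87Sr/86Sr of the melt

    # Hypothetical present-day measurements of four minerals from one rock:
    # same initial 87Sr/86Sr, different 87Rb/86Sr.
    rb_sr = [0.5, 1.0, 2.0, 4.0]                      # 87Rb/86Sr
    growth = math.exp(LAMBDA_RB87 * TRUE_AGE) - 1.0
    sr_sr = [INITIAL_SR + r * growth for r in rb_sr]  # 87Sr/86Sr measured today

    # Fit a straight line through the points; nothing here assumes INITIAL_SR.
    n = len(rb_sr)
    mx, my = sum(rb_sr) / n, sum(sr_sr) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(rb_sr, sr_sr)) / sum((x - mx) ** 2 for x in rb_sr)
    intercept = my - slope * mx

    print("age from slope:", math.log(1.0 + slope) / LAMBDA_RB87)     # ~1.0e9 years
    print("initial 87Sr/86Sr from intercept:", round(intercept, 4))   # ~0.705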
How do scientists decide how much radiation was in the rocks to begin with, so they can work out the percentage of decay?
As Split Rock said, there is no "radiation" in the rocks. What can throw off age calculations are excess daughter isotopes (decay products) that were not produced by the original parent isotopes in the rock. The effects of this are minimized by choosing minerals to analyze that don't incorporate daughter isotopes when they form. For example, I work with zircon uranium-lead dating. When they form, zircons incorporate uranium into their crystal structure, but not lead, so it can be assumed that all lead in the zircon was produced by the decay of uranium. You can get inclusions of other minerals within zircons, which may contain lead, so before we do any analysis, we select zircons that are clear (not cloudy from mineral inclusions) and that don't have inherited cores from older zircons that were re-incorporated into the melt. This can easily be seen under a microscope, and it ensures that the dates we get are as accurate as possible.
How do they know decay is a constant rate over billions of years?
Many different methods, all with different decay constants, are independently consistent with each other. If there were variation in the decay constants, no dating methods would agree. Moreover, they can be cross-verified against methods such as ice core dating and varves that do not rely on radioactive decay. These methods are consistent as well.
How do they know nothing has interfered with the rate of decay over the billions of years?
See above.
2% accuracy still puts you tens of millions of years out, and I would like to know how they verify the accuracy? Sounds a bit sketchy to me.
In most cases, the accuracy is better than 2%, though it really depends on the method. For example, modern uranium-lead dating often has an uncertainty of less than 0.5%. In practice, not all numbers are perfect, which is why we report the uncertainty: it allows others to judge the validity of our data and conclusions.
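To put rough numbers on the uranium-lead example, here is a small sketch; the 238U decay constant is the standard value, but the measured 206Pb/238U ratio is made up, and a real lab workflow involves more corrections than this. It also shows what the quoted percentage uncertainties mean in absolute years.

    import math

    LAMBDA_U238 = 1.55125e-10   # decay constant of 238U, per year
    print("238U half-life:", math.log(2) / LAMBDA_U238, "years")   # ~4.47e9 years

    # Hypothetical measured ratio of radiogenic 206Pb to remaining 238U in a zircon.
    pb_u = 0.18
    age = math.log(1.0 + pb_u) / LAMBDA_U238
    print("206Pb/238U age:", age, "years")              # ~1.07e9 years

    # What percentage uncertainties mean in absolute terms for this age:
    print("0.5% of that age:", 0.005 * age, "years")    # ~5.3 million years
    print("2% of that age:", 0.02 * age, "years")       # ~21 million years, nowhere near 6,000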
There is no "radiation" in the rocks. You probably mean starting daughter isotopes. Isochron dating makes no assumptions about starting daughter isotopes and can be used to determine if there were any to begin with.
All evidence infers this, including examination of distant stars.
Only under extreme physical conditions is there any change in the decay rate.
The accuracy is determined by replication, and the technique used. Also, 2% error doesn't get you anywhere near to 6,000 years.
What sounds sketchy? You've barely looked at any details.
I thought IDers had no problem with Deep Time, since they weren't creationists (ha, ha).
No. It is not.
Hey, brother. Be careful. There IS radiation in the rock.
"Don't put a pegmatite in your shirt pocket for long" is a joke, but it has some truth to it, to say nothing of a uranium ore. (I gave my hot U-rich samples to the physics department a long time ago. They don't seem to mind; they enjoy the clicking sound. But I do.)
There is no "radiation" in the rocks. You probably mean starting daughter isotopes. Isochron dating makes no assumptions about starting daughter isotopes and can be used to determine if there were any to begin with.
All evidence infers this, including examination of distant stars.
Only under extreme physical conditions is there any change in the decay rate.
The accuracy is determined by replication, and the technique used. Also, 2% error doesn't get you anywhere near to 6,000 years.
What sounds sketchy? You've barely looked at any details.
I thought IDers had no problem with Deep Time, since they weren't creationists (ha, ha).
There may be radioactive isotopes in the rock, but there isn't radiation sitting inside the rock like a mineral. Radiation may come out of the rock, but it doesn't reside in the rock.
How do scientists decide how much radiation was in the rocks to begin with, so they can work out the percentage of decay?
Isochron dating makes no assumptions about starting daughter isotopes and can be used to determine if there were any to begin with.
False.
"All forms of isochron dating assume that the source of the rock or rocks contained unknown amounts of both radiogenic and non-radiogenic isotopes of the daughter element, along with some amount of the parent nuclide."