DerelictJunction
Guest
I'm sorry. I must have missed the part where you addressed the evidence that would be left behind if the rates of decay had been significantly different than they are today. Rates of radioisotope decay are what we are discussing, not your philosophy of science. Discovery of evidence is science's means of lowering uncertainty. I never said we should be agnostic about things that happened in the past. Yesterday was my wife's birthday and I bought her a cake. There's nothing to be agnostic about; I was there and physically witnessed it.
What you are claiming, however, is that because you have observed the behavior of U-238 for some 50 years, you can say how it has behaved over the last 4-6 billion years. So a 0.00000125% sample tells you everything you need to know about something? Wow.
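That sample-fraction figure is easy to check. A few lines of Python, using the low end (4 billion years) of the range quoted above:

```python
# Fraction of U-238's claimed history that has been directly observed.
observation_years = 50             # rough span of modern decay-rate measurements
claimed_age_years = 4_000_000_000  # low end of the 4-6 billion year range

fraction_percent = observation_years / claimed_age_years * 100
print(f"{fraction_percent:.8f}%")  # 0.00000125%
```

Using the 6-billion-year end of the range only shrinks the fraction further.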
However, let's take your argument at face value and see where it leads us. You think that the past is a good guide to the future. Great. Has science ever been wrong in the past? Yes? Then science will also be wrong about things in the future, provided your assumption holds true.
You start talking about pork chops in the freezer. Apparently your solution is to pretend that you know things you don't. My solution is called Decision Theory. You should look into it.
Now, having dispensed with your initial claims, I want to get to the heart of your argument. The foundation of your argument is that evidence is both important and necessary to make a positive claim. Why do you think so?
Since we've already established that no amount of evidence can prove a theory true, or even provide probable support, then what's the point of requiring evidence? What you should require, assuming that you are a pro-science kind of guy, is that the theory be testable. If I said, for example, that a certain spinning magnet will always cause cold fusion if paired with certain types of metal and heavy water, then you could test that theory. The theory is probably wrong, by the way, since I just made it up.
To go one further, your claim that evidence is necessary is a claim made without evidence to back it up! Unlike scientific theories, it's not even testable, as far as I'm aware. What experiment could I do to determine whether evidence is necessary to make a positive claim?
All you are really saying is that you have an a priori philosophical bias in favor of evidenced claims. Good for you. Not everyone does. Why should I, for example, adopt this philosophical bias?
In this case we can lower the uncertainty regarding possible changes in decay rates if we can determine what changes in materials occur as decay rates increase. I think the key is the energy given off by the process.
To get you up to speed, decay products are typically moving at a high speed and are highly ionized. In order to slow down, they must transfer their energy to the atoms of their surroundings. That energy transfer manifests itself as heat. Higher rates of decay result in more heat input in a shorter period of time. That raises the temperature.
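The heating mechanism described above scales linearly with the decay rate, since power is just (decays per second) × (energy per decay). A minimal sketch in Python, assuming the textbook U-238 half-life (about 4.468 billion years) and roughly 4.27 MeV released per alpha decay; the one-mole sample size is purely illustrative:

```python
# Decay heating: P = lambda * N * E, linear in the decay constant lambda.
# Sample size and rounding here are illustrative, not measured values.
AVOGADRO = 6.022e23
EV_TO_J = 1.602e-19

half_life_s = 4.468e9 * 3.156e7        # U-238 half-life (~4.468 Gyr) in seconds
decay_const = 0.693 / half_life_s      # lambda = ln(2) / half-life
atoms = AVOGADRO                       # one mole of U-238, for illustration
energy_per_decay_J = 4.27e6 * EV_TO_J  # ~4.27 MeV per alpha decay

power_W = decay_const * atoms * energy_per_decay_J
print(f"heat output: {power_W:.1e} W per mole")  # on the order of 2e-06 W
```

Multiply the decay constant by any factor and the heat output rises by exactly that factor, which is the point made above.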
Significant changes in decay rates result in significant increases in radioactivity.
So: 4,500,000,000 years divided by 800,000 years is 5,625.
The decay rate would have to increase by a factor of 5,625 to give us the evidence we see today, and that is assuming the rate was constant up until this year.
That would mean radiation exposure for humans would be 5,625 times higher than it is today. That is significant, and it is also refuted by the fact that massive numbers of humans have not died of radiation sickness in recorded history.
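The arithmetic behind that factor, and what it implies for exposure, can be sketched in a few lines of Python. The 3 mSv/year background dose is my own order-of-magnitude assumption for typical natural exposure, not a figure from the discussion above:

```python
# Factor by which decay rates must increase to squeeze 4.5 billion years'
# worth of decay products into 800,000 years.
conventional_age_years = 4_500_000_000
compressed_age_years = 800_000

factor = conventional_age_years / compressed_age_years
print(factor)  # 5625.0

# Assumed typical natural background dose today (order of magnitude only).
background_dose_mSv = 3.0
accelerated_dose_mSv = background_dose_mSv * factor
print(f"{accelerated_dose_mSv:.0f} mSv/year")  # 16875 mSv/year
```

For scale, an acute dose of around 5,000 mSv is generally fatal, so a population absorbing nearly 17,000 mSv per year would not have gone unnoticed.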
I'd say that your claim of increased radioactivity levels is on shaky ground.
Maybe you'd like to step up and defend your whole "nothing is certain" philosophy.