Who is to say what the amounts of isotopes are deep inside the Earth? The equilibrium you describe would depend on an assumed isotope inventory to balance the exchange, and I claim a different balance could fit a mathematical model equally well. A good question to explore would be the total thermal energy it would take to warm the Earth from 2.7 K to the temperatures measured today, balanced against the output of radioactive decay. My guess is that there is not enough energy from radioactive decay alone to melt the entire surface of the Earth. I am playing with some numbers but don't claim a result.
I googled the amount of heat produced by radioactivity in granite and basalt:
Granite is 2.6x10[sup]-13[/sup] cal/(g·s)
Basalt is 3.8x10[sup]-14[/sup] cal/(g·s)
In a million years:
a gram of granite will produce 8.3 cal
a gram of basalt will produce 1.2 cal
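A quick sanity check of those per-million-year figures; the rate constants are the googled values above, and the seconds-per-million-years conversion is the only thing added here:

[code]
# Convert the per-second heat production rates into cal per gram
# per million years. Rates are the googled figures quoted above.
SECONDS_PER_MYR = 60 * 60 * 24 * 365.25 * 1_000_000  # ~3.156e13 s

granite_rate = 2.6e-13  # cal/(g*s)
basalt_rate = 3.8e-14   # cal/(g*s)

print(f"granite: {granite_rate * SECONDS_PER_MYR:.1f} cal/g per Myr")  # ~8.2
print(f"basalt:  {basalt_rate * SECONDS_PER_MYR:.1f} cal/g per Myr")   # ~1.2
[/code]

That lands at about 8.2 cal for granite, so the 8.3 figure above is consistent to within rounding.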
The specific heat of
Granite is 0.19 cal/g°C
Basalt is 0.2 cal/g°C
Which means
the 8.3 cal from granite will raise its temperature 43°C
the 1.2 cal from basalt will raise its temperature 6°C
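Same arithmetic in code, using ΔT = Q/c per gram (the 43°C figure is 8.3/0.19, rounded):

[code]
# Temperature rise per million years: dT = Q / c (per gram, so mass cancels)
granite_dT = 8.3 / 0.19  # ~43.7 C per million years
basalt_dT = 1.2 / 0.20   # 6.0 C per million years
print(f"granite: {granite_dT:.0f} C per Myr")
print(f"basalt:  {basalt_dT:.0f} C per Myr")
[/code]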
To look at your question, let's take the highest temperature of the mantle, 4000°C. Raising the temperature to that 4000°C from -270°C is a total rise of 4270°C, which would take granite 99 million years and basalt about 712 million years, assuming as you point out that the mantle had a similar isotopic composition. We do not know the concentration of radioisotopes in the mantle, but let's take the materials we do know: granite and basalt. Granite starts to melt at 1215°C and basalt at 984°C. Assuming the Earth was created at a comfortable 20°C, how many million years' worth of accelerated decay would it take to bring granite and basalt to the temperatures where they start melting?
It would take only 28 million years' worth of decay to start melting the granite, and 161 million years' worth would melt the basalt.
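Here is the whole calculation in one place, under the same assumption that all decay heat is retained (no losses to conduction or radiation):

[code]
# Million years of decay heat needed to go from start_C to target_C,
# assuming every calorie is retained (no conduction/radiation losses).
def myr_to_reach(start_C, target_C, dT_per_myr):
    return (target_C - start_C) / dT_per_myr

# Warming from -270 C to a 4000 C mantle (a 4270 C rise):
print(f"granite: {myr_to_reach(-270, 4000, 43):.0f} Myr")  # ~99
print(f"basalt:  {myr_to_reach(-270, 4000, 6):.0f} Myr")   # ~712

# From a 20 C start to the onset of melting:
print(f"granite melts: {myr_to_reach(20, 1215, 43):.0f} Myr")  # ~28
print(f"basalt melts:  {myr_to_reach(20, 984, 6):.0f} Myr")    # ~161
[/code]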
If we can't compare mantle isotopes with the crust, what we can say is that creationism needs hundreds of millions to billions of years' worth of accelerated decay to account for the hundreds of millions to billions of years read by radiometric dating. But a mere 28 million years' worth of accelerated decay would start to melt granite, and 161 million years' worth would melt basalt. It also means we should not be able to get dates for granites older than 28 million years or basalts older than 161 million years; any more than that and the radiometric clocks would reset. Yet we have basalts 4.28 billion years old in the Canadian Shield.
A regenerating magnetic field needs a dynamo. Attempts to model dynamos operating in the outer core in today's computer simulations don't seem to work; they need extra heat. In some cases a model does produce a dynamo, but only by assuming parameters that are orders of magnitude different from those proposed to exist at this time. As far as I know, no model operates within observed parameters. A residual decaying magnetic field would be the best explanation for the observed decay of the Earth's field, documented since Gauss's measurements in 1835 (roughly 10%).
Except, as I pointed out, the magnetism would have been lost once you went past the Curie temperature. Did Gauss know the temperature at the core was hot enough to degauss a bar magnet? Modeling the Earth's magnetic field may be difficult, but we know a molten core can produce a magnetic field, while it cannot be a bar magnet at those temperatures.