Or at least they were going to be. Then I hit upon this lifesaver:
http://energy.er.usgs.gov/products/databases/CoalQual/index.htm
from which I was able to obtain the median* uranium concentration of USGS coal samples from all the counties named in the RATE report. And guess what the data look like?
*median used because many outliers, hence mean would not be a good measure of center.
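To see why the median is the safer choice here, consider a toy illustration (the concentrations below are made up, not USGS data): one extreme outlier drags the mean way up but barely moves the median.

```python
import statistics

# Hypothetical county uranium concentrations in ppm, with one big outlier
samples = [1.2, 1.5, 1.8, 2.0, 2.1, 45.0]

print(statistics.mean(samples))    # dragged up to ~8.9 by the outlier
print(statistics.median(samples))  # 1.9 -- barely affected
```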
There isn't any clear correlation (the scatterplot is messy), but that's only to be expected, since I can't pinpoint which data point corresponds to the RATE team's coal sample. One point worth noting, though, is Union County, Kentucky, which happens to have both the highest median uranium concentration and the highest pmc. Coincidence? I doubt it.
To put things into perspective: in 1 g of coal, 1 ppm of uranium translates into 1 microgram of uranium ... or 2.53 x 10^15 uranium atoms. That's a lot, especially since 0.4 pmc (say) translates into a mere 2.4 x 10^8 C-14 atoms in that same gram of carbon.
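You can check that arithmetic yourself. The sketch below assumes a molar mass of ~238 g/mol for uranium and a modern 14C/C ratio of roughly 1.2 x 10^-12 (the standard ballpark figure), so 0.4 pmc means 0.4% of that:

```python
AVOGADRO = 6.022e23

# 1 ppm U in 1 g of coal = 1 microgram of uranium
u_atoms = 1e-6 / 238.03 * AVOGADRO  # U-238 molar mass ~238 g/mol

# Atoms of carbon in 1 g, then scale by modern 14C/C ratio and 0.4 pmc
c_atoms = 1.0 / 12.011 * AVOGADRO
c14_atoms = c_atoms * 1.2e-12 * 0.004  # 0.4 pmc = 0.4% of modern

print(f"U atoms:    {u_atoms:.2e}")    # ~2.53e15
print(f"C-14 atoms: {c14_atoms:.2e}")  # ~2.4e8
print(f"U outnumbers C-14 by ~{u_atoms / c14_atoms:.1e} to 1")
```

So the uranium atoms outnumber the C-14 atoms by roughly ten million to one.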
And that completely ignores the presence of uranium decay products, which are themselves radioactive too.
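To get a feel for what the decay products add: in secular equilibrium, every radioactive daughter in the U-238 chain decays at roughly the same rate as the U-238 itself, so each one multiplies the total activity. A rough sketch (using the standard U-238 half-life of ~4.47 billion years and the atom count from above):

```python
import math

AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

n_u238 = 1e-6 / 238.03 * AVOGADRO           # atoms in 1 microgram of U-238
half_life_s = 4.468e9 * SECONDS_PER_YEAR    # U-238 half-life in seconds

# Activity = (decay constant) x (number of atoms), in decays per second (Bq)
activity = math.log(2) / half_life_s * n_u238
print(f"U-238 alone: {activity:.4f} Bq")

# The U-238 chain has roughly 14 radioactive daughters; in secular
# equilibrium each contributes about the same activity again.
print(f"with daughters: ~{activity * 14:.2f} Bq")
```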
(If you're wondering why you've never heard that coal is radioactive, it's because you haven't been listening:
http://www.ornl.gov/info/ornlreview/rev26-34/text/colmain.html ,
http://greenwood.cr.usgs.gov/energy/factshts/163-97/FS-163-97.html )
Also, this:
http://www.vanderbilt.edu/radsafe/9905/msg00149.html is a thread discussing (in terms of technical geochemistry - you have been warned) why uranium would tend to be associated with coal.
And by the way, here's an obvious oversimplification from the RATE report:
The uniformitarian approach for interpreting the 14C data assumes a constant 14C production rate and a constant biospheric carbon inventory extrapolated into the indefinite past. (accompanying a graph)
This is clearly wrong; it completely ignores the existence of independent calibration methods whose results agree with carbon dating.