[Edited to add everything down to the asterisk.] T cited a sentence:
The fact that the non-HCN subset of Jones has the same trend as HCN suggests that the time of observation bias has been properly treated in the latter. (ibid, emphasis added)
This may not be true.

“The NCDC data show a regional decline in temperature by 0.1°C, whereas the USHCN data shows an increase of 0.4°C. The USHCN results are consistent with the Intergovernmental Panel on Climate Change (2001) and Karl et al. (1996) for New England, which strongly suggests that the region has warmed to an even larger extent than that documented by the New England Regional Assessment Group (2001) who used the NCDC climate divisional data to analyze statewide and regional trends from 1895-1999. Even at the climate divisional level, the USHCN pattern is more geographically cohesive in that no division has cooled over the period of record, and the region of significant warming are all contiguous divisions in the southeastern portion of the study region (Figure 1). This seems much more logical than the NCDC data pattern where adjacent divisions have significant trends, but in opposing directions, e.g., MA-1 and MA-2.” Barry D. Keim, et al., “Are there spurious temperature trends in the United States Climate Division database?”
http://www.ccrc.sr.unh.edu/~cpw/papers/Keim_GRL2003.pdf
And if you understand the homogeneity correction which is used, it basically changes the trend to match the approved trend. That was stated in the very article you cited, the Peterson article.
“The homogeneity adjustments applied to the stations with poor siting makes their trend very similar to the trend at the stations with good siting.” Thomas C. Peterson, “Examination of Potential Biases in Air Temperature Caused by Poor Station Locations,” American Meteorological Society, Aug. 2006, p. 1078, Fig. 2
So having the same trend may be an artefact of the editing. Don't rest your entire argument upon this, T.
*
That is a mighty weak response to a curve that shows that the yearly corrections between the raw data and the final edited data grow each year.
I guess I will use this to complete my post from above.
Now there is another issue that my amateur ways draw me to. That is the issue of significant digits. The attached picture of a temperature report from 2006 from Bartow, Florida shows that all the data are recorded as integers. Thus, any reporting of temperature to a single decimal place is questionable (given that there is lots of variation in the ones column), but granting that one can go one more digit, the finest place for reporting these temperatures should be the tenth of a degree. For those who don't know, the rule is that in any mathematical operation the answer ends up having no more significant digits than the number with the fewest significant digits involved in the calculation. If you multiply 1.6789 x 2.3, the answer is 3.9, not 3.86147, because 2.3 carries only two significant digits. So, any calculation of global temperature that claims to be more precise than this is problematical.
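The significant-figures rule above can be sketched in a few lines of code. The helper `round_sig` is my own illustration, not taken from any cited source:

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - floor(log10(abs(x))))

product = 1.6789 * 2.3          # 3.86147 on the calculator
# 2.3 has only two significant figures, so the product
# should be reported with only two as well.
print(round_sig(product, 2))    # 3.9
```

The same rule is why a global mean built from whole-degree station readings cannot honestly carry three or four decimal places.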
James Hansen, when correcting the average temperature of the world, said this:
“Sorry to send another e-mail so soon. No need to read further unless you are interested in temperature changes to a tenth of a degree over the U.S. and a thousandth of a degree over the world.”
http://www.columbia.edu/~jeh1/mailings/20070810_LightUpstairs.pdf
But we don’t measure temperature to a tenth or a thousandth of a degree. So this is like the engineer with a calculation showing that the oil is flowing at the rate of 50.82045876210934 barrels of oil per day, when we measure it to the half barrel.
I know everyone wants me to discuss the satellite data. I already have. I have no doubt it is rising if one uses a linear regression. The reason I am not impressed is that I know the periodicities of long-term temperature variations. The Vostok core measured the ocean temperature via the proxy deuterium: the hotter the oceans, the more deuterium can be found in the ice; the cooler the oceans, the more of it remains in the ocean and the less there will be in the ice. Below is a chart of CO2 (red) and deuterium (blue) in years BP, which starts at 1950 and goes back. Note the huge changes in ocean temperature while the CO2 values don’t wiggle much. CO2 seems to be doing very little while the oceans do a lot. And the red bar is exactly the length of time we have been measuring satellite data. A record that short can’t detect long-term cycles, but we know they are there. So any test for whether or not it is rising is simply not relevant to long-term behavior. So, yes, the F-test will show rising, but as a geologist I know the mistakes one can make taking a short-term view.
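The short-record problem is easy to demonstrate with purely synthetic numbers (this is a toy sine-wave cycle, not the satellite or Vostok data). Fit a straight line to a 30-year window taken from the upswing of a 1000-year cycle and you get a confidently positive trend, even though the cycle has no long-term trend at all:

```python
import numpy as np

# A slow 1000-year cycle, sampled only during a 30-year window
# that happens to sit on its upswing (years 200-229 of the cycle).
years = np.arange(30)
window = np.sin(2 * np.pi * (years + 200) / 1000.0)

# Ordinary least-squares fit over the short window: positive slope.
slope, intercept = np.polyfit(years, window, 1)
print(f"fitted trend over 30 yr: {slope:+.5f} per year")

# But over the full cycle the series averages out to zero trend.
full = np.sin(2 * np.pi * np.arange(1000) / 1000.0)
print(f"mean over the full cycle: {abs(full.mean()):.2e}")
```

The F-test on the 30-year window would happily call that slope significant; it simply has no way to see the cycle it is embedded in.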
The mistake of taking 30 years and claiming that we are destined for a GW catastrophe, as some have claimed, is precisely the mistake novice investors make. They think if the market is rising, it will rise forever, and if it is going down it will go down forever. We know this isn’t true for the market or the weather. There are feedbacks; the temperature will neither fall to -460 F nor rise to 8000 F, to use extremes, so the temperature must be cyclical. As long as we have an atmosphere and the present solar output, the global temperature is unlikely to fall below freezing. But similarly, since we have already seen CO2 contents of the atmosphere as high as 3000 ppm (roughly 10x today’s level), it is unlikely that we will kill ourselves by having CO2 go to 1000.
There is another problem I see in the data. Given my amateur status, I can ask dumb questions without fear of embarrassment.
Now, let's look at Electra. What is the local SD used as input to calculate the global SD? At Electra the average of the raw data is 61.1 deg F; after editing it is 60.2. The raw SD is 4.26; the SD of the edited data is 1.36. Now, when the climatologists calculate the final SD, which value do they use as input, 4.26 or 1.36? It would make a bit of a difference if the final claims for error were based upon the edited and sanitized data rather than on the actual SD derived from the raw data, which are, after all, the real observations. But then, I, the amateur, might be wrong.
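To show why the choice matters, here is a sketch with made-up annual means chosen to mimic the figures quoted above (raw mean ~61.1, raw SD ~4.3; edited mean ~60.2, edited SD ~1.4). The uncertainty you report scales directly with whichever SD you feed in:

```python
import statistics

# Hypothetical annual-mean series for a station like Electra (deg F).
raw    = [61.0, 57.2, 66.5, 55.1, 64.8, 61.3, 68.0, 54.9, 61.6, 60.6]
edited = [60.3, 59.0, 61.8, 58.6, 61.5, 60.2, 62.1, 58.4, 60.5, 59.6]

sd_raw    = statistics.stdev(raw)
sd_edited = statistics.stdev(edited)

# The standard error of the mean is SD / sqrt(n), so using the
# edited SD as input shrinks the claimed uncertainty roughly 3x.
n = len(raw)
print(f"SE from raw SD:    {sd_raw / n ** 0.5:.2f} deg F")
print(f"SE from edited SD: {sd_edited / n ** 0.5:.2f} deg F")
```

If the published error bars are built from the edited series, they describe the smoothness of the editing, not the scatter of the observations.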
One other thing: at Electra, 5% of the measurements are beyond 3 SD. 1986 is 14 deg above the average, 1987 is 16 deg above (almost at 4 SD), 1988 is 15 deg above, 1991 is 14 deg above, and 1993 is 15 deg above the mean. With an SD of 4.26, these five points are outside of the 3rd standard deviation, that is, outside the 99.7% confidence interval. That is not good, at least my novice views tell me so.
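Checking those quoted anomalies against the 3-SD threshold takes only a few lines (the yearly departures and the 4.26 SD are the figures cited above; everything else is my own framing):

```python
# Raw-data SD cited above for Electra, and the quoted yearly
# departures from the mean (deg F above average).
sd = 4.26
anomalies = {1986: 14.0, 1987: 16.0, 1988: 15.0, 1991: 14.0, 1993: 15.0}

for year, dev in anomalies.items():
    z = dev / sd
    flag = " <- beyond 3 SD" if z > 3 else ""
    print(f"{year}: +{dev:.0f} deg F is {z:.2f} SD{flag}")
```

Every one of the five years lands between about 3.3 and 3.8 SD, which for a normal distribution should each occur far less than 1% of the time.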
By editing the data, they fundamentally change its statistical properties, and they replace the bad values with something judged to be good, but how close is that value to the real average temperature of those years?
So, when I saw the note asking me to get into the statistics, I had to go away and calculate for a while. But if I have made an error, I have little doubt that my superiors will correct (and chide) me for it. I await the lashings from those who feel superior.
I do think it is time to go back to some gut feel stuff for those who don't enjoy nerd grenades.
Below is Roseburg, Oregon's temperature station. Note the air conditioners nearby, and it is on a hot roof. What foresight and planning these weather guys have. They get to ensure their continued employment moving the raw data 400 standard deviations to its 'correct' value so that we humble peasants can bow and ask them how much money they need to save us from these nasty GW problems.
Also attached is a plot of two towns in Illinois which are very close together. Note the size of the difference in yearly annual average temperature over such short distances. I know it doesn't bother Thaumaturgy, but it does bother me. But hey, I am just a stupid amateur. (For those who should know, the word amateur comes from the Latin amo. It means one who loves. I love science, which is why I have ranged all over the place over the years. I am proudly an amateur, even in geophysics, where I get paid for my amateur work.)