In the anthropogenic climate change contrarian (or ACCC for short) blogosphere this past month, there have been two uproars about NOAA (National Oceanic and Atmospheric Administration) data. In this post, I will look at both of these uproars, and I will also request math help, because my math skills fell apart while trying to do some math.

Uproar 1: July 1936

I keep pointing out that the NOAA data was faked for years, leading everyone to think the Earth has been getting a lot hotter

Did you guys know that the NOAA data that people have been using for years to claim the world has been getting hotter was actually faked to all hell?

Do you guys not hear me when I mention that the NOAA global warming data was faked for years, and it's the main reason why people believe the world is now hotter? Wrong...it was hotter in the 30's, bitches.

Being that the NOAA data on global warming was faked, how has the climate been changing?

What, are you afraid to put a link to the article about how NOAA has been fudging their data for years now to make climate change seem far more dramatic?

NOAA has just changed the "hottest year on record" back to 1934 after reviewing the poor data they have.

you might check with the NOAA's latest flipflop (they've quietly reinstated 1936 as the hottest year on record, not 2012)

In response to the last of these comments, I put up a quick reply explaining what happened. In short, NOAA recalibrated their surface temperature records, which are organized into three datasets: USCRN (which extends back to the mid-2000s), and ClimDiv and USHCN (both of which extend back to the mid-1890s). These recalibrations occur on occasion, and every time they do, ACCC blogs lose their minds, because in their minds any change in scientific data is evidence of the conspiracy theory they believe in. In reality, recalibrations are just what happens when new data comes in to augment previously reported records, and those records get adjusted accordingly. If you find a dollar in the sofa and recalibrate your personal wealth, don't tell an ACCC blog; they might say you lied about your personal wealth.

In one of the datasets reaching back to the 1890s (ClimDiv), but not the other (USHCN), these recalibrations made July of 1936, not July of 2012, the July with the warmest average temperature in the contiguous US. USHCN still says that July of 2012 is the warmest July in the contiguous US, as does USCRN (although it says so with many fewer Julys to draw from).

…that's it. That's the entirety of what happened, but ACCC advocates blew it out of proportion into a claim that NOAA now says the 1930s were warmer than modern temperatures. I could make a lot of snarky comments here, but I won't. I'll just post this graph of USHCN average yearly temperatures in the contiguous US, which very strongly appears to indicate that the 21st century has been, on average, warmer in the contiguous US than the 1930s were:

Uproar 2: A US cooling trend

now that they have the pure NOAA temperature stations showing a temperature decline why don't they use them?

if you do use the USCRN you can see that not only has the US not been warming since 2005, it's been cooling.

This uproar is about the above graph, which shows USCRN monthly contiguous US average temperature anomalies from January 2005 to April of 2014. It does not show an increase in average temperatures; it probably even shows a slight decline, and thus in the minds of ACCC advocates it voids longer-term trends, such as the one in the graph posted above it.

Obviously this is cherry-picking of data: 9 years of data should not void over a century of data, and people who say otherwise have to provide a very good reason for why they think so. Their very good reason is that the USCRN is being set up to provide the very best possible weather data, and thus its 9 years of data are better than the preceding century of data, which might as well get tossed in the garbage. I don't think that qualifies as a very good reason but YMMV.
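If you want to see how easily a short window can buck a long trend, here's a toy simulation (mine, with made-up numbers, not anything from NOAA): a century of annual temperatures with a small warming trend plus year-to-year noise, and a count of how many 9-year windows still slope downward anyway.

```python
import numpy as np

# Toy illustration with made-up numbers (NOT real NOAA data): a century of
# annual temperatures with a modest warming trend plus year-to-year noise.
rng = np.random.default_rng(42)
years = np.arange(1900, 2014)
trend = 0.007 * (years - years[0])                 # ~0.7 C of warming per century
temps = trend + rng.normal(0.0, 0.25, years.size)  # plus interannual noise

# Count how many 9-year windows have a negative least-squares slope anyway.
window = 9
slopes = [np.polyfit(years[i:i + window], temps[i:i + window], 1)[0]
          for i in range(years.size - window + 1)]
negative = sum(s < 0 for s in slopes)

print(f"{negative} of {len(slopes)} nine-year windows trend downward,")
print(f"even though the whole series warms by {trend[-1]:.2f} C")
```

Plenty of those short windows cool even though the long series unambiguously warms. That's the whole problem with leaning on 9 years of data.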

One thing I found particularly weird about this uproar is that articles at ACCC blogs looked at the USCRN graph and took it at face value. They didn't appear to show any interest in investigating the data the graph is based on. I find that weird, because the data is all available. I think this reluctance is because those files make no differentiation between the USCRN (U.S. Climate Reference Network) and the USRCRN (U.S. Regional Climate Reference Network), which was/is the pilot program for the better weather monitoring stations that NOAA is installing. But ACCC blogs do know which stations are USCRN and which aren't; see, for instance, this Excel spreadsheet from a prominent ACCC blog. So why aren't they doing the work of verifying what NOAA is reporting?
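If anyone wants to do that work, the first step is just separating USCRN stations from USRCRN stations using a station list like that spreadsheet. Here's a sketch of that step, with the caveat that the file name and column headers below are hypothetical stand-ins for whatever station list you actually have.

```python
import csv

# Hypothetical station list: the file name and the "network"/"station_id"
# column headers are stand-ins for whatever station CSV you actually have.
uscrn_ids = set()
with open("station_list.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["network"].strip().upper() == "USCRN":
            uscrn_ids.add(row["station_id"].strip())

def is_uscrn(station_id: str) -> bool:
    """True if the station is USCRN proper, not the USRCRN pilot network."""
    return station_id in uscrn_ids
```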

From sitting down and crunching the numbers, I will very grumpily report that verifying what NOAA is reporting is difficult without knowing the baseline values used to compute these monthly temperature anomalies. I downloaded the USCRN and USRCRN data files, imported the monthly average data from them, and can create a graph showing absolute averaged monthly mean temperatures from USCRN weather monitoring stations. Like this:

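For anyone who wants to reproduce that graph's inputs, here's roughly the shape of the averaging step. Big caveat: the file-name pattern, column positions, and missing-value sentinel below are my guesses; check the README that ships with NOAA's data files for the real layout.

```python
import glob
from collections import defaultdict

# Assumptions to verify against the README that ships with NOAA's files:
# whitespace-delimited rows, a YYYYMM field in column 1, the monthly mean
# temperature in column MEAN_COL, and -9999 as the missing-value sentinel.
MEAN_COL = 8
MISSING = -9999.0

monthly = defaultdict(list)  # "YYYYMM" -> list of per-station monthly means

for path in glob.glob("data/monthly/*.txt"):  # one file per station (assumed)
    with open(path) as f:
        for line in f:
            fields = line.split()
            yyyymm, value = fields[1], float(fields[MEAN_COL])
            if value != MISSING:
                monthly[yyyymm].append(value)

# Naive, unweighted average across whatever stations reported each month.
# NOAA grids and area-weights its stations, so expect different numbers.
averages = {m: sum(v) / len(v) for m, v in sorted(monthly.items())}
```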
But I don't know how to convert these actual temperatures into temperature anomalies because I don't understand the methods used to estimate temperature normals, which apparently follow the methodology of Sun and Peterson 2005.
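For what it's worth, the naive version of an anomaly calculation is easy; the normals are the hard part. Here's the naive approach, subtracting a simple per-calendar-month average from each month's value. This is emphatically not the Sun and Peterson 2005 method (their whole point is estimating normals for stations with short records), so treat it as a placeholder, not a verification.

```python
from collections import defaultdict

# Naive anomalies: each month's value minus the average of all same-calendar
# months in the record ("averages" comes from the previous sketch). This is
# NOT the Sun and Peterson 2005 method; it's the simplest possible baseline.
by_calendar_month = defaultdict(list)
for yyyymm, value in averages.items():
    by_calendar_month[yyyymm[-2:]].append(value)

normals = {mm: sum(v) / len(v) for mm, v in by_calendar_month.items()}
anomalies = {m: v - normals[m[-2:]] for m, v in averages.items()}
```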

So, yeah. If anyone wants to do some math, there's a Google document which could be looked at to figure out how to derive temperature normals and how to get temperature anomalies from them. Please let me know if you have any success. I kicked this data around for hours and all I got was confused.

Images from NOAA's Twitter page or from one of their above-linked data visualization websites. Artiofab really wishes he could math better.