This article was published 11/1/2013, so information in it may no longer be current.
Last summer, headlines blared, "Hottest July in the history of the United States!" The National Climatic Data Center (NCDC) of the U.S. National Oceanic and Atmospheric Administration (NOAA) said so, so it must be true.
This week, NCDC is reporting the same, with the added alarm that 2012 was the warmest year on record and one of the top two years for extreme weather in America.
Climate activists are linking this to man-made global warming, ignoring the fact that the area reported on in the NCDC reports, the U.S. contiguous states (i.e., continental America, not including Alaska), is only two per cent of the Earth’s surface.
So trends that may, or may not, be real in the U.S. in no way indicate global phenomena. In fact, the U.K. Met Office has admitted that there has been no global warming for 16 years and, this week, announced that temperatures are expected to stay relatively stable for another five years.
Regardless, all NCDC temperature proclamations must be taken with a large grain of salt. Here’s why.
Until the use of thermocouples became common in the U.S. climate network, temperatures were determined with mercury thermometers that have, at best, an accuracy of plus or minus 0.5 degrees Celsius. Even today, many U.S. stations only record temperatures to the closest whole degree Fahrenheit (0.56°C).
Thus, breaking the 1936 high temperature record by about 0.1°C, as NCDC claimed occurred last July, is not meaningful. This change falls within the uncertainty of the measurement. It is akin to being alarmed that the Moon has moved a millimeter closer when we can only measure the Earth/Moon distance to within a few centimeters.
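The uncertainty argument above can be put in plain numbers. This is only an illustrative sketch, using the 0.1°C record margin and the 0.5°C thermometer accuracy cited in this article:

```python
# Illustrative check, using figures cited in the article: a claimed record
# margin of 0.1 C versus a best-case thermometer accuracy of +/-0.5 C.
record_margin_c = 0.1          # margin by which July 2012 was said to beat 1936
thermometer_accuracy_c = 0.5   # best-case accuracy of a mercury thermometer

# The record claim is only meaningful if the margin exceeds the
# measurement uncertainty.
is_meaningful = record_margin_c > thermometer_accuracy_c
print(is_meaningful)  # False: the margin lies inside the measurement error
```

The comparison makes the point directly: a difference smaller than the instrument's error bars cannot be distinguished from no difference at all.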
All that was recorded for most of the U.S. record was minimum and maximum temperature for each day. NCDC’s so-called average daily temperatures were derived by simply computing the average of the minimum and maximum temperatures. But this is not a true average since it does not take into account how temperatures varied throughout the day.
Trusting the NCDC averaging method to come to "hottest ever" conclusions is a mistake, because higher minimum temperatures at night will yield a higher daily average even if daytime highs do not rise.
This is what happened in July 2012, when, because NCDC records indicate that the U.S. was less cool at night than in July 1936, the average they computed for July 2012 was higher than in 1936. Yet, as demonstrated by Dr. Roy Spencer of the University of Alabama at Huntsville, NCDC records show that daytime high temperatures in July 1936 far surpassed those of 2012. So, July 2012 was not the warmest month in the American 118-year instrumental record.
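A simple sketch shows how this works. The numbers below are hypothetical, chosen only to illustrate the effect described above, not actual NCDC readings:

```python
# Hypothetical daily readings (degrees C), chosen only to illustrate how
# the (min + max) / 2 method can rank a day with a lower daytime high as
# "warmer" purely because its night was milder.
day_a = {"min": 20.0, "max": 38.0}   # cool night, very hot day
day_b = {"min": 24.0, "max": 37.0}   # milder night, cooler day

avg_a = (day_a["min"] + day_a["max"]) / 2   # 29.0
avg_b = (day_b["min"] + day_b["max"]) / 2   # 30.5

print(avg_b > avg_a)                  # True: day B "averages" warmer...
print(day_b["max"] < day_a["max"])    # True: ...despite a lower daytime high
```

Because the method ignores how temperatures vary through the day, a month of milder nights can set an "average" record without any record daytime heat.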
This week, NCDC’s credibility was further damaged when Chico, California, meteorologist Anthony Watts announced that he had discovered large differences between the "State of the Climate" (SOTC) reports released each month and the actual database of NCDC temperatures. For example, the July 2012 SOTC report, issued in early August, announced that a new record had been set, with the average July temperature for the contiguous U.S. being 25.33°C, or 0.11°C higher than in July 1936.
However, the NCDC database showed the July 2012 average was actually 24.96°C, 0.37°C less than announced. That is 0.26°C cooler than the 25.22°C claimed as the previous monthly record, set in 1936. What is going on?
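The arithmetic behind that discrepancy, using the figures quoted in this article, is straightforward to check:

```python
# Figures quoted in the article (degrees C, contiguous U.S. July average).
sotc_july_2012 = 25.33      # value announced in the July 2012 SOTC report
database_july_2012 = 24.96  # value later found in the NCDC database
record_july_1936 = 25.22    # previous record as claimed in the SOTC

print(round(sotc_july_2012 - database_july_2012, 2))   # 0.37: announced vs. database
print(round(record_july_1936 - database_july_2012, 2)) # 0.26: database falls short of the old record
```

On the database figures, July 2012 would not have beaten the claimed 1936 record at all.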
It turns out that NCDC does not wait for all the data to be received before computing, and very publicly announcing, the U.S. average temperature and its rank compared to other months and years. While some stations, such as those at airports, send the data quickly via radio links and the Internet, other stations use old paper forms that arrive by mail considerably later.
When the data from lower-technology sources finally arrive, NCDC updates their temperature database, typically "cooling" the country once all the data are used.
But neither NCDC nor NOAA tells the public and the press if, when the complete data set is analyzed, the temperature announcements in previous SOTCs are no longer correct.
Strangely, NCDC changes temperature data even from the distant past without notification. For example, NCDC now asserts that the average temperature in July 1936 was 24.68°C, more than half a degree cooler than the 25.22°C that they claimed for the month in the July 2012 SOTC report. This allows NCDC to continue to say that July 2012 set a record.
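The size of that retroactive revision, again using only the figures quoted here, works out as follows:

```python
# Figures quoted in the article for the average July 1936 temperature (degrees C).
july_1936_as_cited_in_sotc = 25.22   # value claimed in the July 2012 SOTC report
july_1936_now_asserted = 24.68       # value NCDC now asserts

print(round(july_1936_as_cited_in_sotc - july_1936_now_asserted, 2))  # 0.54
```

A revision of more than half a degree to a 76-year-old monthly average is what allows the 2012 "record" claim to stand.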
Watts found that, of the 23 monthly SOTC reports between October 2010 and November 2012 that listed average temperatures (three did not), 22 do not match the NCDC database, presumably because of later updating once all the data are received and analyzed. And in all cases except one, the country cooled when all the data were incorporated.
Watts concludes: "It is mind-boggling that this national average temperature and ranking is presented to the public and to the press as factual information and claims each month in the SOTC, when in fact the numbers change later."
So, we don’t really know how much, if any, warming has occurred in the U.S. over the past century. Since the American record is considered to be the most accurate part of the Global Historical Climatology Network, we really don’t even know that global warming has occurred at all in the past century.
NOAA has not responded to questions from the International Climate Science Coalition concerning this issue.
Tom Harris is executive director of the International Climate Science Coalition. Tim Ball is a Victoria-based environmental consultant and former climatology professor at the University of Winnipeg. Both are advisers to the Frontier Centre for Public Policy.