
Taking the Earth’s Temperature

By Philip Duffy

A visitor to our site recently asked a sensible and important question about how scientists analyze temperature measurements to infer changes in Earth’s overall temperature. We believe that Earth has warmed about 1.3 °F since 1880, based upon tens of thousands of temperature measurements made at weather stations all over the world.

Well, not REALLY all over, and that’s the basis of the question. As you might imagine, the network of temperature-measuring stations was mighty sparse back in 1880, particularly in remote regions like Antarctica and the middle of the Pacific Ocean. So wouldn’t that give an incorrect measurement of Earth’s average temperature? And to make matters worse, as new weather stations were added over the years (and some also disappeared), wouldn’t that tend to distort the apparent trend in Earth’s average temperature? For example, if we brought online a new measuring station in Antarctica, wouldn’t those very cold temperature measurements artificially drag down the average, and make it seem as though the planet were cooling?

Yes, yes, and yes!

So how can we be confident that Earth is actually warming? Scientists avoid (to a large extent) the potential problems associated with gaps in weather station coverage by looking at temperature anomalies, rather than absolute temperatures. In other words, we don’t try to measure Earth’s average temperature. In fact, if you have ever looked through the climate literature trying to learn the average temperature of the Earth, you’ve probably noticed that it is almost never discussed. That’s because it’s difficult to measure, for exactly the reason that this question raises—gaps in weather station coverage. (More recently, satellite measurements have provided more complete coverage, but they go back only to 1980 or so, and we’ve learned also that it’s exceedingly difficult to accurately measure very slow temperature trends from satellites. But that’s another story.)

A temperature anomaly is simply the change in temperature compared to the average temperature at that location over a “reference period” (for example 1961-1990). When scientists analyze temperature measurements to try to see if Earth is warming or cooling overall, they first find the yearly-average temperature anomalies (changes) at each location where they have measurements. Then they take the average of the yearly-average anomalies at all these locations. You’ve probably seen temperature-change plots made in this way.
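The two steps above can be sketched in a few lines of code. This is a minimal illustration with two invented stations and a short made-up record; real analyses use thousands of stations and a 30-year reference period.

```python
# A minimal sketch of the anomaly calculation, using two hypothetical
# stations and invented yearly-average temperatures (in °F).

stations = {
    "tropical_island": [80.0, 80.2, 79.8, 80.5, 80.9],
    "polar_outpost":   [-10.0, -9.7, -10.3, -9.4, -9.0],
}

reference = slice(0, 3)  # stand-in for a reference period like 1961-1990

def anomalies(series):
    """Temperature change relative to the station's own reference-period average."""
    baseline = sum(series[reference]) / len(series[reference])
    return [t - baseline for t in series]

# Step 1: find the anomalies at each station.
per_station = [anomalies(s) for s in stations.values()]

# Step 2: average the anomalies (not the absolute temperatures) across stations.
global_anomaly = [sum(year) / len(year) for year in zip(*per_station)]

print([round(a, 2) for a in global_anomaly])
```

Notice that the two stations have wildly different absolute temperatures, but each station’s anomalies are measured against its own baseline, so averaging them together is meaningful.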

But don’t the gaps in the coverage still distort the answer? Actually, very little (we think). The reason is that while temperatures are very different at different locations, year-to-year temperature changes are not. Imagine a (slightly) idealized world where temperatures in different locations vary dramatically (as they do in the real world) but year-to-year temperature changes everywhere are the same (which is approximately true). In this world, if we try to measure the average temperature using a spotty network of measurements, we would get the wrong answer (unless we were extremely lucky about where the gaps in the coverage are). On the other hand, if we measured the average year-to-year changes (anomalies) in temperature, we would get the right answer regardless of any gaps in thermometer coverage, since the year-to-year temperature changes are the same everywhere.
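This idealized world can be made concrete with a toy calculation (all numbers invented): three regions share exactly the same year-to-year changes on top of very different baselines, and we measure with a network that has only one thermometer.

```python
# Idealized world: every location experiences identical year-to-year
# temperature changes, sitting on very different absolute baselines.
shared_pattern = [0.0, 0.1, 0.05, 0.25, 0.4]   # same anomaly series everywhere
baselines = {"equator": 80.0, "midlatitude": 50.0, "pole": -10.0}

world = {name: [b + a for a in shared_pattern] for name, b in baselines.items()}

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

sparse = ["equator"]  # a "gappy" network: one thermometer, at the equator

# Absolute average temperature: the sparse network is badly wrong.
true_absolute = mean(mean(world[n]) for n in world)
sparse_absolute = mean(mean(world[n]) for n in sparse)

# Anomalies (here, change since the first year): the sparse network
# recovers the true global series essentially exactly.
def anomaly(series):
    return [t - series[0] for t in series]

true_anomaly = [mean(vals) for vals in zip(*(anomaly(world[n]) for n in world))]
sparse_anomaly = anomaly(world["equator"])

agree = all(abs(a - b) < 1e-9 for a, b in zip(true_anomaly, sparse_anomaly))
print(sparse_absolute - true_absolute, agree)
```

The single-thermometer network overestimates the absolute global-average temperature by tens of degrees, yet its anomaly series matches the true one, because the year-to-year changes are the same everywhere.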

Well, the real world is not too different from this idealized world. It is true that temperature changes differ from place to place, but the differences are relatively small. Furthermore, the greatest warming, on the whole, has been in high-latitude regions, where thermometer coverage has been particularly poor. So we think that our measurements have disproportionately excluded the regions where the most rapid warming has occurred, meaning that gaps in thermometer coverage have caused us to slightly underestimate the true global-average warming. This effect is fairly small: the measured warming during the 20th century (using a gappy thermometer network) was about 1.1 °F; scientists have estimated that with complete thermometer coverage we would have measured an increase of a little under 1.3 °F.
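The direction of that bias can be seen in another toy calculation (the regional warming numbers are invented for illustration): when the fastest-warming region is the one missing from the network, the network average comes out low.

```python
# Toy illustration of coverage bias: the polar region warms fastest,
# but the gappy network has no thermometer there (numbers invented).
regions = {
    "tropics":      0.8,   # century warming in °F
    "midlatitudes": 1.0,
    "polar":        2.0,   # fastest warming, poorest coverage
}

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

true_warming = mean(regions.values())                         # full coverage
measured = mean(regions[r] for r in ("tropics", "midlatitudes"))  # gappy network

print(measured, "<", true_warming)  # the gaps bias the estimate low
```

Because only the slower-warming regions are sampled, the measured average warming falls short of the true global figure, just as the gappy 20th-century network slightly underestimated the warming.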

For further reading:
P. B. Duffy, C. Doutriaux, I. Fodor, and B. Santer, Effect of Missing Data on Estimates of Near-Surface Temperature Change Since 1900, J. Climate, 14, 2809-2814, 2001.

