we can’t account for the lack of warming at the moment and it is a travesty that we can’t
– US climate change scientist Kevin Trenberth, whose private emails were among the thousands of documents stolen by hackers and posted online
If you’re interested in statistics then I highly recommend that you add the Stats in the Wild blog to your daily must-read list, and if you’re interested in climate change science then I highly recommend you check out the recent post on the topic. In fact, go and read it now and I’ll wait here until you get back…
Really good stuff, isn’t it?
Now I’m in broad agreement with Stats in the Wild on this one. While it appears there may be a few rogue elements amongst the climatologist community (which obviously I don’t condone, but every family has its dodgy uncle), overall the science is still sound. I’ve nailed my colours firmly to the mast of Anthropogenic (i.e. human-caused) Global Warming (AGW). Personally, I think that to deny AGW is to deny the power of academic peer review, a system that has so successfully underpinned the scientific method for centuries. Yes, Earth has gone through significant periods of warming and cooling in the past, but the on-average increasing temperatures observed over the last 150 years or so, and particularly the rate of change, are almost certainly the result of greenhouse gases (carbon dioxide and methane in particular) pouring into the atmosphere in ever greater quantities as human populations, industry and agriculture expand.
On the issue of the recent lack of warming that has caused Kevin Trenberth such disquiet, Stats in the Wild points out the simple statistical explanation:
You can have a system that is, on the average increasing over the long term, while still observing very flat or even declining trends when we know the overall system is increasing. That doesn’t mean that the system isn’t increasing, it just means we’ve seen one realization of the random system that hasn’t increased entirely by chance.
Stats in the Wild illustrates the point with repeated computer simulations. If you don’t have access to the R statistical analysis package, you can produce a simplified version yourself using any spreadsheet. The parameters in the example below are made up to illustrate a point.
Imagine that the average temperature of the Earth has increased from 15°C to 17°C by a precise, constant amount every year over a 100-year period. Scientists’ ability to measure this temperature at any point in time is limited by sampling errors, instrument accuracy, and other sources of variability outside of their control. Despite these shortcomings, assume that the scientists’ estimate is always pretty good – consistently within ±3% of the true value. So the sources of measurement variability result in estimates that bounce around randomly, but always within this 3% margin of error around the true value. Graphically what I’m talking about might look a bit like this:
The grey line in the graph above is the scientists’ estimated temperature, randomly distributed around the true value (yellow line) but within the margins of error (red lines). Now if I emphasise the observed temperature alone, you see how things look to the scientists, whose measurement limitations mean they can never know the true value:
From the graph above you can see there are periods within the time series (crudely marked with orange lines) where random variation makes the trend appear to level off or even decrease. If you were an observer at roughly year 65, for example, you’d be forgiven for thinking global warming was a right load of old cobblers, because it would appear that temperatures were stagnant, even getting cooler. As Stats in the Wild concludes, you can have a system that is, on average, increasing over the long term, while still observing very flat or even declining trends, when we know the overall system is increasing.
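If you’d rather script the simulation than build it in a spreadsheet, here’s a minimal sketch in Python using the same made-up parameters. One assumption on my part: the post doesn’t say what distribution the measurement error follows, so I’ve used a uniform draw within the ±3% band.

```python
import random

random.seed(42)  # reproducible run; any seed shows the same effect

YEARS = 100
START, END = 15.0, 17.0   # true temperature rises linearly from 15°C to 17°C
MARGIN = 0.03             # ±3% measurement error band (assumed uniform)

# True temperature: a precise, constant increase every year.
true_temp = [START + (END - START) * t / (YEARS - 1) for t in range(YEARS)]

# Observed temperature: true value perturbed by up to ±3%.
observed = [temp * (1 + random.uniform(-MARGIN, MARGIN)) for temp in true_temp]

def slope(ys):
    """Least-squares slope of ys against 0, 1, ..., len(ys)-1."""
    n = len(ys)
    x_mean = (n - 1) / 2
    y_mean = sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(ys))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

# Count 10-year windows where the observed trend is flat or negative,
# even though the true trend increases in every single year.
flat_or_cooling = sum(
    1 for start in range(YEARS - 9) if slope(observed[start:start + 10]) <= 0
)
print(f"10-year windows with a flat or declining observed trend: {flat_or_cooling}")
```

Run it a few times with different seeds and you’ll regularly find decade-long stretches where the observed trend is flat or negative – exactly the “one realization of the random system” effect Stats in the Wild describes.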
It’s to be expected.
Selected further reading: