Wind of change in weather debate

Scientists are trying to combat growing scepticism over their global-warming predictions with a new generation of climate simulation computer models.
Whatever the truth about global warming, there is no doubt that the climate-change debate is hotting up. Scientists may still insist human activity is driving up global temperatures, but opinion polls reveal growing public scepticism about the evidence. A survey published last week suggests that almost one in three people in Britain now thinks the problem has been exaggerated - compared to one in five a year ago.
Some blame the scepticism on the bitterly cold weather still sweeping across the northern hemisphere, which seems hard to square with a warming planet. Others point the finger at the controversy surrounding leaked e-mails between leading climate scientists, which revealed shoddy research practices and a dismissive attitude towards sceptics. Yet public scepticism was already on the rise before the "Climategate" debacle hit the headlines. Last October, a report by the Pew Research Center in the US showed that just 57 per cent of those polled believed there was compelling evidence of global warming, down from 71 per cent in April 2008.
To combat such scepticism, scientists need to boost confidence in their often scary prognostications. And over the coming months, they will be attempting to do just this with a new generation of computer models that simulate the Earth's climate in unprecedented detail. Based on millions of lines of computer code, these models attempt to capture the interactions of the land, sea and air of our entire planet, right down to the effect of dust in the air and plants on the ground.
One such simulation, now being put through its paces at the Met Office's Hadley Centre in Exeter, even includes the seasonal change in foliage for different types of vegetation. Doing all this demands mind-boggling amounts of computing power. Among the machines being roped in by scientists in the US is the world's fastest supercomputer, the Jaguar Cray XT5 at the Oak Ridge Leadership Computing Facility in Tennessee, capable of more than a million billion calculations a second - a petaflop.
The hope is that the increased realism will result in more reliable predictions of how the Earth will respond to increasing atmospheric pollution in the form of greenhouse gases. Certainly researchers are in no doubt about the dangers of pushing simplistic models too far - not least because sceptics won't let them forget one egregious case dating back to the early days of climate modelling. In 1971, two scientists at Nasa's Goddard Institute for Space Studies (GISS), in New York, published a pioneering study of a key issue in climate research: the role of dust and smoke in global warming. Produced by both natural and man-made sources, from volcanoes to power stations, these so-called aerosols end up in the atmosphere, but their climatic effects are complex. In some cases, they can reflect sunlight back into space, counteracting the warming effect of greenhouse gases such as CO2.
To find out which effect wins, Ichtiaque Rasool and Stephen Schneider created a simple computer model which included both greenhouse gas and aerosol effects. They found that if enough aerosols were released over the next 50 years, their cooling effect might overcome greenhouse gas warming - and possibly trigger a new ice age. Although the authors pointed out that aerosol concentrations might never climb that high, they found their research fuelling a media frenzy over an impending ice age. The fact that global temperatures soon headed in the opposite direction is now often used by sceptics as evidence that climate scientists do not know what they are doing.
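The tug-of-war between the two effects can be sketched in a few lines of code. What follows is a toy illustration only - the coefficients, the logarithmic response and the function name are all assumptions made up for this sketch, not the actual Rasool-Schneider model or real climate physics:

```python
import math

def net_temperature_change(co2_ratio, aerosol_factor):
    """Net surface temperature change (deg C) from two competing effects.

    co2_ratio: CO2 concentration relative to a baseline (1.0 = unchanged).
    aerosol_factor: aerosol loading relative to a baseline (1.0 = unchanged).
    """
    # Greenhouse warming grows roughly logarithmically with CO2
    # (a standard simplification; the 3.0 coefficient is an assumption).
    warming = 3.0 * math.log2(co2_ratio)
    # Aerosol cooling: more aerosols reflect more sunlight back to space
    # (the 1.5 coefficient is likewise an illustrative assumption).
    cooling = 1.5 * math.log2(aerosol_factor)
    return warming - cooling

# Modest aerosol loading: warming wins.
print(net_temperature_change(co2_ratio=1.4, aerosol_factor=1.2))
# Heavy aerosol loading with the same CO2: cooling dominates.
print(net_temperature_change(co2_ratio=1.4, aerosol_factor=4.0))
```

Even this caricature reproduces the qualitative finding: crank the aerosol factor high enough and the net change flips from warming to cooling.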
Yet even at the time, the GISS model was recognised as just a first stab at understanding the future climate. It used questionable figures for both aerosol production and global warming effects, and failed to capture the spread of sources of both across the planet. Today's climate simulations benefit from 40 more years of research - plus much more computing power. Around the world, teams of climate scientists are now working to debug their latest models, checking their success by seeing how well they reproduce the climate of the past. They will then be ready to tackle the big questions: just how much will the Earth warm up? And what will be the consequences for us all?
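The "reproduce the past" check described above - hindcasting - amounts to scoring a model's simulated history against the observed record. A minimal sketch, using made-up placeholder numbers rather than real observations, might look like this:

```python
import math

# Hypothetical temperature anomalies (deg C) - placeholder data only.
observed = [0.10, 0.12, 0.18, 0.25, 0.31]
simulated = [0.08, 0.15, 0.16, 0.27, 0.33]

def rmse(obs, sim):
    """Root-mean-square error between observation and simulation."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

print(f"hindcast RMSE: {rmse(observed, simulated):.3f} deg C")
```

The smaller the mismatch over the past, the more trust the modellers can place in the same code when it is pointed at the future.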
The fact that the latest generation of models is so much more sophisticated suggests that they should be much better placed to provide clear answers. But paradoxically, they may end up muddying the debate. That's because when it comes to resolving doubt, more sophistication is not always better. For a start, it is becoming clear that getting the details wrong can be as problematic as messing up on the big stuff. The current issue of the journal Nature reports how researchers testing the Hadley Centre's latest model were caught out by failing to include enough vegetation in arid areas. When the simulated winds blew, they whipped up vast amounts of dust - which in turn fertilised large areas of the oceans, causing outbreaks of climate-changing phytoplankton.
Then there is the issue of realism. Every extra detail added to a climate model may make it more realistic, but at the cost of an extra source of uncertainty. For example, a climate model which includes, say, the effect of foliage may well produce a more reliable figure than simpler models for the warming of our planet. But there are inevitable uncertainties about the foliage effect - and these will boost the uncertainties surrounding the final result, as uncertainties always tot up.
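How uncertainties "tot up" can be made concrete. Assuming the component errors are independent, a standard way to combine them is in quadrature (root-sum-of-squares), so every extra modelled process widens the final error bar; the component names and numbers below are purely illustrative:

```python
import math

def combined_uncertainty(component_sigmas):
    """Total 1-sigma uncertainty from independent component uncertainties,
    combined in quadrature (root-sum-of-squares)."""
    return math.sqrt(sum(s ** 2 for s in component_sigmas))

# Illustrative 1-sigma uncertainties (deg C) for each modelled process.
simple_model = [0.5, 0.3]              # e.g. greenhouse gases, oceans
detailed_model = [0.5, 0.3, 0.2, 0.2]  # ...plus aerosols and foliage

print(combined_uncertainty(simple_model))    # ~0.58 deg C
print(combined_uncertainty(detailed_model))  # ~0.65 deg C
```

The detailed model may be closer to the truth on average, yet its quoted range is wider - exactly the trade-off between realism and vagueness described above.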
In short, all the extra sophistication of the new models may end up giving us a more realistic view of the future climate - but one that is also more vague. Predictions of future global temperatures may be more reliable, but also span a bigger range of values. And that could spell trouble for scientists hoping to boost confidence in their predictions. They may be comfortable with the dictum attributed to the economist John Maynard Keynes that "It is better to be vaguely right than precisely wrong". But whether the public and politicians will be impressed remains to be seen.
Robert Matthews is Visiting Reader in Science at Aston University, Birmingham, England.