It aims to do this by running hundreds of thousands of different models (a large climate ensemble) using the donated idle time of ordinary personal computers, thereby leading to a better understanding of how models are affected by small changes in the many parameters known to influence the global climate.
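As a rough, non-authoritative sketch of what building such a perturbed-parameter ensemble involves, the following generates many candidate parameter sets by varying a few hypothetical model parameters within plausible ranges; the parameter names, ranges, and sampling scheme are invented for illustration and are not taken from the model the project actually runs.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical tunable parameters and plausible ranges (illustrative only,
# not the real parameter list of the climate model used by the project).
parameter_ranges = {
    "entrainment_rate": (1.0, 9.0),
    "ice_fall_speed": (0.5, 2.0),
    "cloud_water_threshold": (1e-4, 2e-3),
}

def sample_ensemble(n_members: int) -> list[dict]:
    """Draw one perturbed parameter set per ensemble member."""
    return [
        {name: rng.uniform(low, high) for name, (low, high) in parameter_ranges.items()}
        for _ in range(n_members)
    ]

# Each parameter set would correspond to one full model run, distributed as a
# work unit to a volunteer's computer.
ensemble = sample_ensemble(100_000)
print(ensemble[0])
```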
CPDN, which is run primarily by Oxford University in England, has harnessed more computing power and generated more data than any other climate modelling project.[5]
As of June 2016, there were more than 12,000 active participants from 223 countries with a total BOINC credit of more than 27 billion, reporting about 55 teraflops (55 trillion floating-point operations per second) of processing power.
In the past, estimates of climate change have had to be made using one or, at best, a very small ensemble (tens rather than thousands) of model runs.
By using participants' computers, the project will be able to improve understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists.
The climateprediction.net experiment is intended to help "improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models", identified by the Intergovernmental Panel on Climate Change (IPCC) in 2001 as a high priority.
It is hoped that the experiment will give decision makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century.
Roughly half of the variation depends on the future climate forcing scenario rather than on uncertainties in the model.
Myles Allen first thought about the need for large climate ensembles in 1997, but it was only in 1999 that he learned of the success of SETI@home.
Following a presentation at the World Climate Conference in Hamburg in September 1999 and a commentary in Nature[15] in October 1999, thousands of people signed up for what was presented as an imminently available program.
A thermohaline circulation slowdown experiment was launched in May 2004 under the classic framework to coincide with the film The Day After Tomorrow.
One such critic was Gavin Schmidt (a climate modeler with the NASA Goddard Institute for Space Studies in New York).[20]
Climate sensitivity is defined as the equilibrium response of global mean temperature to a doubling of atmospheric carbon dioxide.[21]
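In standard textbook notation (not taken from the project's own documentation), the definition, together with a common linearisation, can be written as:

```latex
% Equilibrium climate sensitivity: the change in equilibrium global mean
% temperature when the CO2 concentration is doubled from a reference value C_0.
\Delta T_{2\times} = \overline{T}_{\mathrm{eq}}(2C_0) - \overline{T}_{\mathrm{eq}}(C_0)

% A common approximation (standard values, not project-specific): the radiative
% forcing of doubled CO2, F_{2\times} \approx 3.7\ \mathrm{W\,m^{-2}}, divided by
% the climate feedback parameter \lambda (in W m^{-2} K^{-1}).
\Delta T_{2\times} \approx F_{2\times} / \lambda
```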
The possibility of such high sensitivities being plausible given observations had been reported prior to the climateprediction.net experiment, but "this is the first time GCMs have produced such behaviour".
This does not check the realism of seasonal changes, and it is possible that additional diagnostic measures would place stronger constraints on what is realistic.
Published in Geophysical Research Letters, this paper concludes:[22] "When an internally consistent representation of the origins of model-data discrepancy is used to calculate the probability density function of climate sensitivity, the 5th and 95th percentiles are 2.2 K and 6.8 K respectively."
The difference in temperature between this phase and the control phase then gives a measure of the climate sensitivity of that particular version of the model.
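A minimal sketch of that calculation, and of the ensemble percentiles quoted above, assuming each completed run reports a time-averaged global mean temperature for its control and doubled-CO2 phases; the array names and numbers below are hypothetical.

```python
import numpy as np

# Hypothetical time-averaged global mean temperatures (K) from five completed
# model versions; real runs report far richer diagnostics than this.
control_phase = np.array([287.1, 286.8, 287.4, 287.0, 286.9])
doubled_co2_phase = np.array([289.9, 290.3, 291.1, 289.6, 293.4])

# Per-model climate sensitivity: equilibrium warming under doubled CO2
# relative to that model version's own control phase.
sensitivity = doubled_co2_phase - control_phase

# Ensemble summary analogous to the published 5th and 95th percentiles
# (computed here on made-up numbers, purely for illustration).
p05, p95 = np.percentile(sensitivity, [5, 95])
print(f"5th percentile: {p05:.1f} K, 95th percentile: {p95:.1f} K")
```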
Many volunteer computing projects have screensavers to visually indicate the activity of the application, but they do not usually show its results as they are being calculated.
The real-time desktop visualisation for the model launched in 2003 was developed[24] by Jeremy Walton at NAG, enabling users to track the progress of their simulation as cloud cover and temperature change over the surface of the globe.
The IDL Advanced Visualisation was written by Andy Heaps of the University of Reading (UK), and modified to work with the BOINC version by Tessella Support Services plc.
Only CPView allows users to examine less common diagnostics beyond the usual Temperature, Pressure, Rainfall, Snow, and Clouds.
On 8 March 2009, climateprediction.net officially declared the BBC Climate Change Experiment finished and shut it down.