News Section
Stories from Climate Central's Science Journalists and Content Partners

Got A Computer? Do Some Climate Science

By Alyson Kenward

With software that can be downloaded to almost any home computer, thousands of people are running climate simulations to help researchers learn about how weather and climate will change in the future. Above is a snapshot of global cloud cover taken during a simulation.

With the help of thousands of home computers around the world, an international group of climate scientists has recently set out to better understand how climate changes will impact future weather in places like California, South Africa and parts of Europe.

As part of the new WeatherAtHome.net project, researchers from several European and American research institutions are inviting the public to download to their personal computers a climate model that, until now, has been used mainly to study future global climate changes. The model has now been adapted to focus on changes at the local level, simulating a number of climate processes at finer spatial scales than its original formulation allowed, for chosen regions of the world. By running the model on thousands of computers, with each run slightly different from the others, the scientists hope to assemble a more accurate picture of the local weather changes we might expect over the next 50 years.

According to the researchers involved, there have been nearly 45,000 climate simulations submitted since the project was launched back in mid-September, and this volume of information already dwarfs what is typically available in these types of experiments.

“Statisticians are usually happy with 30 runs through a model,” says Philip Mote, a climate scientist from Oregon State University now collaborating with the international team, headed by the Oxford University researchers who launched Climateprediction.net a few years back. Mote’s part of the project is aimed at better modeling future climate in the western United States. “We’ve got almost 45,000, so we’re already in great shape.”

For the past ten years, people around the world have been volunteering their personal computers’ spare time and energy to help researchers run millions of global climate simulations as part of the Climateprediction.net experiment. Each simulation tracks various aspects of the climate, including global temperatures, rainfall, and cloud cover, but each run uses slightly different settings for how the model approximates the real climate. The idea is that while home computers are sitting idle, they can be put to use churning through global climate simulations that return valuable climate forecasts to the researchers. By collecting results from thousands, and eventually millions, of individual runs, the scientists can see which settings perform best and gauge the accuracy of their predictions. Results from the project have already been published in a number of research papers.
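The perturbed-parameter idea can be sketched in a few lines of code. This is a minimal illustration only, using a toy zero-dimensional energy-balance model rather than the actual climate model the project distributes; the parameter names and values here are assumptions chosen for demonstration.

```python
import random
import statistics

# Toy zero-dimensional energy-balance model: equilibrium surface
# temperature from solar input, albedo, and effective emissivity.
# Entirely illustrative -- NOT the model used by Climateprediction.net.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0     # solar constant, W m^-2

def equilibrium_temp(albedo, emissivity):
    """Equilibrium temperature (K) of the toy energy-balance model."""
    absorbed = SOLAR * (1.0 - albedo) / 4.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

def run_ensemble(n_runs, seed=42):
    """Mimic the volunteer setup: each run perturbs uncertain
    parameters slightly, and the results are pooled afterward."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        albedo = rng.gauss(0.30, 0.01)      # perturbed planetary albedo
        emissivity = rng.gauss(0.61, 0.01)  # perturbed effective emissivity
        results.append(equilibrium_temp(albedo, emissivity))
    return results

temps = run_ensemble(1000)
print(f"ensemble mean: {statistics.mean(temps):.1f} K, "
      f"spread (stdev): {statistics.stdev(temps):.1f} K")
```

Pooling many such runs shows both the central estimate and, crucially, the spread: how sensitive the prediction is to the uncertain parameters — which is the information a 30-run ensemble can only roughly provide.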

The success of Climateprediction.net’s approach of spreading global simulations among thousands of personal computers has now been taken a step further to tackle another climate modeling challenge, one whose relevance may extend well beyond the readership of research papers.

“What we’re doing now is translating that framework down to a regional level relevant to decision-making [about the future],” says Mote. Though global climate models are useful for predicting average changes over large regions, he explains, they can’t zoom in to a smaller scale with much precision. But by incorporating more specific local information, like whether the area is coastal or mountainous, a more accurate picture of how weather patterns change over time will emerge, says Mote, which may inform how people approach climate adaptation projects in the future.

In the first few years of the new WeatherAtHome.net experiments, researchers are focusing on three regions of particular interest: Europe, southern Africa, and the western U.S. Each of these regions has been particularly difficult to model in the past, says Mote, so they serve as good testing grounds for this alternative approach. In the western U.S. specifically, the varied landscape and coastal environment introduce cloud and snow patterns that are too complicated to be simulated easily in a global model.

With all the information already available from early simulations, Mote expects that within just a few years these experiments will help them predict how weather will change in California and the Pacific Northwest over the next 50 years.

Though this citizen-based approach to climate modeling has already produced a wealth of data about the future global climate, helping to characterize how the planet will respond to increasing levels of greenhouse gases, it does have shortcomings compared to traditional methods of climate modeling. Typically, researchers use high-performance supercomputers that can handle the complex interplay between all the climate variables more easily than an average home computer can.

“A supercomputer has tens of thousands of processes running simultaneously,” says Arthur Mirin, a climate modeler from Lawrence Livermore National Laboratory in California, “and they are all communicating with each other on the order of a few seconds.” Mirin has not worked with the Climateprediction.net team, but he says that more simplified simulations don’t usually give as complete a picture for each run as the supercomputer simulations do.

Yet supercomputer time is hard to come by, says Mote, because so many different researchers are trying to use them for complicated computer modeling. Running thousands of simulations on different computers instead, he says, is one way to help overcome some of the limitations in their experiments.

“With our set-up, we put this experiment up on the website for people to use and boom, you’ve got 30,000 simulations run,” says Mote. “You could never get that with a single supercomputer.”