
Recent Storms Highlight Flaws In Top U.S. Weather Model


The U.S., which pioneered the groundbreaking science of weather forecasting using mathematical simulations of the atmosphere, has fallen behind other nations when it comes to the accuracy of its global forecasting model. The consequences could be dire for people in harm's way if the U.S. is less prepared for extreme weather and climate events.

The emerging “modeling gap” could erode the accuracy of U.S. weather forecasts and also cause greater economic losses from weather events. A 2011 study found that routine weather variability alone affects the American economy to the tune of approximately $485 billion each year, not including the billions that are lost when major storms strike heavily populated areas. 

Projection from the Euro model showing a worst-case scenario storm track for parts of the Mid-Atlantic.
Credit: Weatherbell.com

Interviews with more than a half-dozen weather forecasters and federal officials reveal widespread concern over that modeling gap, and guarded optimism about the government’s ability to narrow it in a harsh budgetary climate.

A more accurate U.S. model, and a more capable national and regional weather forecasting system as a whole, could help Americans anticipate extreme events at longer lead times. That would save lives and limit economic losses at a time when global warming is making some extreme events, such as heat waves, more likely and more severe.

Hurricane Sandy is the poster child for that discussion. As Sandy was spinning its way northward from the Caribbean Sea, it was the model run by the European Center for Medium-Range Weather Forecasts (ECMWF) that sounded the earliest alarm. The European Center’s model projected about a week in advance that the storm would make an unprecedented and devastating left hook into the Mid-Atlantic coastline, wreaking havoc the likes of which parts of the East Coast had not seen in modern times.

The top-of-the-line U.S. weather forecasting model, known as the Global Forecast System (GFS), didn’t catch on to that worst-case scenario until the storm was closer to making landfall in the U.S. That delay contributed to a large degree of uncertainty in the forecasts until just three to four days before the storm hit.

Fast-forward four months to the Feb. 7 blizzard that paralyzed the Northeast by dumping up to 40 inches of snow. Again, it was the European Center’s model that proved to be the most accurate, giving local officials throughout southern New England ample time to prepare, while the U.S. model vacillated between varying projections of the storm’s path, strength, and snowfall amounts.

When it comes to medium-range projections, the gap between the two models’ accuracy is especially wide. Most U.S. weather forecasters now look to the ECMWF model, run by the European Center in Reading, England, as well as models run by other organizations — like the U.K. Met Office — to get the most accurate picture of how weather systems are likely to evolve in the 3-to-8-day time frame. Forecasters still take the GFS model into account in those time frames, but usually with greater caution compared to the European Center’s simulations.

Model simulation from the ECMWF model for the path of Hurricane Sandy. The computer model simulation shown here was run on October 25, four days before the storm made landfall.
Credit: ECMWF.

That disparity is, in part, the result of years of decisions made by officials at the National Oceanic and Atmospheric Administration (NOAA), as the agency has tried to cope with increasingly strained resources while still making advancements in climate science, meteorology, and ocean research. At the same time, NOAA has struggled to stem the financial bleeding from long-delayed and mismanaged weather and climate satellite programs. The end result is that NOAA’s operational weather capabilities are not keeping pace with those of other countries.

“There’s no question that our global modeling system is inferior,” said Cliff Mass, an atmospheric scientist at the University of Washington who has written a series of blog posts criticizing NOAA, which runs the National Weather Service (NWS), for the lackluster performance of some American weather models during recent high-profile weather events.

More Computer Power

At its most basic level, this is about a power struggle. But not your classic power struggle. Rather, it’s about who has enough juice to run their computers.

Computer models simulate how weather or climate conditions evolve based on a set of initial atmospheric and oceanic conditions, and they help guide meteorologists in making forecasts. The models divide the world into grid boxes — the smaller the size of the grid box, the higher the model’s resolution, and the more fine-grain details it can pick up on. Many storms and weather features, from thunderstorms to the all-important rain/snow line, occur on smaller scales than many computer models were designed for. This means that higher-resolution models often produce more accurate projections.
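As a rough illustration of how grid spacing limits the weather features a model can represent, here is a back-of-the-envelope sketch. The rule of thumb that a feature needs several grid lengths to be resolved, and the example spacings, are illustrative assumptions rather than figures from NOAA or ECMWF.

```python
# Back-of-the-envelope sketch: how grid spacing limits the smallest weather
# features a model can represent. The "several grid lengths per feature"
# rule of thumb and the example spacings are illustrative assumptions only.

def smallest_resolvable_feature_km(grid_spacing_km, grid_lengths_needed=6):
    """A weather feature generally needs several grid lengths to show up."""
    return grid_spacing_km * grid_lengths_needed

for spacing_km in (28, 16, 4):  # coarse global, finer global, regional-scale
    limit = smallest_resolvable_feature_km(spacing_km)
    print(f"{spacing_km:>2}-km grid: features smaller than ~{limit} km are poorly resolved")
```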

High-resolution models require more computing power to run, and this is where the National Weather Service has been stymied in recent years, as the lion’s share of the supercomputing resources within NOAA have gone to climate science research, rather than operational weather forecasting.

Currently, the European Center's model is run at about twice the resolution of the GFS model (16-kilometer vs. 28-kilometer grid spacing). For forecasts beyond eight days, the GFS model runs at an even coarser resolution, dividing the world into larger grid boxes on the order of 84 kilometers.
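To put those numbers in perspective, a quick calculation shows how fast the computational burden grows as grid spacing shrinks. This is a sketch only: the surface-area figure is approximate, and the assumption that cost scales with the number of grid columns is a simplification, not official NOAA or ECMWF accounting.

```python
# Sketch: relative number of horizontal grid columns for two global grid
# spacings. Halving the spacing roughly quadruples the column count, and the
# time step usually must shrink as well, so total compute cost grows faster.
# The surface-area figure is approximate; this is an illustration only.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth in km^2

def grid_columns(grid_spacing_km):
    """Approximate number of horizontal grid columns covering the globe."""
    return EARTH_SURFACE_KM2 / grid_spacing_km**2

ratio = grid_columns(16) / grid_columns(28)
print(f"A 16-km grid needs roughly {ratio:.1f}x as many columns as a 28-km grid")
```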

Boston's North End neighborhood amid the snow drifts from the Feb. 7 blizzard, which the European Center's model accurately forecast several days in advance.
Credit: Twitter via Matt Meister.

In other words, it’s like the GFS model is looking at the atmosphere with slightly blurred vision, while the European Center’s model has a clearer view.

Newly appointed National Weather Service director Louis Uccellini said the European Center’s model has a distinct advantage when it comes to helping forecasters anticipate the development of extreme storms many days in advance.

“Their model is better on a day-to-day basis and what they have shown on extreme events is that they can forecast the possible occurrence of these events further out in time” before the National Weather Service’s models pick it up, Uccellini said.

One of the main obstacles to improving the accuracy of the U.S. models is that running higher-resolution models requires more powerful computers, and NOAA has devoted far less computing power to day-to-day weather modeling than the Europeans. Currently, NOAA is using the equivalent of a V6 engine from the family minivan to power the GFS model, whereas in Europe, they are using the equivalent of a V8 engine from a high-performance racing car.

Mass said there is an imbalance between the computing resources devoted to climate modeling and those going to operational weather forecasting. “We put so much resources into climate simulation, what I’m saying is that we should put less into that and more into weather forecasting.”

Uccellini said he is working to boost the supercomputing resources devoted to those operational weather models so that the GFS model and other U.S. weather models can be run at higher resolutions. “We’re not there yet with our computing capacity operationally,” Uccellini said. “That’s what I’m focused on, even before I walked in the door.”

Uccellini rejects the zero-sum approach that Mass and others advocate, which pits climate-computing resources against weather resources within NOAA. “There’s a synergy here” between weather and climate research, Uccellini said. “I work closely with the climate community and the research that they do.”

The National Weather Service has already upgraded the way weather data is fed into the GFS model, a process known as data assimilation, and is putting in place a three-fold increase in computing capacity and speed for the GFS model later this year, Uccellini said. “We have made it a very high priority to enhance the computing capacity even more.”

Security camera image of the Hoboken, N.J. PATH station flooding during Hurricane Sandy.
Credit: NY/NJ Port Authority.

“We expect a significant bump up in our operational computing capacity” once the computing upgrade is put in place at the end of August, he said.

Although the automatic budget cuts known as the sequester went into effect on March 1, NOAA may still have the money necessary to make those improvements.

The Hurricane Sandy relief bills, which passed Congress in January, authorized $207 million for upgrades to weather infrastructure, including the National Weather Service’s supercomputing resources. But the agency has not submitted a plan for spending that money to the House and Senate appropriations committees, which must approve the proposal before any cash can flow. A NOAA spokesperson said the spending plan should be delivered to Congress by the end of this week.

Uccellini said the additional funds will help the National Weather Service close the gap between its capabilities and those of other countries. “This is a high priority for us, for NOAA and for the Department (of Commerce) and we believe we have the support on the Hill to address this issue,” he said. “I’m optimistic.”

Bureaucratic Roadblocks

Adequate funding and computing power, however, may be only half the battle. According to climate and weather experts, the National Weather Service also faces organizational challenges that make it difficult for the agency to take advantage of the latest atmospheric science research to improve its forecast models.

Compared to the European Center, which is an integrated research and modeling agency dedicated solely to medium-range weather forecasting, the National Weather Service has far broader responsibilities and lacks its own research arm. Instead, it depends on the fruits of weather and climate research programs scattered throughout NOAA. A prime example is NOAA's Earth System Research Laboratory in Boulder, Colo., which does not fall under Uccellini's purview, even though much of the research it conducts is related to weather forecasting.

The National Center for Atmospheric Research recently opened this new supercomputing center in Wyoming to conduct research on weather, climate, and space weather. The computer can perform 1.5 quadrillion operations per second.
Credit: NCAR.

Uccellini said he admires what he likes to call “the European business model,” with its more integrated research and operations structure. However, NOAA is not about to adopt that model, especially considering that the European Center charges a hefty sum for full access to its data. National Weather Service forecasts and model data, by contrast, are made available to the public for free.

Uccellini said he is hopeful that he can work with NOAA's research centers to meet the National Weather Service's goals. Those goals go well beyond bringing the GFS model up to par with the European Center's. They include improving how the uncertainties of each forecast are estimated and communicated, and adding short-term modeling capabilities that could detect smaller-scale, high-impact events like the severe thunderstorms that knocked out power to much of the Mid-Atlantic one sultry July evening last year.

One of the major strengths of the National Weather Service is that it provides a suite of weather forecast models and tools for forecasters, the media and businesses. Those resources are used in a wide array of industries, from aviation and shipping to electric utilities and the financial industry. Many of those same resources are unavailable in Europe.

“My colleagues and I are able to avail ourselves to an array of shorter-term (3- to 4-day) model output, which is absolutely invaluable in our day-to-day work as forecasters and which isn't available — certainly not in (the) scope or range of the products I access in my weather office — from the European Center,” said Tom Skilling, chief meteorologist for Chicago’s WGN-TV, in an email message.

But while that abundance of data may help minimize the impact that one poorly performing model can have on overall forecast accuracy, experts like Mass say that the U.S. risks becoming increasingly dependent on the European Center and other organizations like it for weather forecasting. That is despite the fact that American taxpayers are already on the hook for the data gathering used to feed the flawed model.

Closing the modeling gap will require sustained investments from Congress and commitment from NOAA’s leadership. Uccellini, who as the former director of NOAA’s Weather Prediction Center is intimately familiar with the challenges involved with forecasting high-impact weather events, said he is intent on ensuring that his agency has the resources that are required to address the issue.

In the meantime, weather forecasters have their eye on a series of potential storm systems that will impact much of the U.S. next week, from the West to the East Coast. True to form, the European Center’s model is showing a distinctly different evolution of these storms compared to the GFS. Time will tell which model is more accurate this time around, but the smart money is on the European Center model. 

Related Content
Ongoing Coverage of Historic Hurricane Sandy
Budget Cuts May Degrade Weather, Climate Forecasting
Weather, Climate Forecasts Imperiled As Programs Cut
Can We Trust Climate Models? Increasingly the Answer is 'Yes'
Flood Warnings At Risk As Cuts to Critical Gauges Loom
Senate Bill Would Boost Funding For Satellites
Got a Computer? Do Some Climate Science

Comments

By Steve Tracton (Washington, DC)
on March 15th, 2013

Andrew, nicely written description of the current state of affairs on this subject, but nothing all that new (See: http://tinyurl.com/cfxm8a4).

I continue to believe that a major reason for the funding shortages for human and computer resources for model development at NCEP is tied to the decade-plus, uncompromisingly high priority given to some high-priced satellite programs (http://tinyurl.com/cezh37j). Claims to their criticality to forecasting have been excessively hyped and not sufficiently substantiated, especially in regard to maximizing bang for the buck relative to enhancing capabilities in short- to medium-range predictions (which critically encompasses doing better on the apparent increase in extreme events).


By Evan Lowery (Bethlehem)
on March 15th, 2013

True, but DEFINITELY not always the case.  A weather model is only as good as the observations it ingests and the physics used to interpolate those observations to places where there are none.  Increasing the resolution requires models to put more emphasis on mathematical interpolation of physical processes versus REAL observations.  This increases noise and subsequently erroneous model feedback.  Simply increasing the resolution of the GFS (while definitely beneficial in the day 8+ timeframe), is not going to make it as good as the ECMWF, and could possibly make it worse.

Step 1: Increase the resolution of the day 8+ timeframe
Step 2: Ingest more and more real-time observations (MesoNet)
Step 3: Improve the math/equations used to represent physical processes.
Step 4: Increase the resolution of the day 1-15 timeframe

“Computer models simulate how weather or climate conditions evolve based on a set of initial atmospheric and oceanic conditions, and they help guide meteorologists in making forecasts. The models divide the world into grid boxes — the smaller the size of the grid box, the higher the model’s resolution, and the more fine-grain details it can pick up on. Many storms and weather features, from thunderstorms to the all-important rain/snow line, occur on smaller scales than many computer models were designed for. This means that higher-resolution models often produce more accurate projections.”


By Andrew
on March 15th, 2013

Steve - thanks. Your 2010 story had some good info from the NRC report. I do think this issue is much more relevant now than it was in 2010, but it’s a pity that it took some big storms to serve as wakeup calls. Some critics blame climate research for getting all the funding, instead of weather, but I think your view is more accurate - the challenge has been the satellite programs, as I indicated in the story.

Evan - thanks for the clarification. Good pt.


By Steve Tracton (Washington, DC 20024)
on March 15th, 2013

Evan, there’s good reason to believe the problem with the GFS vs ECMWF is in the arena that you mention, namely interpolating observations where there are none. To modelers this is referred to as data assimilation, which provides the analyses (initial conditions) that models extend into the future via approximations to the physical and dynamical processes (as best we know them) governing the evolution of weather systems.

In the very short range (~3 hours) the forecast provides the “first guess” into which later observations are incorporated to produce the next analysis. Hence the forecast model and data assimilation procedure are mutually interdependent and collectively referred to as the data assimilation system.

The ECMWF data assimilation procedure is considerably more sophisticated than that now used for the GFS (technically, 4D vs. 3D VAR “variational analysis”). While the innards of the forecast model (including resolution) are important, the majority of ECMWF's superiority is thought to lie in the 4D VAR providing a better description of the initial state of the atmosphere (the analysis). This is supported by experiments in which the GFS, run from ECMWF analyses, generates notably improved forecasts (including for Sandy).

BTW: The rule of thumb in modeling is that increasing the resolution by a factor of 2 requires roughly a 9-fold increase in computer crunching time. If, as Louis indicates, the NCEP computer resources will increase 3-fold, that's not going to be nearly enough to reach the current resolution of ECMWF, which will probably increase within the next year and a half. Resolution is not everything, but that's a different narrative ....
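For readers who want to see the blending of a model "first guess" with observations in its simplest possible form, here is a toy, single-variable sketch. It is an illustration only: operational 3D-Var and 4D-Var systems solve this as an enormous optimization problem over millions of variables, and every number below is invented.

```python
# Toy one-variable data assimilation step: blend a model "first guess" with
# an observation, weighting each by its error variance. Real 3D-Var / 4D-Var
# systems do this over millions of variables at once; this scalar version
# only illustrates the basic blending idea. All numbers are made up.

def assimilate(first_guess, observation, guess_variance, obs_variance):
    """Return the analysis value and its (reduced) error variance."""
    gain = guess_variance / (guess_variance + obs_variance)
    analysis = first_guess + gain * (observation - first_guess)
    analysis_variance = (1.0 - gain) * guess_variance
    return analysis, analysis_variance

# Hypothetical example: a 6-hour forecast says 288 K; a satellite sounding says 285 K.
analysis, variance = assimilate(288.0, 285.0, guess_variance=4.0, obs_variance=1.0)
print(f"analysis = {analysis:.1f} K, error variance = {variance:.1f}")
```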


By Doug Brockman
on March 17th, 2013

Gee.

I wonder how my ancestors in Olde England survived ‘extreme events’ without supercomputers.

Oops, forgot, there was no “climate change” then and the entire world was an untouched Eden.


By Kevin
on March 20th, 2013

Andrew - Great article; hope to see more like this in the future.  Keep up the good work!

Steve - you’re obviously very educated on the subject of modeling.  Therefore, I can’t understand why you dismiss the critical importance of our weather satellites.  The best supercomputers and the best models mean nothing without the data those satellites provide.  Garbage in, garbage out…

Kevin

