To produce a storm forecast that includes climate change impacts, StormCaster uses two sources of data:
General Circulation Models (GCMs) – GCMs are models of the global oceanic, atmospheric, and terrestrial system, and they are the state of the art for projecting trends in climate. These models have been developed and run by several climate science centers around the world. They typically cover the globe in a grid of about 2-3 degrees of latitude and longitude, extending many layers down into the ocean and up into the atmosphere.
The models are run for extended time frames into the future under scenarios of greenhouse gas (GHG) governance policy set by the United Nations Intergovernmental Panel on Climate Change (IPCC). The scenarios range from an environmentally focused future world with limited international trade (low GHG emissions) to a more economically focused world with few limits on trade (high GHG emissions).
Because the models differ in their assumptions and physics, and the scenarios differ in their GHG trajectories, StormCaster uses multiple models with multiple scenarios each to produce an ensemble forecast. The ensemble captures the range of projected climate trends, giving a more balanced and informative picture of potential future changes in local storms.
GCM results are a key input for StormCaster. The challenge during development was establishing a web service that the tool could tap into, and that provided GCM results from multiple models for any location in the United States. Atkins developed a purpose-built web service for projects like StormCaster to act as a one-stop shop for GCM results data; it is described in the Technology section.
StormCaster gets its climate-trend information from General Circulation Models. Four models are used, with three greenhouse gas control scenarios each. Through our North Slope Decision Support System (NSDSS) project, Atkins has established a web service that serves up GCM data from these models, and StormCaster uses this service.
Historical rainfall is the second source of data for StormCaster. GCMs are notoriously biased in terms of rainfall projections at the local scale. To counteract this, a process called downscaling or localization is used. The local historical rainfall record is used to construct a library of storms that reflect local microclimates and storm generation mechanisms. The GCM results are then superimposed on these storms to produce a forecast that carries the signal of local climate and the long-term precipitation trends projected by the GCMs.
As with the GCM results, a dependable one-stop shop for historical rainfall was a key requirement for StormCaster. Atkins developed a public-facing web service built on a repository of 3,300 rain sites with curated data from the National Climatic Data Center (NCDC) historical dataset. This web service is described in more detail in the Technology section.
To reduce bias in the GCM results and produce realistic storm events, StormCaster uses historical rainfall from a network of rain gages around the country. Atkins has established a historical rainfall web service that serves up data stretching back up to 100 years from 3,300 sites nationwide.
Once the user selects a location for forecasting, the StormCaster tool produces:
A short time step (15- or 60-minute) ensemble forecast of rainfall from 2010-2100. The forecast is an ensemble to reflect its innate uncertainty. The short time step helps users understand how individual storms will be affected by climate change, and it allows the forecast to be used as input for flood models, ecosystem models, agricultural models, and many other models driven by rainfall.
An interpretation of the 100-year storm from the forecast, so that the user can see how the 100-year storm changes over the coming decades.
The StormCaster algorithm starts with the user selecting a historic rain site as their desired location. StormCaster then builds a short time-step storm forecast from a combination of GCM projections and historical rainfall data.
To produce this forecast, StormCaster runs the following algorithm:
Select Forecast Location – The user selects a rain site from the 3300 sites in the Atkins NCDC historical rainfall web service (see the Technology section for details on the web service).
Get GCM Data at Monthly Time Step – Using the NSDSS GCM-results web service (see the Technology section for information), StormCaster finds the GCM grid cells that overlie the selected forecast location. Once they are identified, StormCaster downloads the monthly data for these grid cells from the web service. Three GHG scenarios are used for each of the four models, so a total of 12 (4 GCM models x 3 GHG scenarios) precipitation time series are downloaded.
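The download step above can be sketched as a simple loop over the model-scenario grid. This is an illustration only: the model and scenario names are placeholders (the source does not list them), and `fetch_monthly_precip` is a hypothetical stand-in for the NSDSS web-service call.

```python
# Hypothetical sketch of assembling the 12-member GCM ensemble.
MODELS = ["gcm_a", "gcm_b", "gcm_c", "gcm_d"]   # 4 GCMs (names assumed)
SCENARIOS = ["low", "mid", "high"]              # 3 GHG scenarios (labels assumed)

def fetch_monthly_precip(model, scenario):
    """Stand-in for the NSDSS web-service call; returns a dummy
    monthly precipitation series (mm) covering 2010-2100."""
    n_months = (2100 - 2010 + 1) * 12
    return [50.0] * n_months  # placeholder values

# One monthly time series per (model, scenario) pair: 4 x 3 = 12.
ensemble = {
    (m, s): fetch_monthly_precip(m, s)
    for m in MODELS
    for s in SCENARIOS
}
```

Each of the 12 series then flows independently through the localization and storm-synthesis steps below.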
Localize the GCM Data using Historical Rainfall – Using a statistical localization or “downscaling” algorithm, the StormCaster creates a monthly forecast of rainfall for each of the 12 GCM-Scenarios. The localization process consists of:
Build a rainfall climatology from the historical data at the rain site selected. This climatology gives the average rainfall in January, February, March, and so on.
Build a base local climate forecast from this climatology, essentially repeating the climatology year after year from 2010-2100.
For each GCM-Scenario precipitation projection, create a monthly time series of departures from a pre-industrial-revolution control. This essentially means subtracting a "control" time series of precipitation – produced by running the GCM as if the industrial revolution, and its attendant increase in GHGs, had not happened – from the GCM rainfall projection time series. These time series are called "delta" time series.
Add the delta time series to the base local climate forecast to produce a monthly localized forecast.
A crucial step in the algorithm is localization, where the historical rainfall data are used to remove bias from the GCM projection and add signatures of the local microclimate to the projection.
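The localization steps above can be sketched with plain Python lists of monthly values. This is a minimal illustration of the delta method, not the production algorithm; the data layout (a list of `(year, month, rainfall)` tuples) is an assumption.

```python
def monthly_climatology(history):
    """Average rainfall per calendar month from a historical record.
    `history` is a list of (year, month, rainfall_mm) tuples."""
    totals, counts = [0.0] * 12, [0] * 12
    for _, month, mm in history:
        totals[month - 1] += mm
        counts[month - 1] += 1
    return [t / c if c else 0.0 for t, c in zip(totals, counts)]

def localize(projection, control, climatology, n_years):
    """Delta-method localization: deltas (projection - control) are
    added to the climatology repeated year after year."""
    base = climatology * n_years
    deltas = [p - c for p, c in zip(projection, control)]
    return [b + d for b, d in zip(base, deltas)]
```

Because only the GCM's *departures* from its own control run are kept, any systematic bias common to both runs cancels out, while the local rainfall signature comes entirely from the gage record.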
Synthesize short time-step forecast using historical storms as analogs –
Historic analog storms are the key to transferring from the monthly time step to the 15- or 60-minute time step. StormCaster uses the historical rainfall data, recorded at 15- or 60-minute intervals, to create a library of storms for each month of the year. Markov probabilities are also derived from this data: the probability that it rains today given that it rained the previous day, and the probability that it rains today given that it did not. Once the library is established, StormCaster marches through each day of the forecast, using the Markov probabilities to decide whether it rains that day. If it does, the tool pulls a historic storm from the library and inserts it into the forecast. This process continues over the whole 2010-2100 time frame, with StormCaster ensuring that the total rain in any given month matches the rainfall projected by the GCM.
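A minimal sketch of this synthesis, assuming a daily rainfall record in millimetres and a pre-built library of analog storms; the 0.1 mm wet-day threshold is an assumption for illustration, not a documented StormCaster parameter.

```python
import random

def markov_probabilities(daily_rain, threshold=0.1):
    """Estimate P(rain today | rain yesterday) and P(rain today | dry
    yesterday) from a daily rainfall record (mm); threshold assumed."""
    ww = wd = dw = dd = 0
    for prev, cur in zip(daily_rain, daily_rain[1:]):
        if prev >= threshold:
            if cur >= threshold: ww += 1   # wet -> wet
            else: wd += 1                  # wet -> dry
        else:
            if cur >= threshold: dw += 1   # dry -> wet
            else: dd += 1                  # dry -> dry
    p_wet_given_wet = ww / (ww + wd) if ww + wd else 0.0
    p_wet_given_dry = dw / (dw + dd) if dw + dd else 0.0
    return p_wet_given_wet, p_wet_given_dry

def synthesize_month(n_days, p_ww, p_wd, storm_library, rng):
    """March through the month, deciding rain/no-rain with the Markov
    probabilities and sampling an analog storm on wet days."""
    forecast, wet_yesterday = [], False
    for _ in range(n_days):
        p = p_ww if wet_yesterday else p_wd
        if rng.random() < p:
            forecast.append(rng.choice(storm_library))  # insert analog storm
            wet_yesterday = True
        else:
            forecast.append(0.0)
            wet_yesterday = False
    return forecast
```

A final rescaling pass (not shown) would then adjust the month's total to match the localized GCM projection, as the text describes.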
Trends in rainfall projected over time are a key challenge for storm forecasting. If an increasing trend is projected at the monthly time step by the GCM, then we can expect larger and more frequent individual storms than have been seen on record. StormCaster tackles this problem through storm scaling. If a positive trend is observed in the GCM projection, the Markov probabilities for rain are inflated, making it rain more frequently. Storm size is also inflated, by multiplying each 15-minute rainfall amount by a factor proportional to the monthly increase in rainfall relative to historical levels.
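The scaling step can be illustrated with two small helpers. The proportionality here is assumed to be a direct ratio of projected to historical monthly rainfall; the source describes it only qualitatively.

```python
def scale_storm(storm, gcm_monthly_mm, hist_monthly_mm):
    """Inflate each 15-minute increment of an analog storm by the ratio
    of projected to historical monthly rainfall (ratio form assumed),
    so forecast storms can exceed anything in the record."""
    factor = gcm_monthly_mm / hist_monthly_mm
    return [increment * factor for increment in storm]

def inflate_probability(p, factor):
    """Inflate a Markov wet-day probability in proportion to the
    projected rainfall increase, capped at 1."""
    return min(p * factor, 1.0)
```

For example, a month projected at 120 mm against a 100 mm historical average would scale every 15-minute increment of each inserted storm by 1.2.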
Steps 3 and 4 are repeated for each of the 12 GCM-Scenario projections, to produce 12 short time-step storm forecasts that are collectively called the ensemble forecast.
Evaluate the 100-year return period storm at each decade over the forecast –
To get a good grasp on how storms are changing over time, StormCaster uses the ensemble forecast to estimate the 100-year storm in 2010, 2020, 2030, and so on to 2100. This is done by evaluating the maximum storm in each year of the 30-year time frame leading up to the year in question – e.g., 1981-2010 for the 2010 estimate, 1991-2020 for the 2020 estimate. The maximum storm volumes are fit to a Weibull distribution, and the 100-year storm (the storm with a 1% annual exceedance probability) is then estimated from the distribution. The end result is a time series of 100-year storm estimates with an uncertainty bound defined by the ensemble of forecasts. This 100-year storm forecast is valuable for understanding the trends in storms projected by the GCMs.
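The decadal estimation can be sketched as follows. The Weibull quantile formula is standard; the fitting routine is left as an external function (`fit_weibull`, hypothetical), since the source does not specify the estimation method.

```python
import math

def weibull_return_level(shape, scale, loc=0.0, return_period=100):
    """Quantile of a fitted Weibull distribution with annual exceedance
    probability 1/T, i.e. the T-year storm (T=100 -> 1% probability)."""
    p = 1.0 - 1.0 / return_period   # non-exceedance probability, 0.99
    return loc + scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def decadal_estimates(annual_max_by_year, fit_weibull, decades):
    """Estimate the 100-year storm at each decade from the annual-maximum
    storms of the 30 years leading up to it (e.g. 1981-2010 for 2010)."""
    estimates = {}
    for year in decades:
        window = [annual_max_by_year[y] for y in range(year - 29, year + 1)]
        shape, scale = fit_weibull(window)  # fitting method assumed external
        estimates[year] = weibull_return_level(shape, scale)
    return estimates
```

With shape = 1 the Weibull reduces to an exponential distribution, so the 100-year storm is simply scale x ln(100), which makes the quantile formula easy to sanity-check.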
The final forecast includes a fine time-step forecast of storms (orange dots are historic storms and blue dots are forecasted storms) and an evaluation of the 100-year storm (the upper solid line) and the uncertainty of the estimate. Note that the above chart is just for one GCM scenario. There are actually 12 forecasts like this that compose the ensemble forecast.