In recent years, climate change and the extreme weather events driven by global warming have increased the risk of natural disasters, and with it the need for technology to predict them. Existing weather forecasting approaches based on physical and numerical models have limited accuracy, in part because they do not fully account for factors such as global warming.
Predicting extreme weather events such as heat waves and cold spells is of significant scientific and societal importance. However, despite decades of progress in weather prediction, primarily through improving computationally demanding numerical weather prediction (NWP) models and data assimilation techniques, forecasting the anomalous atmospheric circulation patterns that often drive these extreme events has remained challenging.
To assess a community’s risk of extreme weather, policymakers rely first on global climate models that can be run decades and even centuries forward, but only at coarse resolution. According to Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in MIT’s Department of Mechanical Engineering, his team is developing a method to correct the predictions from coarse climate models. Combining machine learning with dynamical systems theory, the approach “nudges” a climate model’s simulations into more realistic patterns over large scales.
When paired with smaller-scale models to predict specific weather events such as tropical cyclones or floods, the team’s approach produced more accurate predictions for how often specific locations will experience those events over the next few decades.
The researchers believe the new correction scheme is general and can be applied to any global climate model.
Today’s large-scale climate models simulate weather features, such as the average temperature, humidity, and precipitation around the world, on a grid-by-grid basis. Running simulations of these models takes enormous computing power. To simulate how weather features will interact and evolve over decades or longer, models average out features over grid cells roughly 100 kilometers across.
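For intuition, this kind of coarse-graining can be pictured as block-averaging a high-resolution field onto a grid whose cells span roughly 100 kilometers. The sketch below is illustrative only, with made-up array shapes and a flat grid rather than the area-weighted spherical grids real climate models use; it is not drawn from any actual model code.

```python
import numpy as np

def block_average(field, factor):
    """Average a 2-D field over non-overlapping factor x factor blocks.

    Illustrative only: a real climate model averages on a spherical
    grid with area weighting, not on a flat array like this.
    """
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

# Toy example: a 400 x 800 field at ~25 km spacing, averaged
# down by a factor of 4 to ~100 km cells.
fine = np.random.rand(400, 800)
coarse = block_average(fine, 4)
print(coarse.shape)  # (100, 200)
```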
To improve the resolution of these models, some researchers have gone under the hood to try to fix a model’s underlying dynamical equations. These equations describe how phenomena in the atmosphere and oceans should physically interact.
In the report, Sapsis explains that people have tried to dissect climate model codes that have been developed over the last 20 to 30 years, which he calls a nightmare, because modifying the equations can cost a simulation a lot of its stability. The MIT researchers take a different approach: they are not trying to correct the equations, but instead correct the model’s output.
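Conceptually, the distinction can be sketched as follows: rather than altering the dynamical equations inside the simulation loop, a learned correction is applied to the stored output after the fact, so the simulation itself, and hence its numerical stability, is left untouched. This is a minimal, hypothetical illustration using a toy one-variable system; the function names and the simple form of the correction are assumptions for illustration, not the team’s actual scheme.

```python
import numpy as np

def simulate(x0, steps, dt=0.01):
    """Toy 'climate model': integrate dx/dt = -x + sin(t).
    The dynamics are never modified; corrections happen afterward."""
    xs, x = [x0], x0
    for k in range(steps):
        x = x + dt * (-x + np.sin(k * dt))
        xs.append(x)
    return np.array(xs)

def correct_output(trajectory, g):
    """Apply a learned correction g to the stored model output."""
    return trajectory + g(trajectory)

# Hypothetical learned correction: here just a placeholder linear map.
g = lambda x: 0.1 * x
corrected = correct_output(simulate(1.0, steps=1000), g)
```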
While testing their approach, the team used the ML scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), which is developed by the US Department of Energy. The scientists trained their new algorithm on eight years of past data for temperature, humidity, and wind speed, and it learned dynamical associations between the measured weather features and the E3SM output. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, observations that were not used for training.
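As a rough sketch of how such a correction operator might be fit, one can regress observed fields on model fields over a training period and then apply the resulting map to held-out simulations. The snippet below uses ridge regression from scikit-learn on synthetic stand-in data; the array shapes, variable names, and choice of regressor are all assumptions for illustration, not the team’s actual algorithm, which exploits dynamical-systems structure rather than a plain linear fit.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-ins: rows are daily snapshots, columns are gridded
# values of temperature, humidity, and wind speed flattened together.
n_train, n_test, n_features = 8 * 365, 365, 500  # ~8 years train, 1 year held out
model_train = rng.normal(size=(n_train, n_features))
obs_train = model_train + model_train @ rng.normal(scale=0.1,
                                                   size=(n_features, n_features))
model_test = rng.normal(size=(n_test, n_features))

# Learn a map from model output to observations on past data...
corrector = Ridge(alpha=1.0).fit(model_train, obs_train)

# ...then correct new, unseen simulations.
corrected_test = corrector.predict(model_test)
```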
Sapsis notes that an extreme event might register 105 degrees Fahrenheit in the uncorrected simulation versus 115 degrees with the corrections, and for the people experiencing it, that is a big difference.
The team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones. They found that the approach accurately reproduced the frequency of extreme storms in specific locations around the world.
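Once corrected simulations are in hand, the frequency of extremes at a given location reduces to counting how often a threshold is exceeded. A minimal sketch, with an invented threshold and synthetic data standing in for the output of the paired models:

```python
import numpy as np

def exceedance_frequency(daily_values, threshold):
    """Fraction of days on which a location exceeds a threshold,
    e.g. daily maximum wind speed above a storm-strength cutoff."""
    daily_values = np.asarray(daily_values)
    return (daily_values > threshold).mean()

# Synthetic example: 30 years of daily values at one grid cell.
rng = np.random.default_rng(1)
series = rng.gumbel(loc=20.0, scale=5.0, size=30 * 365)
print(exceedance_frequency(series, threshold=40.0))
```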
The researchers think it would be interesting to see what climate change projections this framework yields once future greenhouse gas emission scenarios are incorporated.
Research and Image: MIT News