AI can forecast the weather in seconds without needing supercomputers
While earlier weather-forecasting AIs have replaced some tasks done by traditional models, new research uses machine learning to replace the entire process, making it much faster
By Matthew Sparkes
20 March 2025
Thunderstorms over Indonesia, seen from the International Space Station
NASA Earth Observatory / International Space Station (ISS)
An AI weather program running for a single second on a desktop computer can match the accuracy of traditional forecasts that take hours or days on powerful supercomputers, its creators claim.
Weather forecasting has, since the 1950s, relied on physics-based models that extrapolate from observations made by satellites, balloons and weather stations. But these calculations, known as numerical weather prediction (NWP), are computationally intensive and rely on vast, expensive and energy-hungry supercomputers.
In recent years, researchers have tried to streamline this process by applying AI. Google scientists last year created an AI tool that could replace small chunks of complex code in each cell of a weather model, cutting the computer power required dramatically. DeepMind later took this even further and used AI to replace the entire forecast. This approach has been adopted by the European Centre for Medium-Range Weather Forecasts (ECMWF), which launched a tool called the Artificial Intelligence Forecasting System last month.
But this gradual expansion of AI’s role in weather prediction has fallen short of replacing all traditional number-crunching – something a new model created by Richard Turner at the University of Cambridge and his colleagues seeks to change.
Turner says previous work was limited to forecasting, and passed over a step called initialisation, where data from satellites, balloons and weather stations around the world is collated, cleaned, manipulated and merged into an organised grid that the forecast can start from. “That’s actually half the computational resources,” says Turner.
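The gridding part of initialisation can be pictured with a toy sketch: scattered observations are averaged into the cells of a regular latitude-longitude grid, and cells with no observations are left empty. This is purely illustrative (the function, the 10-degree resolution and the simple cell-averaging are assumptions for the example, not how the Cambridge model works; operational systems use far more sophisticated data assimilation that blends observations with a prior forecast):

```python
import numpy as np

def grid_observations(lats, lons, values, lat_res=10.0, lon_res=10.0):
    """Average scattered observations into a regular lat-lon grid.

    lats: latitudes in degrees (-90 to 90)
    lons: longitudes in degrees (any range; wrapped to 0-360)
    values: observed quantity (e.g. temperature) at each location
    Returns a (n_lat, n_lon) array; cells with no observations are NaN.
    """
    n_lat = int(180 / lat_res)
    n_lon = int(360 / lon_res)
    grid_sum = np.zeros((n_lat, n_lon))
    grid_cnt = np.zeros((n_lat, n_lon))
    for lat, lon, v in zip(lats, lons, values):
        # Map each observation to the grid cell containing it.
        i = min(int((lat + 90.0) / lat_res), n_lat - 1)
        j = min(int((lon % 360.0) / lon_res), n_lon - 1)
        grid_sum[i, j] += v
        grid_cnt[i, j] += 1
    # Average where we have data; mark empty cells as NaN.
    return np.where(grid_cnt > 0, grid_sum / np.maximum(grid_cnt, 1), np.nan)
```

Real initialisation also has to clean bad readings and reconcile conflicting instruments, which is part of why it consumes so much of the computational budget Turner describes.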