
Google AI slashes computer power needed for weather forecasts

AI could help predict the weather more accurately (Image: Ranimiro Lotufo Neto/Alamy)

Google researchers have built an artificial intelligence that they say can forecast weather and climate patterns as well as current physics-based models while requiring far less computing power.

Existing forecasts are based on mathematical models run on enormously powerful supercomputers, which deterministically predict what will happen in the future. Since they were first used in the 1950s, these models have grown more and more detailed, requiring ever more computing power.

Several projects have aimed to replace these intense calculations with much less demanding AI, including a DeepMind tool to forecast rain locally on short timescales. But like most AI models, these are “black boxes” whose inner workings are a mystery, and the inability to explain or replicate their methods is problematic. Climate scientists also point out that if the models are trained on historical data, they will struggle to predict unprecedented phenomena now occurring due to climate change.

Now, Dmitrii Kochkov at Google Research in California and his colleagues have created a model called NeuralGCM that they believe strikes a balance between the two approaches.

Typical climate models divide Earth’s surface into a grid of cells up to 100 kilometres across; the limits of computing power make it impractical to simulate at higher resolutions. Phenomena such as clouds, air turbulence and convection inside those cells are merely approximated by computer code that is continually tweaked to more accurately match observational data. This approach, called parameterisation, aims to capture, at least partially, the small-scale phenomena that the wider physics model cannot resolve.
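
To make the parameterisation idea concrete, here is a minimal sketch in Python. The grid size, the temperature field and the tuning constants are illustrative stand-ins, not the code of any real climate model: the point is only that sub-grid effects enter as a hand-tuned correction bolted onto the resolved large-scale physics.

```python
import numpy as np

# A coarse global grid: at roughly 100 km per cell, about 180 x 360
# cells cover the planet for a single vertical level.
N_LAT, N_LON = 180, 360
temperature = 288.0 + np.random.randn(N_LAT, N_LON)  # kelvin, illustrative

# Hand-tuned constants: in real models these are continually tweaked
# against observations, which is what parameterisation means in practice.
CONVECTION_RATE = 0.01   # hypothetical tuning knob
CLOUD_COOLING = 0.005    # hypothetical tuning knob

def subgrid_tendency(temp):
    """Approximate the net effect of sub-100-km processes (convection,
    clouds) on each cell, since the grid cannot resolve them directly."""
    # Crude stand-in: warm cells convect and lose heat to clouds.
    anomaly = temp - temp.mean()
    return -CONVECTION_RATE * anomaly - CLOUD_COOLING * np.sign(anomaly)

def step(temp, dt=1.0):
    """One model step: resolved large-scale dynamics plus the
    parameterised sub-grid correction."""
    # Resolved-dynamics stand-in: diffusion smooths large-scale gradients.
    resolved = 0.25 * (
        np.roll(temp, 1, 0) + np.roll(temp, -1, 0)
        + np.roll(temp, 1, 1) + np.roll(temp, -1, 1)
    ) - temp
    return temp + dt * (resolved + subgrid_tendency(temp))

temperature = step(temperature)
```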

NeuralGCM is trained to take over this small-scale approximation, making that part of the simulation less computationally intensive and more accurate. In a paper, the researchers say that the model can process 70,000 days of simulation in 24 hours using a single chip called a tensor processing unit (TPU). In comparison, a competing model called X-SHiELD uses a supercomputer with thousands of processing units to process just 19 days of simulation in the same period.
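
The hybrid idea can be sketched by swapping the hand-tuned correction from the previous example for a small learned function. Everything below is a hypothetical miniature, not NeuralGCM’s actual architecture: in the real system the learned component is a far larger network trained end to end alongside a differentiable physics solver.

```python
import numpy as np

# Hypothetical learned replacement for the hand-tuned parameterisation:
# a tiny per-cell neural network whose weights would be fitted to
# observational data rather than tweaked by hand.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(1, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)) * 0.1, np.zeros(1)

def learned_subgrid_tendency(temp):
    """Neural stand-in for subgrid_tendency: maps each cell's state to a
    correction, applied pointwise across the whole grid."""
    x = (temp - temp.mean()).reshape(-1, 1)   # one feature per cell
    h = np.tanh(x @ W1 + b1)                  # hidden layer
    return (h @ W2 + b2).reshape(temp.shape)  # per-cell tendency

def hybrid_step(temp, dt=1.0):
    """The hybrid structure in miniature: resolved physics plus a learned
    sub-grid correction instead of hand-tuned code."""
    resolved = 0.25 * (
        np.roll(temp, 1, 0) + np.roll(temp, -1, 0)
        + np.roll(temp, 1, 1) + np.roll(temp, -1, 1)
    ) - temp
    return temp + dt * (resolved + learned_subgrid_tendency(temp))

temperature = hybrid_step(288.0 + np.random.randn(180, 360))
```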

The paper also claims that NeuralGCM produces forecasts with accuracy comparable to, and sometimes better than, best-in-class models. Google didn’t respond to an interview request from New Scientist.

Tim Palmer at the University of Oxford says the research is an interesting attempt to find a third way between pure physics and opaque AI approximation. “I feel uncomfortable with the idea that we’re completely abandoning equations of motion and just going to some AI system, which even the experts will say they don’t really fully understand,” he says.

This hybrid approach could open up further debate and research in the modelling community, but only time will tell if it gets adopted by modellers around the world, he says. “It’s a good step in the right direction and it’s the type of research that we should be doing. It’s great to see all these alternative methods out there on the table.”
