Google researchers have developed an artificial intelligence that they say can predict weather and climate patterns as accurately as current physical models, but with less computing power.
Existing forecasts are based on mathematical models run on extremely powerful supercomputers, which deterministically predict what will happen in the future. Since they were first used in the 1950s, these models have become increasingly detailed, demanding more and more computing power.
Several projects aim to replace these computationally intensive tasks with much less demanding AI, including a DeepMind tool that forecasts localized rainfall over short periods of time. But like most AI models, these tools are “black boxes” whose inner workings are mysterious and whose methods can’t be explained or replicated. And meteorologists warn that models trained on historical data will have a hard time predicting the unprecedented events now being caused by climate change.
Now, Dmitry Kochkov at Google Research in California and his colleagues have created a model called NeuralGCM that balances the two approaches.
Typical climate models divide Earth's surface into a grid of cells up to 100 kilometers across; limits on computing power make simulating at finer resolution impractical. Phenomena such as clouds, turbulence and convection within these cells are therefore only approximated, using code that is continually adjusted to better match observed data. This approach, called parameterization, aims to at least partially capture the small-scale phenomena that the broader physical model cannot resolve.
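To make parameterization concrete, here is a minimal sketch of where such an approximation slots into a grid-based model. It is an illustration only: the relaxation form, the constant c and all function names are hypothetical, not code from any real climate model.

```python
# A toy update for a single grid cell: the resolved, physics-based tendency
# is combined with a hand-tuned "parameterization" standing in for sub-grid
# processes such as convection. All names, forms and constants are hypothetical.
import jax.numpy as jnp

def subgrid_parameterization(temperature, moisture, c=0.1):
    """Toy stand-in for sub-grid physics: nudge the cell toward a crude
    moisture-adjusted temperature. The constant c plays the role of a
    tunable parameter that modelers adjust to match observations."""
    target = temperature + c * moisture        # hypothetical closure
    return c * (target - temperature)          # tendency from unresolved physics

def step_cell(temperature, moisture, resolved_tendency, dt=600.0):
    """One time step for one grid cell: large-scale dynamics plus the
    parameterized sub-grid tendency, integrated with forward Euler."""
    total = resolved_tendency + subgrid_parameterization(temperature, moisture)
    return temperature + dt * total

t_next = step_cell(jnp.array(280.0), jnp.array(0.01),
                   resolved_tendency=jnp.array(1e-5))
```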
NeuralGCM has been trained to take over this small-scale approximation, making the model both less computationally intensive and more accurate. In the paper, the researchers say it can process 70,000 days of simulation in 24 hours using a single chip called a tensor processing unit (TPU). By comparison, a competing model called X-SHiELD needs a supercomputer with thousands of processing units to simulate just 19 days in the same amount of time.
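As a rough illustration of this hybrid idea, the sketch below swaps the hand-tuned parameterization above for a small neural network that learns the sub-grid tendency, while a physics-based core still supplies the resolved dynamics. NeuralGCM is built on JAX, but everything else here, the names, shapes and architecture, is an assumption rather than Google's implementation.

```python
# A hedged sketch of the hybrid approach: keep the physics-based core, but
# replace the hand-tuned sub-grid parameterization with a small neural
# network. Every name, shape and architectural choice below is an assumption.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(2, 32, 2)):
    """Random weights for a tiny MLP mapping cell state -> sub-grid tendency."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def learned_tendency(params, state):
    """Neural stand-in for the parameterization: cell state in, tendency out."""
    x = state
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def hybrid_step(params, state, resolved_tendency, dt=600.0):
    """The dynamical core supplies resolved_tendency; the network adds the
    unresolved remainder, taking the place of subgrid_parameterization."""
    return state + dt * (resolved_tendency + learned_tendency(params, state))

params = init_mlp(jax.random.PRNGKey(0))
state = jnp.array([280.0, 0.01])               # toy [temperature, moisture]
next_state = hybrid_step(params, state,
                         resolved_tendency=jnp.array([1e-5, 0.0]))
```

Because the dynamical core in such a setup is differentiable, the network can in principle be trained end to end so that multi-step rollouts match reanalysis data, which is broadly the strategy the paper describes.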
The paper also claims that NeuralGCM makes predictions with accuracy comparable to or better than best-in-class models. Google did not respond to New Scientist's request for an interview.
Tim Palmer at the University of Oxford says the work is an interesting attempt to find a third way between pure physics and opaque AI approximations: “I'm uncomfortable with the idea of completely abandoning the equations of motion and moving to AI systems that even experts say they don't fully understand,” he says.
This hybrid approach is likely to spur further discussion and research in the modeling community, but time will tell whether it will be adopted by modeling groups around the world, he says. “It's a good step in the right direction and the type of research we should be doing. It's great to see different alternatives being explored.”
Source: www.newscientist.com