Long-Range Weather Forecasts: This ML-based weather simulator outperformed medium-range forecast systems.

Machine learning models have predicted weather a few days ahead of time. A new approach substantially extends the time horizon.

What’s new: Remi Lam and colleagues at Google developed GraphCast, a weather-forecasting system based on graph neural networks (GNNs). Its 10-day forecasts outperformed those of conventional and deep-learning methods.

GNN basics: A GNN processes input in the form of a graph made up of nodes connected by edges. It uses a vanilla neural network to update the representation of each node based on those of neighboring nodes. For example, nodes can represent customers and products while edges represent purchases, or — as in this work — nodes can represent local weather while edges represent connections between locations.
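
To make the update concrete, here is a minimal sketch of one message-passing step in plain NumPy. The sum aggregation and single ReLU layer are simplifying assumptions for illustration, not GraphCast's exact architecture:

```python
import numpy as np

def gnn_step(node_feats, edges, w_msg, w_upd):
    """One message-passing step: each node sums messages from its
    neighbors, then updates itself with a small neural-network layer."""
    num_nodes, dim = node_feats.shape
    agg = np.zeros((num_nodes, dim))
    for src, dst in edges:                    # message flows src -> dst
        agg[dst] += node_feats[src] @ w_msg
    combined = np.concatenate([node_feats, agg], axis=1)
    return np.maximum(combined @ w_upd, 0.0)  # ReLU

# Toy graph: 3 nodes connected in a line, with bidirectional edges.
rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 4))               # 3 nodes, 4 features each
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
w_msg = rng.normal(size=(4, 4))
w_upd = rng.normal(size=(8, 4))
print(gnn_step(feats, edges, w_msg, w_upd).shape)  # (3, 4)
```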

Key insight: Short-term changes in the weather in a given location depend on conditions in nearby areas. A graph can reflect these relationships using information drawn from a high-resolution weather map, where each node represents an area’s weather and edges connect nearby areas. However, longer-term changes in the weather depend on conditions in both nearby and distant areas. To reflect relationships between more distant areas, the graph can draw on a lower-resolution map, which connects areas at greater distances. Combining edges drawn from higher- and lower-resolution weather maps produces a graph that reflects relationships among both nearby and distant areas, making it suitable for longer-term predictions.
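
A toy example shows why this matters. On a ring of 64 locations with only nearest-neighbor edges, information needs up to 32 message-passing steps to cross the ring; adding longer "coarse" edges cuts that sharply. (The stride-based coarse edges below are our simplification; GraphCast's actual multi-resolution mesh is described under "How it works.")

```python
from collections import deque

def max_hops(n, strides, start=0):
    """Most edge traversals needed to reach any node on a ring of n
    locations, where each stride s adds edges i <-> (i + s) % n."""
    neighbors = {i: set() for i in range(n)}
    for s in strides:
        for i in range(n):
            neighbors[i].add((i + s) % n)
            neighbors[(i + s) % n].add(i)
    dist, queue = {start: 0}, deque([start])
    while queue:                       # breadth-first search
        u = queue.popleft()
        for v in neighbors[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

print(max_hops(64, [1]))       # fine edges only: 32 hops
print(max_hops(64, [1, 8]))    # fine + coarse edges: 7 hops
```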

How it works: GraphCast produced graphs based on high- and low-resolution weather maps and processed them using three GNNs called the encoder, processor, and decoder. The authors trained the system on global weather data from 1979 to 2017. Given current weather conditions and those measured 6 hours earlier for all locations on Earth, GraphCast learned to predict the weather 6 hours ahead and, by iterating, at multiples of 6 hours.

  • The authors divided a map of Earth into areas 0.25 by 0.25 degrees to make a graph — actually a grid — with roughly 1 million nodes, each containing over 200 values (conditions such as temperature, humidity, air pressure, wind speed, and precipitation) measured at a given time and 6 hours earlier. The nodes were connected at their northern, southern, eastern, and western borders.
  • The authors created a new graph by connecting each node of the grid to a smaller graph of around 41,000 nodes, where each node covered a larger region and nearby regions were connected via edges. (Specifically, the smaller graph’s nodes and edges coincided with those of a sphere divided into roughly 82,000 equilateral triangles. The authors connected nodes in the grid to those in the smaller graph if, when the two graphs were overlaid, the distance between them did not exceed a threshold; a toy version of this matching appears in the second sketch after this list.) Given the smaller graph, the encoder GNN learned to compute an embedding for each node.
  • To produce a multi-resolution graph, the authors represented Earth as an icosahedron (12 nodes and 20 equilateral triangles) and iteratively divided each triangle into 4 smaller triangles. They did this 6 times, creating 6 additional graphs of between 12 and roughly 10,000 nodes (the counts are verified in the first sketch after this list). They superimposed these graphs’ edges over the 41,000-node graph. Given the multi-resolution graph, the processor GNN learned to update the 41,000 node embeddings.
  • To return the resolution to 0.25 by 0.25 degrees, the authors created yet another graph by connecting the 41,000 nodes to their corresponding locations among the 1 million nodes on the initial grid. (Specifically, for each grid node, they found the triangular face that would contain it if the grid and 41,000-node graph were overlaid. Then they connected the grid node to the 3 nodes that formed this triangle.) Given this graph, the decoder GNN learned to compute the change in weather conditions for each node on the grid.
  • To predict the next time step, the authors added the decoder’s output to the values at the current time step. To forecast further into the future, they repeated the process, predicting the next time step based on the previously predicted values.
  • The system learned to predict the values at the next time step by minimizing the mean squared error between its predictions and actual measurements at 6-hour increments up to three days ahead (that is, over 12 sequential forecasts; see the final sketch after this list).
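
The node counts above are easy to check. A refined icosahedron with k subdivision levels has 10 × 4^k + 2 vertices and 20 × 4^k triangular faces, and the 0.25-degree grid corresponds to a 721 × 1440 latitude-longitude lattice (a standard figure for 0.25-degree reanalysis data, and our assumption here):

```python
# 0.25-degree grid: 721 latitude rows (poles included) x 1440
# longitude columns -- the "roughly 1 million nodes" above.
print(721 * 1440)                  # 1038240

# Refined icosahedron: each subdivision splits 1 triangle into 4.
for k in range(7):
    vertices = 10 * 4**k + 2
    faces = 20 * 4**k
    print(k, vertices, faces)
# Levels 0-5 span 12 to 10,242 vertices (the "6 additional graphs");
# k=6 gives 40,962 vertices (~41,000 nodes) and 81,920 faces
# (~82,000 triangles), matching the text.
```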
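
Hooking the grid to the mesh is conceptually simple: connect each grid node to every mesh node within some distance. Below is a flat 2-D sketch of the encoder-side matching; real GraphCast works on a sphere, and the coordinates and radius here are made up for illustration. The decoder direction reverses it, linking each grid node to the three vertices of the mesh triangle containing it.

```python
import numpy as np

def grid_to_mesh_edges(grid_xy, mesh_xy, radius):
    """Connect each grid node to every mesh node within `radius`."""
    edges = []
    for g, gpos in enumerate(grid_xy):
        dists = np.linalg.norm(mesh_xy - gpos, axis=1)
        edges += [(g, m) for m in np.flatnonzero(dists <= radius)]
    return edges

# A 5x5 patch of fine grid points and 3 coarse mesh points.
grid_xy = np.array([[x, y] for x in np.linspace(0, 1, 5)
                           for y in np.linspace(0, 1, 5)])
mesh_xy = np.array([[0.25, 0.25], [0.75, 0.25], [0.5, 0.75]])
edges = grid_to_mesh_edges(grid_xy, mesh_xy, radius=0.4)
print(len(edges))   # every grid point near a mesh point gets an edge
```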
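
Finally, the rollout and training objective fit in a few lines. The `encoder`, `processor`, and `decoder` arguments below are hypothetical stand-ins for the trained GNNs, not a real API, and the identity-like placeholders in the demo exist only to make the sketch runnable:

```python
import numpy as np

def forecast(state_prev, state_now, encoder, processor, decoder, steps):
    """Autoregressive rollout: predict 6 hours ahead, then feed each
    prediction back in as input for the next step."""
    predictions = []
    for _ in range(steps):
        mesh = encoder(state_prev, state_now)   # grid -> mesh embeddings
        mesh = processor(mesh)                  # multi-resolution updates
        delta = decoder(mesh)                   # mesh -> per-node change
        state_prev, state_now = state_now, state_now + delta
        predictions.append(state_now)
    return predictions

def rollout_loss(predictions, targets):
    """Mean squared error averaged over a 12-step (3-day) rollout."""
    return np.mean([np.mean((p - t) ** 2)
                    for p, t in zip(predictions, targets)])

# Dummy demo with placeholders instead of trained GNNs.
state = np.zeros((10, 3))                       # 10 nodes, 3 variables
preds = forecast(state, state,
                 encoder=lambda prev, now: now,
                 processor=lambda mesh: mesh,
                 decoder=lambda mesh: mesh * 0,  # predicts no change
                 steps=12)
print(len(preds), rollout_loss(preds, [state] * 12))  # 12 0.0
```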

Results: Using 2018 data, the authors compared GraphCast’s 10-day forecasts to those of a popular European system that predicts weather based on differential equations that describe atmospheric physics. Compared to actual measurements, GraphCast achieved a lower root mean squared error in 90 percent of predictions. It produced a 10-day forecast at 0.25-degree resolution in under 60 seconds using a single TPU v4 chip, while the European system, which forecasts at 0.1-degree resolution, needed 150 to 240 hours on a supercomputer. GraphCast also outperformed Pangu-Weather, a transformer-based method, in 99.2 percent of predictions.

Yes, but: GraphCast’s predictions tended to be closer to average weather conditions, and it performed worse when the weather included extreme temperatures or storms.

Why it matters: Given a graph that combines multiple spatial resolutions, a GNN can compute the influence of weather over large distances using relatively little memory and computation. This sort of graph structure may benefit other applications that process large inputs, such as ultra-high-resolution photos, fluid dynamics, and cosmological data.

We’re thinking: When it comes to forecasting weather, it looks like deep learning is the raining champ.
