Los Alamos National Laboratory


[Image: John Conway's Game of Life]

Life Is Easier for Big Networks: Neural networks learn better with more parameters.

According to the lottery ticket hypothesis, the bigger a neural network, the more likely some of its weights are initialized to values well suited to learning the task at hand. But just how big does it need to be?
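
The hypothesis comes with a simple recipe for finding such a "winning ticket": train the full network, prune the smallest-magnitude weights, reset the survivors to their initial values, and retrain. The sketch below illustrates that loop on a toy classification task; the architecture, random data, and 80 percent pruning rate are placeholders for illustration, not the setup used in the research.

```python
# Minimal sketch of the lottery ticket procedure on a toy task (assumed setup).
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in task: random inputs, labels from a simple rule.
X = torch.randn(512, 20)
y = (X.sum(dim=1) > 0).long()

model = nn.Sequential(nn.Linear(20, 256), nn.ReLU(), nn.Linear(256, 2))
init_state = copy.deepcopy(model.state_dict())  # remember the initialization (the "ticket")

def train(net, masks=None, steps=300):
    """Train with Adam; if masks are given, keep pruned weights at zero."""
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(net(X), y)
        loss.backward()
        opt.step()
        if masks is not None:
            with torch.no_grad():
                for name, param in net.named_parameters():
                    if name in masks:
                        param.mul_(masks[name])
    return loss.item()

print("dense training loss:", train(model))

# Prune the 80% smallest-magnitude entries of each weight matrix.
masks = {}
for name, param in model.named_parameters():
    if param.dim() == 2:  # weight matrices only, not biases
        k = int(0.8 * param.numel())
        threshold = param.abs().flatten().kthvalue(k).values
        masks[name] = (param.abs() > threshold).float()

# Rewind the surviving weights to their initial values and zero out the rest.
model.load_state_dict(init_state)
with torch.no_grad():
    for name, param in model.named_parameters():
        if name in masks:
            param.mul_(masks[name])

print("sparse retraining loss:", train(model, masks=masks))
```

Because pruning keeps only a fixed fraction of weights, a larger starting network leaves more candidate sub-networks to draw from, which is the intuition behind asking how big the network must be.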
[Image: Map of the area analyzed in Cascadia and sketch of the subduction zone]

Prelude to a Quake?

Geologists call them slow slips: deep, low-frequency earthquakes that can last a month but have little effect on the surface. A model trained to predict such events could help with forecasting potentially catastrophic quakes.
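
One common way to build such a predictor is to reduce a continuous seismic record to summary statistics over short windows and regress how much time remains before the next slip. The sketch below shows that idea on synthetic data; the signal, the choice of features, and the gradient-boosted regressor are assumptions for illustration, not the researchers' pipeline.

```python
# Sketch: window-level statistics of a synthetic seismic-like signal,
# regressed against time remaining until the next (simulated) slip event.
import numpy as np
from scipy.stats import kurtosis
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic record: noise whose amplitude grows as each event approaches.
cycle = 10_000                       # samples between simulated slip events
n = 20 * cycle
time_to_event = (cycle - np.arange(n) % cycle) / cycle   # 1.0 just after an event, ~0 just before
signal = rng.normal(scale=1.0 + 3.0 * (1.0 - time_to_event), size=n)

# Features: summary statistics per non-overlapping window.
window = 500
n_windows = n // window
chunks = signal[: n_windows * window].reshape(n_windows, window)
features = np.column_stack([
    chunks.var(axis=1),           # signal energy
    kurtosis(chunks, axis=1),     # impulsiveness
    np.abs(chunks).max(axis=1),   # peak amplitude
])
targets = time_to_event[: n_windows * window].reshape(n_windows, window)[:, -1]

# Simple random split for illustration (a real study would split by time).
X_train, X_test, y_train, y_test = train_test_split(
    features, targets, test_size=0.25, random_state=0
)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("R^2 on held-out windows:", round(model.score(X_test, y_test), 3))
```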
