Graphs and data related to Plan2Vec
Efficiency

Visual Strategies for RL

Reinforcement learning can beat humans at video games, but humans are better at coming up with strategies to master more complex tasks. New work enables neural networks to connect the dots.
Image processing technique explained
Efficiency

Preserving Detail in Image Inputs

Given real-world constraints on memory and processing time, images are often downsampled before they’re fed into a neural network. But the process removes fine details, and that degrades accuracy. A new technique squeezes images with less compromise.
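To make the trade-off concrete, here is a minimal sketch of the standard preprocessing step the blurb describes: shrinking a large photo to a fixed network input size, which throws away fine detail. It illustrates the status quo, not the new technique; the file name and target resolution are placeholders.

```python
# Minimal sketch of conventional preprocessing: resize a large image to a
# fixed network input size, discarding fine detail in the process.
from PIL import Image

INPUT_SIZE = (224, 224)  # a typical classifier input size; assumed here

def downsample_for_network(path: str) -> Image.Image:
    """Load a high-resolution image and shrink it to the network's input size."""
    image = Image.open(path).convert("RGB")
    print(f"original resolution: {image.size}")        # e.g. (4000, 3000)
    small = image.resize(INPUT_SIZE, resample=Image.BILINEAR)
    print(f"network input resolution: {small.size}")   # (224, 224)
    return small

if __name__ == "__main__":
    downsample_for_network("photo.jpg")  # placeholder path
```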
Graphs related to double descent
Efficiency

Moderating the ML Roller Coaster

Wait a minute — we added training data, and our model’s performance got worse?! New research offers a way to avoid so-called double descent.
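The effect is easy to reproduce in miniature. The sketch below is a generic random-features regression experiment, not the paper's setup: with a fixed model, test error typically rises as the number of training examples approaches the model's capacity, then falls again as more data is added.

```python
# Generic illustration of sample-wise double descent (not the paper's method):
# adding training examples can temporarily raise test error, peaking where the
# number of examples matches the number of model parameters.
import numpy as np

rng = np.random.default_rng(0)
d, p, n_test = 10, 200, 2000          # input dim, random features, test size
w_true = rng.normal(size=d)
W_feat = rng.normal(size=(d, p))      # fixed random feature directions

def features(X):
    return np.cos(X @ W_feat)         # simple random-feature model

def make_data(n):
    X = rng.normal(size=(n, d))
    y = np.sin(X @ w_true) + 0.1 * rng.normal(size=n)
    return X, y

X_test, y_test = make_data(n_test)

for n_train in [50, 100, 150, 190, 200, 210, 300, 500, 1000, 2000]:
    X_train, y_train = make_data(n_train)
    # Minimum-norm least squares; the critical point is n_train == p.
    coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)
    test_mse = np.mean((features(X_test) @ coef - y_test) ** 2)
    print(f"train examples={n_train:5d}  test MSE={test_mse:.3f}")
```

Near the critical point the minimum-norm solution is forced to fit the noise exactly, which is what drives the spike in test error.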
Data and graphs related to equations that optimize some training parameters.
Efficiency

Optimize Your Training Parameters

Last week we reported on a formula to determine model width and dataset size for optimal performance. A new paper contributes equations that optimize some training parameters.
Simplified depiction of LSH Attention
Efficiency

Transformers Transformed

Transformer networks have revolutionized natural language processing, but they hog processor cycles and memory. New research demonstrates a more frugal variation.
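The illustration above depicts LSH attention. Below is a simplified sketch of that idea in NumPy, not a faithful reimplementation: tokens are hashed with random projections, sorted so similar tokens sit next to each other, and attention runs inside fixed-size chunks instead of over the full sequence. Details such as multi-round hashing, causal masking, and the chunk size are assumptions here.

```python
# Simplified LSH-style attention: hash tokens, sort by bucket, attend within
# small chunks, then restore the original token order.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def lsh_attention(q, k, v, n_hashes=8, chunk=32):
    """q, k, v: (seq_len, dim). Returns (seq_len, dim)."""
    seq_len, dim = q.shape
    # Angular LSH: project onto random directions, take argmax as bucket id.
    projections = rng.normal(size=(dim, n_hashes))
    buckets = np.argmax(q @ projections, axis=-1)     # (seq_len,)
    order = np.argsort(buckets, kind="stable")         # group similar tokens
    q_s, k_s, v_s = q[order], k[order], v[order]

    out = np.zeros_like(v_s)
    for start in range(0, seq_len, chunk):
        sl = slice(start, start + chunk)
        scores = q_s[sl] @ k_s[sl].T / np.sqrt(dim)     # chunk x chunk, not seq x seq
        out[sl] = softmax(scores) @ v_s[sl]

    unsort = np.argsort(order)                          # undo the sort
    return out[unsort]

seq_len, dim = 1024, 64
q = rng.normal(size=(seq_len, dim))
out = lsh_attention(q, q, q)   # queries and keys are tied in this toy example
print(out.shape)               # (1024, 64)
```

Restricting attention to chunks of similar tokens replaces the full seq_len x seq_len score matrix with many small ones, which is where the savings in memory and compute come from.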
EfficientDet explained
Efficiency

Easy on the Eyes

Researchers aiming to increase accuracy in object detection generally enlarge the network, but that approach also boosts computational cost. A novel architecture sets a new state of the art in accuracy while cutting the compute cycles required.
OctConv example
Efficiency

Convolution Revolution

Looking at images, people see outlines before the details within them. A replacement for the traditional convolutional layer decomposes images based on this distinction between coarse and fine features.
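Here is a compact sketch of an octave-style convolution layer in PyTorch: features are split into a full-resolution fine part and a half-resolution coarse part, and four small convolutions exchange information between the two. It is simplified relative to the published layer, and the channel split (alpha) is an assumption.

```python
# Octave-style convolution sketch: fine (full-resolution) and coarse
# (half-resolution) feature maps, mixed by four convolution paths.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OctaveConv(nn.Module):
    def __init__(self, in_ch, out_ch, alpha=0.5, kernel_size=3, padding=1):
        super().__init__()
        in_lo, out_lo = int(alpha * in_ch), int(alpha * out_ch)
        in_hi, out_hi = in_ch - in_lo, out_ch - out_lo
        self.hi_to_hi = nn.Conv2d(in_hi, out_hi, kernel_size, padding=padding)
        self.hi_to_lo = nn.Conv2d(in_hi, out_lo, kernel_size, padding=padding)
        self.lo_to_hi = nn.Conv2d(in_lo, out_hi, kernel_size, padding=padding)
        self.lo_to_lo = nn.Conv2d(in_lo, out_lo, kernel_size, padding=padding)

    def forward(self, x_hi, x_lo):
        # Fine output: fine->fine plus upsampled coarse->fine.
        y_hi = self.hi_to_hi(x_hi) + F.interpolate(
            self.lo_to_hi(x_lo), scale_factor=2, mode="nearest")
        # Coarse output: coarse->coarse plus pooled fine->coarse.
        y_lo = self.lo_to_lo(x_lo) + self.hi_to_lo(F.avg_pool2d(x_hi, 2))
        return y_hi, y_lo

layer = OctaveConv(in_ch=64, out_ch=64)
x_hi = torch.randn(1, 32, 56, 56)   # fine half of the channels, full resolution
x_lo = torch.randn(1, 32, 28, 28)   # coarse half, half resolution
y_hi, y_lo = layer(x_hi, x_lo)
print(y_hi.shape, y_lo.shape)
```

Because the coarse maps hold a quarter as many pixels, shifting a larger share of channels to that branch reduces the layer's compute and memory footprint.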
An illustration of filter pruning
Efficiency

High Accuracy, Low Compute

As neural networks have become more accurate, they’ve also ballooned in size and computational cost. That makes many state-of-the-art models impractical to run on phones and other smaller, less powerful devices.
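The illustration shows filter pruning, one common way to slim a trained network. The sketch below uses a generic criterion, scoring each convolutional filter by the L1 norm of its weights and keeping only the strongest; it illustrates the general idea rather than the specific method covered here.

```python
# Minimal filter-pruning sketch: rank conv filters by L1 norm and rebuild the
# layer with only the highest-scoring ones.
import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a new Conv2d keeping only the filters with the largest L1 norm."""
    weight = conv.weight.data                       # (out_ch, in_ch, kH, kW)
    scores = weight.abs().sum(dim=(1, 2, 3))        # one L1 score per filter
    n_keep = max(1, int(keep_ratio * weight.size(0)))
    keep = torch.topk(scores, n_keep).indices.sort().values

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = weight[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned

conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
slim = prune_conv_filters(conv, keep_ratio=0.25)    # keep 16 of 64 filters
print(conv.weight.shape, "->", slim.weight.shape)
```

In a full network, the following layer's input channels must be trimmed to match, and the pruned model is typically fine-tuned to recover accuracy.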
DeepScale's automated vehicle technology
Efficiency

Tesla Bets on Slim Neural Nets

Elon Musk has promised a fleet of autonomous Tesla taxis by 2020. The company reportedly purchased a computer vision startup to help meet that goal. Tesla acquired DeepScale, a Silicon Valley startup that runs computer vision models on low-power electronics.

Illustration of Facebook AI Research method to compress neural networks
Efficiency

Honey, I Shrunk the Network!

Deep learning models can be unwieldy and often impractical to run on smaller devices without major modification. Researchers at Facebook AI Research found a way to compress neural networks with minimal sacrifice in accuracy.
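Below is a simplified sketch of one codebook-based compression idea in the same spirit: split a weight matrix into small sub-vectors, cluster them with k-means, and store only centroid indices plus a small codebook. It illustrates the general approach, not the Facebook AI Research team's exact recipe; the matrix size, block length, and codebook size are assumptions.

```python
# Product-quantization-style compression of a weight matrix: replace each
# sub-vector with the index of its nearest codebook centroid.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
W = rng.normal(size=(512, 256)).astype(np.float32)   # a dense layer's weights

block = 8          # sub-vector length
n_centroids = 256  # codebook size (indices fit in one byte)

subvectors = W.reshape(-1, block)
kmeans = KMeans(n_clusters=n_centroids, n_init=4, random_state=0).fit(subvectors)

codes = kmeans.labels_.astype(np.uint8)               # 1 byte per sub-vector
codebook = kmeans.cluster_centers_.astype(np.float32) # (256, 8)

W_hat = codebook[codes].reshape(W.shape)              # reconstructed weights

original_bytes = W.nbytes
compressed_bytes = codes.nbytes + codebook.nbytes
print(f"compression ratio: {original_bytes / compressed_bytes:.1f}x")
print(f"mean squared reconstruction error: {np.mean((W - W_hat) ** 2):.4f}")
```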
