Graphs and data related to RubiksShift
Machine Learning Research

More Efficient Action Recognition

Recognizing actions performed in a video requires understanding each frame as well as the relationships between frames. Previous research devised an efficient way to analyze individual images, known as the Active Shift Layer (ASL). New research extends this technique to the steady march of video frames.
2 min read
Data and examples related to IMLE-GAN
Machine Learning Research

Making GANs More Inclusive

A typical GAN’s output doesn’t necessarily reflect the data distribution of its training set. Instead, GANs are prone to modeling the majority of the training distribution, sometimes ignoring rare attributes — say, faces that represent minority populations.
2 min read
Data and examples related to a new technique to detect GAN-generated portions of an image
Machine Learning Research

The Telltale Artifact

Deepfakes have gone mainstream, allowing celebrities to star in commercials without setting foot in a film studio. A new method helps determine whether such endorsements — and other images produced by generative adversarial networks — are authentic.
2 min read
Examples of Generative Adversarial Networks used for image-to-illustration translation
Machine Learning Research

Style and Substance

GANs are adept at mapping the artistic style of one picture onto the subject of another, a task known as style transfer. However, applied to the fanciful illustrations in children’s books, some GANs prove better at preserving style, others better at preserving subject matter.
2 min read
Graphs related to different attention mechanisms
Machine Learning Research

More Efficient Transformers

As transformer networks move to the fore in applications from language to vision, the time it takes them to crunch longer sequences becomes a more pressing issue. A new method lightens the computational load using sparse attention.
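For illustration, here is a minimal sketch of one common flavor of sparse attention, a fixed local window, written in NumPy. It is a generic example under that assumption, not the specific method covered in this item: each query scores only the keys within a small window, so the number of attended pairs grows roughly as sequence length times window size rather than sequence length squared.

```python
# A minimal sketch of sparse (local-window) self-attention, assuming NumPy.
# Generic illustration only, not the method covered in this item.
import numpy as np

def local_window_attention(q, k, v, window=4):
    """q, k, v: (seq_len, dim). Each position attends to +/- `window` neighbors."""
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)                          # (n, n) raw attention scores
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) > window    # True outside the local window
    scores[mask] = -1e9                                    # block attention outside the window
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                     # (n, dim) attended output

# Toy usage: 16 positions, 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(16, 8))
out = local_window_attention(x, x, x, window=2)
print(out.shape)  # (16, 8)
```

Note that this toy version still builds the full score matrix to show the sparsity pattern; an efficient implementation would compute only the entries inside each window.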
2 min read
Example of Occupancy Anticipation, a navigation system that predicts unseen obstacles, in action
Machine Learning Research

Guess What Happens Next

New research teaches robots to anticipate what’s coming rather than focusing on what’s right in front of them. Researchers developed Occupancy Anticipation (OA), a navigation system that predicts unseen obstacles in addition to observing those in its field of view.
2 min read
Bert (muppet) and information related to BERT (transformer-based machine learning technique)
Machine Learning Research

Do Muppets Have Common Sense?

Two years after it set a new direction for language models, Bert still hovers near the top of several natural language processing leaderboards. A new study considers whether Bert simply excels at tracking word order or learns something closer to common sense.
2 min read
Electroencephalogram (EEG) and data related to contrastive predictive coding (CPC)
Machine Learning Research

Unlabeled Brainwaves Spill Secrets

For people with neurological disorders like epilepsy, attaching sensors to the scalp to measure electrical currents within the brain is benign. But interpreting the resulting electroencephalogram (EEG) graphs can give doctors a headache.
2 min read
Graphs comparing SimCLR to SimCLRv2
Machine Learning Research

Fewer Labels, More Learning

Large models pretrained in an unsupervised fashion and then fine-tuned on a smaller corpus of labeled data have achieved spectacular results in natural language processing. New research pushes forward with a similar approach to computer vision.
2 min read
Graphs related to a comparison and evaluation of 14 different optimizers
Machine Learning Research

Optimizer Shootout

Everyone has a favorite optimization method, but it’s not always clear which one works best in a given situation. New research aims to establish a set of benchmarks. Researchers evaluated 14 popular optimizers using the Deep Optimization Benchmark Suite, which some of them introduced last year.
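As a rough illustration of what such a comparison involves, the sketch below (assuming PyTorch, and a toy linear-regression task rather than the benchmark suite used in the study) runs several built-in optimizers under identical conditions and reports each one's final loss.

```python
# A toy sketch of an optimizer comparison, assuming PyTorch. It runs a few
# built-in optimizers on the same small regression task and reports final loss.
# Illustrative only; this is not the benchmark suite used in the study.
import torch

def run(optimizer_cls, steps=500, lr=0.01, seed=0):
    torch.manual_seed(seed)                      # identical data and init for every optimizer
    x = torch.randn(256, 10)
    true_w = torch.randn(10, 1)
    y = x @ true_w + 0.1 * torch.randn(256, 1)   # noisy linear targets

    model = torch.nn.Linear(10, 1)
    opt = optimizer_cls(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

for opt in (torch.optim.SGD, torch.optim.Adam, torch.optim.RMSprop, torch.optim.Adagrad):
    print(f"{opt.__name__:>8}: final MSE = {run(opt):.4f}")
```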
2 min read
Data and information related to dropout
Machine Learning Research

Dropout With a Difference

The technique known as dropout discourages neural networks from overfitting by deterring them from relying on particular features. A new approach reorganizes the process so it runs efficiently on the specialized chips that typically perform neural network calculations.
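For reference, here is a minimal sketch of standard (inverted) dropout in NumPy. It shows only the baseline technique described above; the reorganized, hardware-friendly variant from the research is not reproduced here.

```python
# A minimal sketch of standard (inverted) dropout, assuming NumPy.
# Baseline technique only; not the reorganized variant described above.
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Zero each activation with probability p and rescale the survivors,
    so the expected activation matches what the network sees at inference time."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p          # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)              # inverted scaling

activations = np.ones((2, 4))
print(dropout(activations, p=0.5, rng=np.random.default_rng(0)))
```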
2 min read
Graphs and data related to transformer networks
Machine Learning Research

The Transformation Continues

Transformer networks are gaining popularity as a high-accuracy alternative to recurrent neural networks. But they can run slowly when they’re applied to long sequences.
2 min read
Data related to experience replay
Machine Learning Research

Experience Counts

If the world changes every second and you take a picture every 10 seconds, you won’t have enough pictures to observe the changes clearly, and storing a series of pictures won’t help. On the other hand, if you take a picture every tenth of a second, then storing a history will help model the world.
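To make the mechanism concrete, here is a minimal replay-buffer sketch in plain Python. It illustrates the general idea of storing past transitions and sampling them later for training; it is not the specific setup analyzed in the research above.

```python
# A minimal sketch of an experience replay buffer, in plain Python.
# Generic illustration of the mechanism, not the study's specific setup.
import random
from collections import deque

class ReplayBuffer:
    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)   # oldest transitions fall off the end

    def add(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size):
        batch = random.sample(self.buffer, batch_size)
        return list(zip(*batch))               # list of tuples: states, actions, rewards, ...

# Toy usage: store 100 fake transitions, then draw a training batch.
buf = ReplayBuffer(capacity=50)
for t in range(100):
    buf.add(state=t, action=0, reward=1.0, next_state=t + 1, done=False)
states, actions, rewards, next_states, dones = buf.sample(8)
print(len(states))  # 8
```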
2 min read
Graphs and data related to semi-supervised learning
Machine Learning Research

All Examples Are Not Equal

Semi-supervised learning — a set of training techniques that use a small number of labeled examples and a large number of unlabeled examples — typically treats all unlabeled examples the same way. But some examples are more useful for learning than others.
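As a generic illustration of treating unlabeled examples unequally (not the specific scheme proposed in this work), the sketch below, assuming PyTorch, pseudo-labels an unlabeled batch and scales each example's loss by the model's confidence in its pseudo-label.

```python
# A minimal sketch of weighting unlabeled examples differently during
# semi-supervised training, assuming PyTorch. Generic illustration only;
# the weighting scheme here is not the one proposed in the paper.
import torch
import torch.nn.functional as F

def semi_supervised_loss(model, x_labeled, y_labeled, x_unlabeled, threshold=0.9):
    # Standard supervised loss on the labeled batch.
    sup_loss = F.cross_entropy(model(x_labeled), y_labeled)

    # Pseudo-label the unlabeled batch and weight each example by confidence.
    with torch.no_grad():
        probs = F.softmax(model(x_unlabeled), dim=-1)
        confidence, pseudo_labels = probs.max(dim=-1)
        weights = confidence * (confidence > threshold).float()

    unsup_losses = F.cross_entropy(model(x_unlabeled), pseudo_labels, reduction="none")
    unsup_loss = (weights * unsup_losses).sum() / weights.sum().clamp(min=1.0)
    return sup_loss + unsup_loss

# Toy usage with a linear classifier over 20 features and 5 classes.
model = torch.nn.Linear(20, 5)
x_l, y_l = torch.randn(8, 20), torch.randint(0, 5, (8,))
x_u = torch.randn(32, 20)
print(semi_supervised_loss(model, x_l, y_l, x_u).item())
```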
2 min read
Information related to Policy Adaptation during Deployment (PAD)
Machine Learning Research

Same Job, Different Scenery

People who take driving lessons during the daytime don’t need separate instruction to drive at night. They recognize that the change in scenery doesn’t alter how to drive. Similarly, a new reinforcement learning method handles superficial variations in the environment without retraining.
2 min read
