Self-Supervised Learning

9 Posts

The Big Picture and the Details: I-JEPA, or how vision models understand the relationship between parts and the whole

A novel twist on self-supervised learning aims to improve on earlier methods by helping vision models learn how parts of an image relate to the whole.

Alexei Efros: Learning from the ground up

Things are really starting to get going in the field of AI. After many years (decades?!) of focusing on algorithms, the AI community is finally ready to accept the central role of data and the high-capacity models that are capable of taking advantage of this data.

A Deeper Look at Graphs: Graph Neural Networks Work Better With More Layers

New research shows that drastically increasing the number of layers in a graph neural network improves its performance on large datasets.

Unlabeled Brainwaves Spill Secrets: Deep learning helps doctors interpret EEGs.

For people with neurological disorders like epilepsy, attaching sensors to the scalp to measure electrical currents within the brain is benign. But interpreting the resulting electroencephalogram (EEG) graphs can give doctors a headache.

Fewer Labels, More Learning: How SimCLRv2 improves image recognition with fewer labels

Large models pretrained in an unsupervised fashion and then fine-tuned on a smaller corpus of labeled data have achieved spectacular results in natural language processing. New research pushes forward with a similar approach to computer vision.

Self-Supervised Simplicity: Image classification with simple contrastive learning (SimCLR)

A simple linear classifier paired with a self-supervised feature extractor outperformed a supervised deep learning model on ImageNet, according to new research.
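The contrastive objective behind SimCLR pulls embeddings of two augmented views of the same image together while pushing apart embeddings of different images. As a rough illustration only (not code from the paper), here is a minimal NumPy sketch of that NT-Xent loss; the function name, shapes, and temperature are our own assumptions:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Minimal sketch of the NT-Xent contrastive loss used by SimCLR.
    z1, z2: (N, D) embeddings of two augmented views of the same N images.
    No numerical-stability tricks; illustration only."""
    z = np.concatenate([z1, z2], axis=0)               # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # scaled cosine similarities
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    # each view i in z1 pairs with view i + N in z2, and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim[np.arange(2 * n), pos] - np.log(np.exp(sim).sum(axis=1))
    return -log_prob.mean()
```

After pretraining with a loss like this, the feature extractor is frozen and a plain linear classifier is trained on top, which is the setup the result above describes.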
Association for the Advancement of Artificial Intelligence conference in New York

Meeting of the Minds: Deep learning pioneers discuss the state of AI.

Geoffrey Hinton, Yoshua Bengio, and Yann LeCun presented their latest thinking about deep learning’s limitations and how to overcome them.

Better Than Backprop: Greedy InfoMax trains AI without end-to-end backpropagation.

End-to-end backpropagation and labeled data are the peanut butter and chocolate of deep learning. However, recent work suggests that neither is necessary to train effective neural networks to represent complex data.

Yann LeCun — Learning From Observation: The power of self-supervised learning

How is it that many people learn to drive a car fairly safely in 20 hours of practice, while current imitation learning algorithms take hundreds of thousands of hours, and reinforcement learning algorithms take millions of hours? Clearly we’re missing something big.
