Simple Contrastive Learning (SimCLR)

4 Posts


Unsupervised Prejudice: Image classification models learned bias from ImageNet.

Social biases are well documented in decisions made by supervised models trained on ImageNet’s labels. But they also crept into the output of unsupervised models pretrained on the same dataset.

Fewer Labels, More Learning: How SimCLRv2 improves image recognition with fewer labels

Large models pretrained in an unsupervised fashion and then fine-tuned on a smaller corpus of labeled data have achieved spectacular results in natural language processing. New research applies a similar approach to computer vision.

RL and Feature Extraction Combined: CURL combines reinforcement with contrastive learning.

Which comes first, training a reinforcement learning model or extracting high-quality features? New work avoids this chicken-or-egg dilemma by doing both simultaneously.

Self-Supervised Simplicity: Image classification with simple contrastive learning (SimCLR)

A simple linear classifier paired with a self-supervised feature extractor outperformed a supervised deep learning model on ImageNet, according to new research.
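The self-supervised feature extractor described above is trained with a contrastive objective: embeddings of two augmented views of the same image are pulled together, while embeddings of other images in the batch are pushed apart. As an illustrative sketch only (not code from the research covered here), SimCLR's NT-Xent loss can be written in a few lines of NumPy; the function name, array shapes, and default temperature below are assumptions for the example.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Sketch of the NT-Xent loss over two augmented views of a batch.

    z1, z2: (N, d) embeddings; row i of z1 and row i of z2 come from two
    augmentations of the same image (the positive pair). All other
    2N - 2 rows in the combined batch serve as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize rows
    sim = (z @ z.T) / temperature                     # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    n = len(z1)
    # Row i's positive sits at row i + n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Numerically stable log-softmax over each row.
    row_max = sim.max(axis=1, keepdims=True)
    log_denom = row_max + np.log(np.exp(sim - row_max).sum(axis=1, keepdims=True))
    log_prob = sim - log_denom
    return -log_prob[np.arange(2 * n), pos].mean()
```

When the two views of each image embed close together, the loss is low; when positives are mismatched, it rises, which is the signal that shapes the feature extractor before the linear classifier is trained on top.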
