Google Brain

3 Posts

Graphs comparing SimCLR to SimCLRv2

Fewer Labels, More Learning

Large models pretrained in an unsupervised fashion and then fine-tuned on a smaller corpus of labeled data have achieved spectacular results in natural language processing. New research applies a similar approach to computer vision.
2 min read
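A minimal sketch of the recipe this blurb describes, assuming a SimCLR-style setup: contrastive pretraining on unlabeled images, then fine-tuning with a small labeled subset. The tiny encoder, Gaussian-noise "augmentations," and random tensors are placeholders for illustration, not the paper's architecture or data.

```python
# Stage 1: learn representations from unlabeled images with a contrastive
# (NT-Xent style) loss over two augmented views of each image.
# Stage 2: fine-tune the pretrained encoder plus a small head on few labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive loss: each sample's positive is its other augmented view."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, d)
    sim = z @ z.t() / temperature                             # pairwise similarities
    n = z1.size(0)
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float('-inf'))
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
                        nn.Linear(256, 128))                  # stand-in backbone
proj = nn.Linear(128, 64)                                     # projection head
opt = torch.optim.Adam(list(encoder.parameters()) + list(proj.parameters()), lr=1e-3)

# --- Stage 1: unsupervised pretraining (no labels used) ---
for _ in range(10):
    x = torch.rand(64, 3, 32, 32)                             # unlabeled batch
    v1 = x + 0.1 * torch.randn_like(x)                        # placeholder augmentations
    v2 = x + 0.1 * torch.randn_like(x)
    loss = nt_xent_loss(proj(encoder(v1)), proj(encoder(v2)))
    opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: fine-tune on a small labeled subset ---
head = nn.Linear(128, 10)
ft_opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(10):
    x, y = torch.rand(16, 3, 32, 32), torch.randint(0, 10, (16,))
    loss = F.cross_entropy(head(encoder(x)), y)
    ft_opt.zero_grad(); loss.backward(); ft_opt.step()
```

The point of the two stages is that the encoder's weights are learned without any labels; the labeled data is needed only to attach and adjust a classification head.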
Data related to experience replay

Experience Counts

If the world changes every second and you take a picture every 10 seconds, you won’t have enough pictures to observe the changes clearly, and storing a series of pictures won’t help. On the other hand, if you take a picture every tenth of a second, then storing a history will help model the world.
2 min read
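To make the analogy concrete, here is a minimal replay-buffer sketch, assuming a generic reinforcement-learning loop rather than the paper's specific agent. The capacity and timing numbers in the comments are illustrative only.

```python
# Each stored "picture" is a transition. Whether replaying old ones helps
# depends on how fast the world changes relative to how often you record.
import random
from collections import deque

class ReplayBuffer:
    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)   # oldest snapshots fall off the end

    def store(self, transition):
        self.buffer.append(transition)

    def sample(self, batch_size):
        return random.sample(list(self.buffer), min(batch_size, len(self.buffer)))

# If transitions are recorded every 0.1s while the environment drifts on a
# roughly 1s timescale, a buffer of 100 transitions still describes the
# current world, so replaying it helps. Recorded every 10s, most of the
# buffer describes a world that no longer exists.
buffer = ReplayBuffer(capacity=100)
for step in range(500):
    buffer.store((step, "state", "action", "reward", "next_state"))
batch = buffer.sample(32)
```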
Graph related to Noisy Student performance on ImageNet

Self-Training for Sharper Vision

The previous state-of-the-art image classifier was trained on the ImageNet dataset plus 3.5 billion supplemental images from a different database. A new method achieved higher accuracy with one-tenth as many supplemental examples — and they were unlabeled, to boot.
2 min read
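A minimal sketch of the self-training loop the title refers to: a teacher trained on labeled data assigns pseudo-labels to unlabeled images, and a student then learns from both under noise. The toy models, the noise, and the random tensors below are assumptions for illustration, not the paper's actual models or datasets.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model():
    # Dropout doubles as the "noise" applied to the student during training.
    return nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
                         nn.Dropout(0.5), nn.Linear(256, 10))

labeled_x, labeled_y = torch.rand(128, 3, 32, 32), torch.randint(0, 10, (128,))
unlabeled_x = torch.rand(512, 3, 32, 32)                  # supplemental, unlabeled

# 1) Train the teacher on the labeled set only.
teacher = make_model()
opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
for _ in range(20):
    loss = F.cross_entropy(teacher(labeled_x), labeled_y)
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Use the teacher to pseudo-label the unlabeled images.
teacher.eval()
with torch.no_grad():
    pseudo_y = teacher(unlabeled_x).argmax(dim=1)

# 3) Train a noised student on labeled + pseudo-labeled data.
student = make_model()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
all_x = torch.cat([labeled_x, unlabeled_x])
all_y = torch.cat([labeled_y, pseudo_y])
for _ in range(20):
    noisy_x = all_x + 0.05 * torch.randn_like(all_x)      # input noise
    loss = F.cross_entropy(student(noisy_x), all_y)
    opt.zero_grad(); loss.backward(); opt.step()
```

In the full Noisy Student procedure this loop is iterated, with the student becoming the next teacher; the sketch shows a single round.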
