
Shifted Patch Tokenization (SPT) | Locality Self-Attention (LSA)

Less Data for Vision Transformers: Boosting Vision Transformer Performance with Less Data

Vision Transformer (ViT) outperformed convolutional neural networks in image classification, but it required far more training data to do so. New work enabled ViT and its variants to outperform other architectures with less training data.
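One of the work's two techniques, Shifted Patch Tokenization, gives each patch token more local context by stacking diagonally shifted copies of the image before patchifying. A rough NumPy sketch (our own, not the authors' code; the paper zero-pads and crops, whereas `np.roll` is used here for brevity):

```python
import numpy as np

def shifted_patch_tokenize(img, patch=4):
    """Sketch of Shifted Patch Tokenization (SPT): concatenate the image
    with four diagonally shifted copies along the channel axis, then
    split the result into non-overlapping patch tokens. img is (H, W, C)."""
    H, W, C = img.shape
    s = patch // 2
    shifts = [(-s, -s), (-s, s), (s, -s), (s, s)]
    stacked = [img] + [np.roll(img, (dy, dx), axis=(0, 1)) for dy, dx in shifts]
    x = np.concatenate(stacked, axis=-1)  # (H, W, 5C): richer local context per patch
    # Flatten into (num_patches, patch * patch * 5C) tokens.
    tokens = x.reshape(H // patch, patch, W // patch, patch, 5 * C)
    tokens = tokens.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * 5 * C)
    return tokens

tokens = shifted_patch_tokenize(np.zeros((32, 32, 3)), patch=4)
# 64 patches, each 4 * 4 * 15 = 240 dimensions
```

The companion technique, Locality Self-Attention, sharpens the attention map by masking each token's attention to itself and learning the softmax temperature.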

AI Sees Race in X-Rays

Researchers from Emory University, MIT, Purdue University, and other institutions found that deep learning systems trained to interpret x-rays and CT scans could also identify their subjects as Asian, Black, or White.

ImageNet Performance, No Panacea: ImageNet pretraining won't always improve computer vision.

It’s commonly assumed that models pretrained to achieve high performance on ImageNet will perform better on other visual tasks after fine-tuning. But is that always true? A new study reached surprising conclusions.

Flexible Teachers, Smarter Students: Meta Pseudo Labels improves knowledge distillation.

Human teachers can teach more effectively by adjusting their methods in response to student feedback. It turns out that teacher networks can do the same.
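The feedback loop can be illustrated with a toy scalar model (our own sketch, not the paper's algorithm, which computes this meta-gradient through full networks): the student chases the teacher's pseudo-label, and the teacher is then nudged to reduce the student's loss on genuinely labeled data.

```python
# Toy sketch of the Meta Pseudo Labels feedback loop on scalars.
# Hypothetical setup: t is the teacher's pseudo-label, s is the
# student's prediction; true_label plays the held-out labeled example.
true_label = 1.0
t, s = 0.2, 0.0
lr = 0.5
for _ in range(50):
    # Student step: move toward the teacher's pseudo-label.
    s -= lr * 2 * (s - t)
    # Teacher step: feedback from the student's loss on the real label.
    t -= lr * 2 * (s - true_label)
# Both t and s drift toward the true label: the teacher's pseudo-labels
# improve because they are judged by how well they train the student.
```

A teacher trained in isolation would have no reason to move toward the true label; the student's feedback supplies that signal.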

Easy on the Eyes: More accurate object detection with EfficientDet

Researchers aiming to increase accuracy in object detection generally enlarge the network, but that approach also boosts computational cost. A novel architecture sets a new state of the art in accuracy while cutting the compute cycles required.
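EfficientDet controls that trade-off with compound scaling: a single coefficient phi grows the feature network's width and depth and the input resolution together, rather than enlarging one dimension at will. A sketch using the BiFPN coefficients published in the paper (the helper function is ours):

```python
def bifpn_config(phi):
    """Compound scaling for the EfficientDet family (D0 at phi=0 up to D7).
    The formulas for BiFPN width/depth and input resolution follow the paper."""
    width = int(64 * 1.35 ** phi)    # BiFPN channels
    depth = 3 + phi                  # BiFPN layers
    resolution = 512 + 128 * phi     # input image size
    return width, depth, resolution

print(bifpn_config(0))  # D0 -> (64, 3, 512)
```

Scaling all dimensions jointly is what lets each model in the family sit near the accuracy/compute frontier instead of wasting capacity in one dimension.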

Self-Training for Sharper Vision: The noisy student method for computer vision, explained

The previous state-of-the-art image classifier was trained on the ImageNet dataset plus 3.5 billion supplemental images from a different database. A new method achieved higher accuracy with one-tenth as many supplemental examples — and they were unlabeled, to boot.
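The Noisy Student loop has three moves: train a teacher on labeled data, pseudo-label the unlabeled pool, then train an equal-or-larger student on both sets with noise added. A structural sketch with a stand-in one-dimensional "model" (the fit/predict helpers are placeholders, not the paper's EfficientNet; in the paper the noise is dropout and data augmentation rather than Gaussian jitter):

```python
import random
random.seed(0)

def fit(data):
    """Stand-in 'training': a threshold halfway between the class means."""
    lo = [x for x, y in data if y == 0]
    hi = [x for x, y in data if y == 1]
    return (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2

def predict(model, x):
    """Stand-in 'inference': classify by which side of the threshold x falls."""
    return 1 if x >= model else 0

labeled = [(0.0, 0), (1.0, 0), (3.0, 1), (4.0, 1)]
unlabeled = [0.5, 3.5, 4.2]

teacher = fit(labeled)                                   # 1) train teacher on labels
pseudo = [(x, predict(teacher, x)) for x in unlabeled]   # 2) pseudo-label unlabeled data
noisy = [(x + random.gauss(0, 0.01), y)                  # 3) inject noise (jitter here)
         for x, y in labeled + pseudo]
student = fit(noisy)                                     # 4) train student on both sets
```

The loop can then repeat with the student as the new teacher, which is how the method iterates toward higher accuracy.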
