EfficientNet

6 Posts

Less Data for Vision Transformers

Vision Transformer (ViT) outperformed convolutional neural networks in image classification, but it required more training data. Two new techniques, Shifted Patch Tokenization (SPT) and Locality Self-Attention (LSA), enabled ViT and its variants to outperform other architectures with less training data.
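
Locality Self-Attention, one of the techniques mentioned above, is generally described as masking out each token's attention to itself and using a learnable temperature in place of the fixed softmax scaling. Below is a minimal single-head sketch of that idea in PyTorch; the shapes and names are illustrative, not the paper's implementation.

```python
import torch

def locality_self_attention(q, k, v, temperature):
    """Attention with the diagonal (self-token) masked out and a learnable
    temperature in place of the usual 1/sqrt(d) scaling.
    Shapes: q, k, v are (batch, tokens, dim); temperature is a scalar tensor."""
    scores = q @ k.transpose(-2, -1) / temperature
    n = scores.size(-1)
    diag = torch.eye(n, dtype=torch.bool, device=scores.device)
    scores = scores.masked_fill(diag, float("-inf"))  # suppress self-attention
    return torch.softmax(scores, dim=-1) @ v

# Toy usage with random token embeddings.
q = k = v = torch.randn(2, 16, 64)                   # batch of 2, 16 tokens each
temperature = torch.nn.Parameter(torch.tensor(8.0))  # learned during training
out = locality_self_attention(q, k, v, temperature)
print(out.shape)  # torch.Size([2, 16, 64])
```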

AI Sees Race in X-Rays

Researchers from Emory University, MIT, Purdue University, and other institutions found that deep learning systems trained to interpret x-rays and CT scans were also able to identify their subjects as Asian, Black, or White.

ImageNet Performance: No Panacea

It’s commonly assumed that models pretrained to achieve high performance on ImageNet will perform better on other visual tasks after fine-tuning. But is it always true? A new study reached surprising conclusions.
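
For reference, the workflow the study examines, ImageNet pretraining followed by fine-tuning, looks roughly like the sketch below (PyTorch/torchvision). The downstream task, class count, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained backbone and swap in a new head for a
# hypothetical 5-class downstream task.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 5)

# Freeze the pretrained backbone; train only the new classification head.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("fc.")

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

x = torch.randn(4, 3, 224, 224)      # dummy image batch
labels = torch.randint(0, 5, (4,))   # dummy labels
loss = nn.functional.cross_entropy(model(x), labels)
loss.backward()
optimizer.step()
```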

Flexible Teachers, Smarter Students

Human teachers can teach more effectively by adjusting their methods in response to student feedback. It turns out that teacher networks can do the same.
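
In the standard teacher-student (knowledge distillation) setup this work builds on, the student learns from the teacher's softened predictions alongside the ground-truth labels. The sketch below shows that baseline in PyTorch; the adaptive teacher described above, which updates itself in response to student feedback, is not shown, and all models and hyperparameters are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend soft-target loss (teacher guidance) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy models standing in for a large teacher and a small student.
teacher = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(8, 32)               # dummy input batch
labels = torch.randint(0, 10, (8,))  # dummy labels

with torch.no_grad():                # teacher provides soft targets
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()
optimizer.step()
```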

Easy on the Eyes

Researchers aiming to increase accuracy in object detection generally enlarge the network, but that approach also boosts computational cost. A novel architecture, EfficientDet, sets a new state of the art in accuracy while cutting the compute cycles required.

Self-Training for Sharper Vision

The previous state-of-the-art image classifier was trained on the ImageNet dataset plus 3.5 billion supplemental images from a different database. A new method, Noisy Student, achieved higher accuracy with one-tenth as many supplemental examples, and unlabeled ones at that.
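
Self-training of this kind generally has a trained model assign pseudo-labels to unlabeled images, then trains a new, noised model on the result. The sketch below outlines that loop in PyTorch; it is illustrative, not the exact recipe covered above, and the models, data, and labeling scheme are stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def pseudo_label(teacher, unlabeled_batches):
    """Label unlabeled batches with the teacher's predictions.
    (In practice, low-confidence predictions are often filtered out.)"""
    teacher.eval()
    images, targets = [], []
    with torch.no_grad():
        for x in unlabeled_batches:
            images.append(x)
            targets.append(teacher(x).argmax(dim=1))
    return torch.cat(images), torch.cat(targets)

# Toy stand-ins: a trained teacher and a student regularized with dropout noise.
teacher = nn.Sequential(nn.Linear(32, 10))
student = nn.Sequential(nn.Dropout(0.5), nn.Linear(32, 10))
unlabeled_batches = [torch.randn(16, 32) for _ in range(4)]  # dummy unlabeled data

x_pl, y_pl = pseudo_label(teacher, unlabeled_batches)
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
student.train()
loss = F.cross_entropy(student(x_pl), y_pl)  # plus the labeled-data loss in practice
loss.backward()
optimizer.step()
```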
