Contrastive Learning (MICLe)

[Figure: a training step that uses different views of the same patient to enhance unsupervised pretraining]

Same Patient, Different Views

When labeled training data is scarce, pretraining a model on unlabeled data can compensate. New research pretrained a model in three successive stages to boost performance on a medical imaging task.
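The article itself contains no code, but the core idea it describes, contrastive pretraining that treats different images of the same patient as a positive pair, can be sketched with a standard NT-Xent-style loss. The sketch below is a minimal illustration, not the authors' implementation: the function name `nt_xent_loss`, the temperature value, and the use of plain NumPy are all assumptions made here for clarity.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """Simplified NT-Xent contrastive loss (illustrative sketch).

    z1[i] and z2[i] are embeddings of two different views of the same
    example (e.g. two images of the same patient); every other pairing
    in the batch serves as a negative.
    """
    # L2-normalize so the dot product becomes cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)      # shape (2N, d)
    sim = z @ z.T / temperature               # pairwise similarities
    n = len(z1)
    # a sample must never count as its own negative
    np.fill_diagonal(sim, -np.inf)
    # each sample's positive partner sits n positions away: i <-> i + n
    pos = np.concatenate([np.arange(n) + n, np.arange(n)])
    # cross-entropy: -log softmax probability assigned to the positive
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)
    return loss.mean()
```

Intuitively, the loss is small when the two views of the same patient map to nearby embeddings and all other patients map far away; pulling identical views together should therefore yield a lower loss than mismatched pairings.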