
Figure: A training step that uses different perspectives of the same patient to enhance unsupervised pretraining.

Same Patient, Different Views: Contrastive pretraining improves medical imaging AI.

When you lack labeled training data, pretraining a model on unlabeled data can compensate. New research pretrained a model in three successive stages to boost performance on a medical imaging task.
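The core idea behind this kind of pretraining is to treat two different images of the same patient as a positive pair in a contrastive loss, so the encoder learns features that stay stable across views. Below is a minimal PyTorch sketch of such a multiple-instance contrastive objective; the `nt_xent_loss` helper, the stand-in encoder, and all shapes are illustrative assumptions, not the researchers' actual implementation.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """Normalized temperature-scaled cross-entropy (NT-Xent) loss.

    z1[i] and z2[i] embed two views of the same patient (a positive pair);
    every other embedding in the batch serves as a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)            # (2n, d)
    sim = z @ z.t() / temperature             # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))         # never match an image to itself
    # Row i's positive is the other view of the same patient: i+n or i-n.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Toy usage: each batch pairs two images per patient (e.g., two dermatology
# photos taken from different angles). The encoder is a stand-in for a CNN.
encoder = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(3 * 224 * 224, 128),
)
view_a = torch.randn(8, 3, 224, 224)          # first image of 8 patients
view_b = torch.randn(8, 3, 224, 224)          # second image of the same 8
loss = nt_xent_loss(encoder(view_a), encoder(view_b))
loss.backward()
```

Minimizing this loss pulls embeddings of the same patient together and pushes different patients apart, which is one way to exploit unlabeled multi-view data before fine-tuning on a small labeled set.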
