University of Tübingen

3 Posts

[Image: Graph showing the difference in test error when keeping hard versus easy examples]

Unsupervised Data Pruning: New method removes useless machine learning data.

Large datasets often contain redundant, overly similar examples that consume training cycles without contributing much to learning. A new paper shows how to identify and prune such examples even when they're unlabeled.
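
For readers who want to experiment, here's a minimal sketch (in Python) of one way to prune unlabeled data: embed the examples, say with a self-supervised model, cluster the embeddings, and drop the examples closest to their cluster centroid. The random embeddings, cluster count, and keep fraction below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of unsupervised data pruning (illustrative assumptions, not
# the paper's exact method): cluster embeddings with k-means and drop the
# examples closest to their cluster centroid, i.e. the most redundant ones.
import numpy as np
from sklearn.cluster import KMeans

def prune_by_prototype_distance(embeddings, keep_frac=0.8, n_clusters=10):
    """Return indices of examples to keep, preferring those far from their centroid."""
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(embeddings)
    # Distance of each example to its assigned cluster centroid.
    dists = np.linalg.norm(embeddings - kmeans.cluster_centers_[kmeans.labels_], axis=1)
    n_keep = int(len(embeddings) * keep_frac)
    # Keep the hardest (most distant) examples; prune the most prototypical ones.
    return np.argsort(dists)[-n_keep:]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_embeddings = rng.normal(size=(1000, 64))  # stand-in for self-supervised features
    kept = prune_by_prototype_distance(fake_embeddings)
    print(f"Kept {len(kept)} of {len(fake_embeddings)} examples")
```
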
[Image: Graphs comparing and evaluating 14 different optimizers]

Optimizer Shootout: An evaluation of 14 deep learning optimizers

Everyone has a favorite optimization method, but it's not always clear which one works best in a given situation. New research aims to establish a set of benchmarks. Researchers evaluated 14 popular optimizers using the Deep Optimization Benchmark Suite, which some of them introduced last year.
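
The loop below is a rough illustration (in Python/PyTorch) of how such a head-to-head comparison works in principle: train the same small model on the same toy task with each optimizer and compare the final loss. It does not use the benchmark suite's API; the model, task, and hyperparameters are assumptions chosen for brevity.

```python
# Illustrative optimizer comparison: train an identical small model on an
# identical toy regression task with several optimizers and report final MSE.
# Not the benchmark suite's API; setup chosen only for brevity.
import torch
import torch.nn as nn

def final_loss(optimizer_cls, steps=500, lr=1e-2, **opt_kwargs):
    torch.manual_seed(0)
    X = torch.randn(512, 10)
    y = X @ torch.randn(10, 1) + 0.1 * torch.randn(512, 1)  # noisy linear target
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
    opt = optimizer_cls(model.parameters(), lr=lr, **opt_kwargs)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return loss.item()

if __name__ == "__main__":
    for name, cls in [("SGD", torch.optim.SGD),
                      ("Adam", torch.optim.Adam),
                      ("RMSprop", torch.optim.RMSprop)]:
        print(f"{name}: final MSE = {final_loss(cls):.4f}")
```
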
[Image: Illustration related to shortcut learning]

When Models Take Shortcuts: The causes of shortcut learning in neural networks

Neuroscientists once thought they had trained rats to navigate mazes by color, but rats don't perceive color. Instead, the animals relied on the distinct odors of different paints. New work finds that neural networks are prone to a similar misalignment between what training is meant to teach and what a model actually learns.
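
The toy example below sketches the phenomenon: a spurious "shortcut" feature perfectly predicts the label during training, so the classifier leans on it and largely ignores the weaker genuine signal; when the shortcut is randomized at test time, accuracy collapses toward chance. The synthetic data and logistic-regression model are illustrative assumptions, not the paper's setup.

```python
# Toy demonstration of shortcut learning: during training a spurious feature
# perfectly predicts the label, so the model relies on it; at test time the
# shortcut is randomized and accuracy drops toward chance.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n, shortcut_correlated):
    y = rng.integers(0, 2, size=n)
    real = y + rng.normal(scale=2.0, size=n)  # weak but genuine signal
    if shortcut_correlated:
        shortcut = y.astype(float)            # shortcut perfectly tracks the label
    else:
        shortcut = rng.integers(0, 2, size=n).astype(float)  # shortcut randomized
    return np.column_stack([real, shortcut]), y

X_train, y_train = make_data(2000, shortcut_correlated=True)
X_test, y_test = make_data(2000, shortcut_correlated=False)

clf = LogisticRegression().fit(X_train, y_train)
print("train accuracy:", clf.score(X_train, y_train))  # near 1.0 thanks to the shortcut
print("test accuracy: ", clf.score(X_test, y_test))    # near chance once the shortcut is gone
```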
