Overfitting
Learning After Overfitting: Transformers Keep Learning Even After They Overfit the Data
When a model trains for too long, it can overfit, essentially memorizing the training data, which degrades its ability to generalize to similar but unseen inputs. But what if training continues anyway? New work found that overfitting isn’t the end of the line.