Tsinghua University



Fine-Tune Your Fine-Tuning: New method optimizes training for few-shot NLP models.

Let’s say you have a pretrained language model and a small amount of data with which to fine-tune it to answer yes-or-no questions. Should you fine-tune it to classify yes/no, or to fill in missing words? Both approaches are viable, but they are likely to yield different results.
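The two formulations can be contrasted in toy form. This is a minimal NumPy sketch, not any real model: the embedding, vocabulary, and logits are random stand-ins chosen only to show where each approach gets its output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a pretrained encoder's outputs (illustrative values only):
# `hidden` plays the role of a sentence embedding, `vocab_logits` the logits a
# fill-in-the-blank (cloze) head would produce at a masked position.
hidden = rng.normal(size=16)
vocab = ["yes", "no", "maybe", "the"]   # tiny illustrative vocabulary
vocab_logits = rng.normal(size=len(vocab))

# Option 1: classification head. A new, randomly initialized linear layer maps
# the sentence embedding to two labels. Its weights carry no pretrained
# knowledge, so the few labeled examples must train them from scratch.
W = rng.normal(size=(2, 16))
cls_logits = W @ hidden
p_cls = np.exp(cls_logits) / np.exp(cls_logits).sum()

# Option 2: cloze formulation. Phrase the input as, say,
# "Question ... Answer: [MASK]" and compare the pretrained vocabulary logits
# for the verbalizer tokens "yes" and "no". No new weights are introduced.
verbalizer = [vocab.index("yes"), vocab.index("no")]
cloze_logits = vocab_logits[verbalizer]
p_cloze = np.exp(cloze_logits) / np.exp(cloze_logits).sum()

print("classification-head probabilities:", p_cls)
print("cloze/verbalizer probabilities:   ", p_cloze)
```

The practical difference: the cloze route reuses the pretrained word-prediction head, which tends to matter most when labeled data is scarce.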

Tsinghua University’s Harry Shum on How AI Is Changing Creativity

In 2021, I envision that the AI community will create more tools to unleash human creativity. AI will help people across the globe to communicate and express emotions and moods in their own unique ways.

Augmentation for Features: A technique for boosting underrepresented data classes

In any training dataset, some classes may have relatively few examples. Researchers introduced a technique that improves a trained model’s performance on such underrepresented classes by synthesizing new examples of their extracted features.
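The general idea of feature-space augmentation can be sketched as follows. This is a generic mixup-style stand-in, not the paper’s exact method: the features are random placeholders for penultimate-layer activations, and the interpolation-plus-noise synthesizer is an assumption chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical extracted features (e.g., penultimate-layer activations):
# 200 examples of a common class, only 5 of a rare one.
common = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
rare = rng.normal(loc=2.0, scale=1.0, size=(5, 8))

def synthesize(features, n_new, noise_scale=0.1):
    """Synthesize new feature vectors by interpolating random pairs of real
    ones and adding small Gaussian noise (an illustrative stand-in for the
    paper's synthesis procedure)."""
    i = rng.integers(0, len(features), size=n_new)
    j = rng.integers(0, len(features), size=n_new)
    lam = rng.uniform(0.0, 1.0, size=(n_new, 1))
    mixed = lam * features[i] + (1.0 - lam) * features[j]
    return mixed + rng.normal(scale=noise_scale, size=mixed.shape)

# Balance the rare class at the feature level before training the classifier.
synthetic = synthesize(rare, n_new=195)
balanced_rare = np.vstack([rare, synthetic])
print(common.shape, balanced_rare.shape)
```

Working in feature space rather than input space is what makes this cheap: no images or sentences are generated, only the vectors the final classifier layer actually sees.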

Inside AI’s Muppet Empire: Why Are So Many NLP Models Named After Muppets?

As language models show increasing power, a parallel trend has received less notice: The vogue for naming models after characters in the children’s TV show Sesame Street.
