Fudan University

3 Posts

Memory-Efficient Optimizer: A method to reduce memory needs when fine-tuning AI models.

Researchers devised a way to reduce memory requirements when fine-tuning large language models. Kai Lv and colleagues at Fudan University proposed low memory optimization (LOMO), a modification of stochastic gradient descent that stores less data than other optimizers during fine-tuning.

Do Muppets Have Common Sense?: The BERT NLP model scores high on a common-sense test.

Two years after it pointed a new direction for language models, BERT still hovers near the top of several natural language processing leaderboards. A new study considers whether BERT simply excels at tracking word order or learns something closer to common sense.

Secret Identity: Invisible patterns hide faces from AI.

Hoping to keep surveillance capitalists from capitalizing on your face? Safeguard your selfies with a digital countermeasure. Researchers devised a program that subtly alters portrait photos to confuse face recognition models without any distortion visible to the human eye.
