Giant Models Bankrupt Research
Will training AI become too expensive for most companies?

[Illustration: a neighborhood haunted by an evil pumpkin and a black cat]

What if AI requires so much computation that it becomes unaffordable?
The fear: Training ever more capable models will become too pricey for all but the richest corporations and government agencies. Rising costs will throttle progress as startups, academics, and students — especially in emerging economies — are left out in the cold. Customers will turn away from AI in search of less costly alternatives.

Behind the worries: Training a model to beat the top image classification and object detection benchmarks currently costs millions of dollars. And that cost is rising fast: The processing power required to train state-of-the-art models doubled every 3.4 months between 2012 and 2018, according to a study by OpenAI.

  • The high cost of beating the state of the art has prompted some institutions to rethink their approach. OpenAI, founded as a nonprofit lab, has morphed into a for-profit company. Last month, the organization granted Microsoft an exclusive commercial license for its GPT-3 language model.
  • A European grocery store chain recently decided against deploying an inventory tracking model due to the cost of cloud computing, Wired reported.
  • AI’s environmental impact is growing as training consumes increasing quantities of energy. A 2019 paper from the University of Massachusetts concluded that training a large language model produced five times as much carbon dioxide as an average car spews over its entire working life.
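
To get a feel for how the 3.4-month doubling compounds, here is a back-of-envelope sketch in Python; the multi-year spans are illustrative assumptions rather than figures from OpenAI’s study.

```python
# Back-of-envelope: how quickly compute demand compounds if it doubles every
# 3.4 months (the figure from OpenAI's study cited above). The multi-year
# spans below are illustrative assumptions, not numbers from that study.

DOUBLING_TIME_MONTHS = 3.4

def growth_factor(months: float) -> float:
    """Multiplicative growth in training compute over a span of months."""
    return 2 ** (months / DOUBLING_TIME_MONTHS)

for years in (1, 3, 5):
    print(f"{years} year(s): ~{growth_factor(12 * years):,.0f}x more compute")

# Roughly 12x per year, so a five-year span works out to a factor on the
# order of 10^5.
```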

How scared should you be: The massive inflation in training costs comes from trying to beat the best models. If you can make do with something less capable, the price drops dramatically. The cost to train an image classification model to 93 percent top-5 accuracy on ImageNet fell from $2,523 in 2017 to $13 the following year, according to a Stanford report. Pretrained models like Hugging Face’s implementations of popular language models, and APIs like the one OpenAI offers for GPT-3, make access to high-end AI even less expensive.
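
To illustrate the point about pretrained models, here is a minimal sketch using Hugging Face’s transformers library, assuming it and a PyTorch backend are installed; the sample sentence is ours, and sentiment analysis is just one example of what the library’s pipelines cover.

```python
# A minimal sketch of using a pretrained model instead of training one.
# Assumes the Hugging Face transformers library and a PyTorch backend are
# installed (pip install transformers torch); the model weights are
# downloaded automatically the first time this runs.
from transformers import pipeline

# Loads a small pretrained sentiment-analysis model; inference runs in
# seconds on an ordinary laptop CPU, with no training cost at all.
classifier = pipeline("sentiment-analysis")

print(classifier("Pretrained models put high-end AI within reach of small teams."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```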

What to do: Researchers at the Allen Institute for AI and elsewhere argue that we should consider a model’s energy efficiency to be just as important as accuracy. Meanwhile, policymakers and executives who see the value in fostering competition should work to boost research funding and access to compute resources.
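
One lightweight way to put that advice into practice is to report an energy estimate next to accuracy. The sketch below is a rough illustration under stated assumptions, not the Allen Institute’s method: it multiplies wall-clock training time by an assumed average GPU power draw, and its training function is a hypothetical stand-in.

```python
import time

# A rough sketch of reporting an energy estimate alongside accuracy. The
# 300 W figure is an assumed average draw for a single GPU, and
# train_and_evaluate() is a hypothetical stand-in for a real training loop.

ASSUMED_AVG_POWER_WATTS = 300  # placeholder value

def train_and_evaluate() -> float:
    """Hypothetical training routine; returns a validation accuracy."""
    time.sleep(1.0)  # pretend to train
    return 0.93

start = time.time()
accuracy = train_and_evaluate()
elapsed_hours = (time.time() - start) / 3600

energy_kwh = ASSUMED_AVG_POWER_WATTS * elapsed_hours / 1000
print(f"accuracy: {accuracy:.3f}")
print(f"estimated energy: {energy_kwh:.6f} kWh")
```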
