GPU Data Centers Strain Grid Power: AI’s electricity demands spur an expansion of power sources.


The AI boom is taxing power grids and pushing builders of data centers to rethink their sources of electricity.

What’s new: New data centers packed with GPUs optimized for AI workloads are being approved at a record pace, The Information reported. The extreme energy requirements of such chips are pushing builders to place data centers near inexpensive power sources, which may be far away from where users live.

How it works: The coming generation of GPU data centers promises to supply processing power for the burgeoning AI era. But builders aren’t always able to find electricity to run them.

  • In the data center hub of Northern Virginia, power company Dominion Energy stopped connecting new data centers for three months in 2022. It warned that future connections would remain in question until 2026.
  • Although many data center operators pledged to rely on energy sources other than fossil fuels, their rising demand for power has made that difficult, Bloomberg reported. Regulators in Virginia considered allowing data centers to use diesel generators before they abandoned that plan under pressure from environmental groups. In Kansas City, Missouri, Meta’s apparent plan to build a giant data center helped convince one utility to postpone the planned retirement of a coal plant.
  • Some companies that rely on data centers are looking into less conventional power sources. Microsoft is considering small, modular nuclear reactors that, while largely speculative, promise to be less expensive and more flexible than traditional nuclear power plants. Microsoft recently appointed a director of nuclear technologies.

What they’re saying: “We still don’t appreciate the energy needs of [AI] technology. There’s no way to get there without a breakthrough.” — Sam Altman, CEO, OpenAI, on January 16, 2024, quoted by Reuters.

Behind the news: Data centers alone account for 1 to 1.5 percent of global demand for electricity. It’s unclear how much of that figure is attributable to AI, but the share is likely to grow. 

Why it matters: The world needs innovation in both energy resources and power-efficient machine learning. The dawning era of pervasive AI brings with it the challenge of producing energy to develop and deploy the technology, which can contribute to pollution that disrupts ecosystems and accelerates climate change. Fortunately, AI can shrink the environmental footprint of some energy-intensive activities; for example, searching the web for information generates far lower CO2 emissions than driving to a library.

We’re thinking: Climate change is a slow-motion tragedy. We must push toward AI infrastructure that uses less energy (for example, by using more efficient algorithms or hardware) and emits less carbon (for example, by using renewable sources of energy). That said, concentrating computation in a data center creates a point of significant leverage for optimizing energy usage. For example, it’s more economical to raise the energy efficiency of 10,000 servers in a data center than 10,000 PCs that carry out the same workload in 10,000 homes. 
