Chatbot-fueled FOMO is overwhelming cloud-computing services.
What’s new: Cloud providers are struggling to meet sharply rising demand from a crowd of AI startups eager to cash in on generative AI, The Information reported.
Behind the bottleneck: The surge in demand caught Amazon Web Services, Microsoft Azure, and others off guard.
- Some cloud providers didn’t place their orders for extra AI chips early enough, while Nvidia, which manufactures the specialized GPUs that process many AI workloads, typically takes months to fulfill orders. (Google Cloud, which uses proprietary TPU chips, said it has been able to meet nearly all its customer demand.)
- Microsoft has been rationing GPU access for its internal teams. Microsoft partner OpenAI has had to slow down development.
- Electrical power is in short supply in Northern Virginia and Northern California’s Silicon Valley, two of the biggest data-center markets. The shortages have driven up cloud computing costs and further strained server capacity.
What they’re saying: Engineers and entrepreneurs shared their pain.
- Yasyf Mohamedali, engineer in residence at venture capital firm Root Ventures, said it was impossible to find servers without prepayment or an existing contract.
- Naveen Rao, CEO of startup MosaicML, said customers who had committed to multi-year spending had better luck gaining access to large blocks of servers.
- Some startups are turning to smaller cloud providers like RunPod, Lambda Labs, Crusoe Energy, and CoreWeave. However, even these firms are struggling to meet demand, said Stephen Balaban, CEO and co-founder of Lambda Labs.
- Even customers that get access to cloud servers often lack sufficient capacity, said Johnny Dallas, founder and CEO of Zeet, which automates management of cloud services.
Behind the news: China is facing its own chip shortage — and finding ways to address it. That situation, though, is a result of United States trade sanctions rather than a surge in demand.
Why it matters: Startups that serve a market with generated text or pictures are white-hot, but even the most promising ventures can’t do without servers to build, test, and deploy their models. The winners will need not only a great product but also ready access to computation.
We’re thinking: Our hearts go out to everyone who is trying to build AI products in these unpredictable times. We trust that the supply of compute will catch up in due course and that the current run of AI-fueled growth will continue for the foreseeable future.