The for-profit research organization OpenAI has a new supercomputer to help achieve its dream of building the world’s most sophisticated AI.

What’s new: Microsoft engineered the new hardware network to train immense models on thousands of images, texts, and videos simultaneously.

How it works: Hosted on Microsoft’s Azure cloud platform, the system comprises 10,000 GPUs and 285,000 CPU cores.

  • OpenAI has exclusive access to the new network.
  • The company believes that putting enormous computing power behind existing models could lead to artificial general intelligence (AGI) capable of reasoning across a variety of domains.
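
Neither company has detailed the training software, but spreading a single model across thousands of GPUs typically relies on distributed data parallelism: each GPU processes its own shard of the data, and gradients are averaged across devices after every step. The sketch below illustrates that pattern with PyTorch’s DistributedDataParallel; the model, data, and hyperparameters are placeholders, not details of the actual system.

```python
# Minimal data-parallel training sketch (illustrative only, not OpenAI's code).
# Launch one process per GPU with torchrun, e.g.:
#   torchrun --nproc_per_node=8 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, WORLD_SIZE, and LOCAL_RANK for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model; a real run would use a large language or vision model.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        # Each process trains on its own shard of data; DDP averages
        # gradients across all GPUs during backward().
        x = torch.randn(32, 1024, device=local_rank)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```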

Behind the news: In 2019, Microsoft invested $1 billion in OpenAI in exchange for the first shot at commercializing the research outfit’s innovations. Built using an undisclosed portion of that investment, the new system ranks among the world’s five most powerful computers.

Yes, but: While some experts see AGI on the horizon, others are less sanguine. Prominent researchers including Yann LeCun, Jerome Pesenti, Geoffrey Hinton, and Demis Hassabis have thrown cold water on AGI’s prospects.

Why it matters: OpenAI and Microsoft believe that the new supercomputer will open the door to systems capable of running hundreds of language and vision models simultaneously. Microsoft said that techniques developed on it will eventually benefit other Azure customers.

We’re thinking: We love supercomputers as much as anyone. But if Moore’s Law keeps up, today’s supercomputer will be tomorrow’s wristwatch.
