Cerebras

5 Posts


Built to Scale: Andromeda Supercomputer from Cerebras Speeds up AI

A new computing cluster delivers more bang per chip. Cerebras unveiled Andromeda, a supercomputer based on its processors. Unlike conventional clusters, its processing speed rises linearly as processors are added.

Chips at Risk: How the chip shortage impacts AI

The hardware that runs the latest AI systems faces rising uncertainty as models grow larger and more computationally intensive. The U.S. Commerce Department sounded an alarm over bottlenecks in the availability of semiconductor chips.

Trillions of Parameters: Are AI models with trillions of parameters the new normal?

The trend toward ever-larger models crossed the threshold from immense to ginormous. Google kicked off 2021 with Switch Transformer, the first published work to exceed a trillion parameters, weighing in at 1.6 trillion.

David Patterson: Faster Training and Inference

Billions of dollars invested in novel AI hardware will bear early fruit in 2020. Google set off the financial avalanche with its tensor processing unit in 2017.

Size Matters

Silicon Valley startup Cerebras shifted out of stealth mode to unveil its flagship product: an enormous chip designed from the ground up to accelerate neural networks.
