Turing-NLG

4 Posts


Trillions of Parameters: Are AI Models With Trillions of Parameters the New Normal?

The trend toward ever-larger models crossed the threshold from immense to ginormous. Google kicked off 2021 with Switch Transformer, the first published work to exceed a trillion parameters, weighing in at 1.6 trillion.

Toward 1 Trillion Parameters

An open source library could spawn trillion-parameter neural networks and help small-time developers build big-league models. Microsoft upgraded DeepSpeed, a library that accelerates the PyTorch deep learning framework.
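DeepSpeed's trillion-parameter push centered on ZeRO-style memory optimizations that shard model state across devices. As a hedged illustration (not an example from the post), a minimal DeepSpeed configuration fragment enabling ZeRO with optimizer offload to CPU might look like the following; the specific values are assumptions for a small setup:

```json
{
  "train_batch_size": 16,
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 3,
    "offload_optimizer": { "device": "cpu" }
  }
}
```

A config like this is typically passed to `deepspeed.initialize()` along with an ordinary PyTorch model, which is how the library accelerates training without requiring changes to the model code itself.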

Bigger is Better

Lately, natural language processing has come to resemble an arms race, as the big AI companies build models that encompass ever-larger numbers of parameters. Microsoft recently held the record — but not for long.

Big Bot Makes Small Talk

Facebook recently rolled out its entry in the World’s Biggest Chatbot sweepstakes. In keeping with the company’s social-networking dominance, the bot is designed to excel at chitchat on any subject.