Text-to-Text Transfer Transformer (T5)

[Figure: graphs showing Switch Transformer results]

Bigger, Faster Transformers

Performance on language tasks rises with the size of the model, yet as a model's parameter count grows, so does the time it takes to generate output. New work pumps up the number of parameters without slowing down the network.
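The trick behind this kind of scaling is sparse routing: a Switch-style mixture-of-experts layer sends each token through just one of many feed-forward "experts," so total parameters grow with the number of experts while the compute spent on each token stays roughly constant. Below is a minimal NumPy sketch of that top-1 routing idea; the names (switch_layer, router_w, the toy experts) are illustrative, not taken from the paper.

```python
import numpy as np

def switch_layer(tokens, router_w, experts):
    """Top-1 (switch) routing: each token is processed by a single expert.

    tokens:   (n_tokens, d_model) array of token activations
    router_w: (d_model, n_experts) router weight matrix
    experts:  list of n_experts feed-forward callables
    """
    logits = tokens @ router_w                        # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)             # softmax over experts
    choice = probs.argmax(-1)                         # one expert per token

    out = np.zeros_like(tokens)
    for e, expert in enumerate(experts):
        mask = choice == e
        if mask.any():
            # Only the chosen expert runs on each token, so per-token compute
            # stays flat while total parameters grow with the expert count.
            out[mask] = probs[mask, e:e + 1] * expert(tokens[mask])
    return out

# Toy usage: 4 experts, each a small two-layer feed-forward block.
rng = np.random.default_rng(0)
d_model, n_experts = 8, 4
experts = []
for _ in range(n_experts):
    w1 = rng.normal(scale=0.02, size=(d_model, 4 * d_model))
    w2 = rng.normal(scale=0.02, size=(4 * d_model, d_model))
    experts.append(lambda x, w1=w1, w2=w2: np.maximum(x @ w1, 0.0) @ w2)
router_w = rng.normal(scale=0.02, size=(d_model, n_experts))
y = switch_layer(rng.normal(size=(16, d_model)), router_w, experts)  # (16, 8)
```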
