Gated Multi-layer Perceptron (gMLP)

Animation showing gMLP, a simple architecture that performed some language and vision tasks as well as transformers
Perceptrons Are All You Need: Google Brain's Multi-Layer Perceptron Rivals Transformers

The paper that introduced the transformer famously declared, “Attention is all you need.” To the contrary, new work shows you may not need transformer-style attention at all. What’s new: Hanxiao Liu and colleagues at Google
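The core of gMLP is the Spatial Gating Unit: each block mixes information across tokens with a single learned linear projection rather than attention. Below is a minimal NumPy sketch of one gMLP block under illustrative shapes; the function names, ReLU stand-in for GELU, and toy dimensions are this sketch's assumptions, not the paper's exact implementation.

```python
import numpy as np

def spatial_gating_unit(x, W_s, b_s):
    """Spatial Gating Unit (SGU): split channels in half, project one
    half across the token (spatial) dimension, and use it to gate the
    other half elementwise.

    x: (n, d) token representations; W_s: (n, n) spatial projection;
    b_s: (n,) bias.
    """
    u, v = np.split(x, 2, axis=-1)      # two (n, d/2) halves
    v = W_s @ v + b_s[:, None]          # mix information across tokens
    return u * v                        # elementwise gating

def gmlp_block(x, W1, W2, W_s, b_s):
    """One gMLP block: channel expansion, activation, spatial gating,
    channel projection back, plus a residual connection.
    (ReLU stands in for the paper's GELU activation.)"""
    shortcut = x
    h = np.maximum(x @ W1, 0)
    h = spatial_gating_unit(h, W_s, b_s)
    return shortcut + h @ W2

# Toy shapes (assumed for illustration): 4 tokens, 8 channels, width 16.
rng = np.random.default_rng(0)
n, d, d_ff = 4, 8, 16
x = rng.normal(size=(n, d))
W1 = rng.normal(size=(d, d_ff)) * 0.1
W2 = rng.normal(size=(d_ff // 2, d)) * 0.1
W_s = np.zeros((n, n))                  # near-zero init, per the paper,
b_s = np.ones(n)                        # makes the SGU start as identity
y = gmlp_block(x, W1, W2, W_s, b_s)
print(y.shape)                          # (4, 8)
```

With `W_s` initialized to zero and `b_s` to ones, the gate passes `u` through unchanged, so training starts from a plain MLP and learns spatial mixing gradually.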
