Pale Transformer


Illustration: different self-attention mechanisms used by Transformer-based AI models.

Attention to Rows and Columns: Altering Transformers' Self-Attention Mechanism for Greater Efficiency

A new approach restricts transformers' self-attention to rows and columns of the input, balancing computational efficiency with performance on vision tasks.
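To make the idea concrete, here is a minimal sketch of attention restricted to rows and columns of a feature map. It illustrates the general efficiency argument only; it is not the paper's exact Pale-Shaped Attention, and the function name, the averaging of the two branches, and the omission of learned projections and multiple heads are simplifying assumptions.

```python
import torch
import torch.nn.functional as F

def axis_attention(x):
    """Toy illustration: restrict self-attention to rows and columns
    of a feature map instead of all spatial positions.

    x: (batch, height, width, channels) feature map.
    Returns a tensor of the same shape.
    """
    b, h, w, c = x.shape
    scale = c ** -0.5

    # Row attention: each position attends only to positions in its own row.
    rows = x.reshape(b * h, w, c)                      # one sequence per row
    row_scores = rows @ rows.transpose(1, 2) * scale   # (b*h, w, w)
    row_out = F.softmax(row_scores, dim=-1) @ rows

    # Column attention: each position attends only to positions in its column.
    cols = x.permute(0, 2, 1, 3).reshape(b * w, h, c)  # one sequence per column
    col_scores = cols @ cols.transpose(1, 2) * scale   # (b*w, h, h)
    col_out = F.softmax(col_scores, dim=-1) @ cols
    col_out = col_out.reshape(b, w, h, c).permute(0, 2, 1, 3)

    # Combine the two sparse attention results (here, a simple average).
    return 0.5 * (row_out.reshape(b, h, w, c) + col_out)

x = torch.randn(2, 8, 8, 32)
print(axis_attention(x).shape)  # torch.Size([2, 8, 8, 32])
```

Under this restriction each position attends to h + w - 1 others rather than all h * w positions, which is where the computational savings over full spatial self-attention come from.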
