Chinese Academy of Sciences



Attention to Rows and Columns: Altering Transformers' Self-Attention Mechanism for Greater Efficiency

A new approach alters transformers' self-attention mechanism to balance computational efficiency with performance on vision tasks.

Who Has the Best Face Recognition? U.S. Government Agency Ranks the Best Face Recognition Systems

Face recognition algorithms have come under scrutiny for misidentifying individuals. A U.S. government agency tested more than 1,000 of them to determine which are the most reliable.
