Counter-Strike

Automated player learning by watching recorded gameplay

Behavioral Cloning Shootout: AI learns to play Counter-Strike: Global Offensive.

Neural networks have learned to play video games like Dota 2 via reinforcement learning, playing for the equivalent of thousands of years of game time (compressed into far less wall-clock time). In new work, an automated Counter-Strike: Global Offensive player learned via behavioral cloning: rather than playing for millennia, it watched a few days' worth of recorded human gameplay.
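Behavioral cloning reduces the problem to supervised learning: the recorded gameplay is treated as a dataset of (observation, action) pairs, and a policy is trained to predict the human's action from the observation. The sketch below illustrates the idea on synthetic data with a linear softmax policy; the "expert", features, and hyperparameters are illustrative assumptions, not details of the CS:GO work.

```python
import numpy as np

rng = np.random.default_rng(0)

def expert_action(obs):
    # Hypothetical "expert" whose behavior we clone: it picks the action
    # corresponding to the strongest of the first three observation features.
    return int(np.argmax(obs[:3]))

# Logged demonstrations: observations plus the actions the expert took.
observations = rng.normal(size=(2000, 6))
actions = np.array([expert_action(o) for o in observations])

# Linear softmax policy trained by gradient descent on cross-entropy loss.
n_actions = 3
W = np.zeros((6, n_actions))
for _ in range(300):
    logits = observations @ W
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    # Gradient of mean cross-entropy w.r.t. W for a softmax classifier.
    grad = observations.T @ (probs - np.eye(n_actions)[actions]) / len(actions)
    W -= 0.5 * grad

# The cloned policy should now imitate the expert on unseen observations.
test_obs = rng.normal(size=(500, 6))
pred = np.argmax(test_obs @ W, axis=1)
true = np.array([expert_action(o) for o in test_obs])
accuracy = (pred == true).mean()
```

In the real system the observation would be a game frame, the policy a deep network, and the labels mouse and keyboard inputs, but the training loop is the same supervised recipe.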
