From Pandemic to Panopticon: How Russia is using face recognition to punish dissidents

Security cameras somewhere around the Red Square in Moscow, Russia

Governments are repurposing Covid-focused face recognition systems as tools of repression.

What's new: Russia’s internal security forces are using Moscow’s visual surveillance system, initially meant to help enforce pandemic-era restrictions, to crack down on anti-government dissidents and protesters against the war in Ukraine, Wired reported.

How it works: Moscow upgraded its surveillance network in 2020 to identify violators of masking requirements and stay-at-home orders. The system includes 217,000 cameras equipped to recognize faces and license plate numbers. It also tracks medical records and mobile-phone locations. Companies including Intel, Nvidia, Samsung, and Russian AI startup NtechLab have supplied equipment.

  • Last year, Moscow police used the system to detain at least 141 activists and protesters, according to human rights group OVD-Info.
  • A lawyer who challenged the system in court later left Russia in fear for her personal safety.
  • Critics say the agency that operates it is not accountable to the public. Some municipal officials have said they don’t control it or understand how it works.
  • The national government plans to expand the system to other metropolitan areas across the country.

Behind the news: Numerous governments have co-opted technology originally deployed to counter Covid-19 for broader surveillance, the Pulitzer Center reported. For instance, police in Hyderabad, India, allegedly targeted minorities for harassment using face-detection systems initially implemented to spot people flouting mask mandates.

Why it matters: There’s a fine line between using surveillance for the greater good and abusing it to exercise power. When the pandemic hit, computer vision and contact tracing were important tools for containing the spread of disease. But the same technology that helps to keep the public safe lends itself to less laudable uses, and governments can find it hard to resist.

We're thinking: Governments often expand their power in times of crisis and hold onto it after the crisis has passed. That makes it doubly important that government AI systems be accountable to the public. The AI community can play an important role in establishing standards for their procurement, deployment, control, and auditing.
