When Safety Becomes Surveillance: Colleges Track Students Using AI Designed to Monitor Mental Health

A series of screen captures from the Social Sentinel platform

United States colleges tracked activists using a natural language processing system intended to monitor their mental health.

What’s new: An investigation by The Dallas Morning News and UC Berkeley Graduate School of Journalism found that schools in Georgia, North Carolina, and elsewhere used Social Sentinel, which monitors social media posts to identify individuals who intend to harm themselves or others, to keep tabs on protestors from 2015 to 2019 and possibly beyond.

What they found: The system, which was renamed Navigate360 Detect in 2020, uses an “evolving AI language engine” to analyze public communications. Users can query social media posts on Facebook, Instagram, Reddit, Twitter, and YouTube, although searches are limited to eight topics and 25 subtopics related to safety and security. The reporters relied on documents acquired through leaks and records requests to the government, along with interviews with school employees. Among their findings:

  • Beyond public posts, the system scans emails, Google Docs, Google Hangouts, and Facebook messages. It can also flag web searches for domains that a customer deems harmful.
  • The developer privately promoted the system to school officials as a way to mitigate and forestall campus protests.
  • In 2019, North Carolina Agricultural and Technical State University used the software to track social media comments made by a student who criticized university authorities for mishandling her rape complaint.
  • Kennesaw State University in Georgia used the software to monitor protestors — including at least one person who did not attend the university — in at least three demonstrations in 2017.
  • Campus police at UNC-Chapel Hill used the software to monitor participants in pro- and anti-abortion protests in 2015 and in 2018 demonstrations calling for the removal of a statue that commemorated the Confederate army of the U.S. Civil War.

The response: Navigate360, the Ohio-based company that acquired Social Sentinel in 2020, stated that the investigation was inaccurate and that the word “protest” was not in the system’s list of search topics. School officials either didn’t respond to the reporters’ requests for comment or declined to discuss the policies that govern their use of such software.

Why it matters: Campuses must tread a fine line between keeping students safe and fostering free expression. Protests can spiral out of control, causing injury and loss of life. Yet students have a reasonable expectation that educational institutions have their best interests at heart and will support their intellectual inquiries — even if they lead to peaceful protests.

We’re thinking: AI can do good by alerting school officials to students who are severely disturbed or distressed. It should go without saying that systems designed for this purpose should never be used to stifle dissent.
