AI Cheating Software Loses in Court

Image: Screen capture of Honorlock, an AI-powered system designed to catch students who cheat on academic examinations

A U.S. court ruled against an implementation of AI-powered software designed to catch students who cheat on academic examinations.

What’s new: A federal judge determined that Cleveland State University’s use of Honorlock, a system that scans students’ behavior and surroundings for signs of cheating, violates their rights, National Public Radio reported.

How it works: Students install Honorlock as a web browser extension and permit access to the computer’s microphone and camera.

  • During a test, the extension uses voice detection and computer vision to issue alerts when it detects signs of possible cheating: tablets, phones, or open textbooks in view; dimmed lighting; faces other than the student’s; talking; phrases like “Hey Siri” or “Okay Google”; the student looking down or away from the screen before answering a question; or absence from the camera’s view for an extended time.
  • Instructors can initiate a 360-degree scan of a student’s room prior to a test. Scans take about a minute to complete. Honorlock stores the recorded video data for a year.
  • If it detects anything amiss, the system alerts a human proctor (a rough sketch of this kind of rule-based flagging appears below).
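
To make the behaviors listed above concrete, here is a minimal, hypothetical sketch in Python of how rule-based flagging might combine detector outputs into alerts for a human proctor. This is not Honorlock’s code: the Observation fields stand in for outputs of computer-vision and speech-to-text models that are assumed, not implemented, here.

```python
from dataclasses import dataclass, field

# Assumed trigger lists, loosely based on the behaviors described above.
WAKE_PHRASES = {"hey siri", "okay google"}
BANNED_OBJECTS = {"phone", "tablet", "open textbook"}

@dataclass
class Observation:
    faces: int = 1                              # faces visible to the webcam
    objects: set = field(default_factory=set)   # labels from an object detector
    transcript: str = ""                        # speech-to-text for this window
    looking_at_screen: bool = True
    student_present: bool = True

def flags_for(obs: Observation) -> list[str]:
    """Return human-readable alerts for one observation window."""
    alerts = []
    if not obs.student_present:
        alerts.append("student absent from camera view")
    if obs.faces > 1:
        alerts.append("additional person in view")
    banned = obs.objects & BANNED_OBJECTS
    if banned:
        alerts.append("prohibited items visible: " + ", ".join(sorted(banned)))
    text = obs.transcript.lower()
    if any(phrase in text for phrase in WAKE_PHRASES):
        alerts.append("voice-assistant wake phrase detected")
    elif text.strip():
        alerts.append("talking detected")
    if not obs.looking_at_screen:
        alerts.append("student looking away from screen")
    return alerts

# Example: a simulated window that would be escalated to a human proctor.
window = Observation(objects={"phone"}, transcript="Okay Google, define osmosis")
for alert in flags_for(window):
    print("ALERT:", alert)
```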

The case: In 2021, Cleveland State University student Aaron Ogletree sued the school for subjecting him to a virtual room scan, which he argued violated his Fourth Amendment protection against unreasonable searches. He complied with the scan at the time and filed suit afterward. The university argued that a room scan doesn’t constitute a search because it’s limited in scope and conducted to ensure academic integrity. The judge ruled that the university had violated Ogletree’s rights.

Behind the News: Scientific investigations of other AI-powered proctoring systems have reached conflicting conclusions about their effectiveness.

  • A 2021 study of a program called Proctorio found that it failed to catch any of 30 students whom the authors instructed to cheat. It also incorrectly flagged non-cheating students as engaging in suspicious activities.
  • A 2020 study by Radford University found that test-takers scored lower when they were monitored by proctoring software than when they weren’t. The authors interpreted this result as evidence that automated proctoring discourages cheating.

Why it matters: Automated proctoring has value, especially in the era of remote education. Although the ruling against Cleveland State applies only to that school, it raises questions about the legality of such automated room scans nationwide.

We’re thinking: While the judge’s decision ostensibly affects AI-powered proctoring software, many institutions use human proctors who might occasionally request a manual room scan. The underlying question of what proctoring methods are reasonable, ethical, fair, and legal is independent of whether machines or humans do the job.
