Rules For Medical AI

The U.S. Food and Drug Administration regulates medical devices from stents to diagnostic systems. Once approved, such devices change little. Now the agency is formulating standards for medical devices built on AI that learns continuously.

What’s new: The first public comment period for the FDA’s draft framework on AI-based medical devices ended in June. The National Law Review pored over the comments. NLR reports that the AI community is pushing for tighter definitions and clearer understanding between industry and government, even as it broadly supports the agency’s effort.

The issue: Current rules for software in medical devices require manufacturers to submit programs for review with each significant update. That works for algorithms that are locked and updated periodically, say, in a system that monitors people for signs of stroke. But it’s not a great fit for models that learn from use, like a system that spots cancer more accurately with increasing exposure to real-world data.

Moving target: The FDA wants to establish a lifecycle approach to AI-based medical devices. According to the guidelines, the agency would ask developers to submit a roadmap of expected changes in a model’s output as it learns. Developers also would describe how they expect to manage any risks that might arise as the model adjusts to new data.

Public opinion: Businesses, professional groups, and concerned individuals who submitted comments generally liked the approach but requested a number of tweaks:

  • Many commenters wanted the agency to describe the types of AI models it expects the framework to govern. For instance, what constitutes a “continuously learning” AI?
  • Some wanted the FDA to explain what kinds of changes would trigger a review.
  • Others called on regulators to harmonize their guidelines with those engineers have adopted on their own. For instance, the IEEE’s standard for Transparency of Autonomous Systems calls for designing models so they can be evaluated independently for compliance with external protocols (like the FDA’s).

What’s next: The process is bound to wind through many more steps. Expect another draft framework, further rounds of public feedback, and internal reviews before the rules are finalized.

Our take: Regulation is messy. That goes double for medicine, and perhaps triple for AI. Still, it's critical to protect patients without squelching innovation. Government, industry, and researchers all have an important role to play in hammering out the final rules.
