AI under FDA Control?

On April 3, the FDA announced that it is building a new framework to regulate AI-based medical devices that learn continually from healthcare data and adapt over time.

Scott Gottlieb, MD, the FDA Commissioner who announced his resignation last month, released a detailed discussion paper outlining the agency's approach to keeping pace with the rapid development of AI algorithms while ensuring safety. The agency is also seeking feedback on the paper.

In his statement from the FDA, Gottlieb wrote, “The goal of the framework is to assure that ongoing algorithm changes follow pre-specified performance objectives and change control plans, use a validation process that ensures improvements to the performance, safety, and effectiveness of the artificial intelligence software, and includes real-world monitoring of performance once the device is on the market to ensure safety and effectiveness are maintained.”


Inside the Proposed Framework

The discussion paper is part of the agency’s broader effort to adapt its current rules to emerging technologies.

Within the framework, the FDA proposes a total product lifecycle approach, including performance monitoring, for regulating AI/ML-based software as a medical device (SaMD) with a reasonable assurance of safety and effectiveness.

In the statement, Gottlieb said, “This first step in developing our approach outlines transformation specific to devices that include artificial intelligence algorithms that make real-world modifications that the agency might require for premarket review. They include the algorithm’s performance, the manufacturer’s plan for modification and the ability of the manufacturer to manage and control risks of the modification.”

Furthermore, the agency proposes a ‘predetermined change control plan’, under which manufacturers may be required to provide the FDA with detailed information about the anticipated changes an algorithm could undergo, together with an explanation of the methods used to implement those changes.

The FDA’s white paper states, “We also anticipate that in certain cases, the SaMD’s risk or the intended use may significantly change after learning.” The agency warns that any such change could trigger the need for a new premarket submission.

According to Brad Thompson, a device attorney at Epstein Becker Green, the FDA admits in its discussion paper that additional statutory authority would be required to implement the idea fully.

In an email to MedTech Dive, Thompson wrote, “I am worried a bit that FDA is becoming really good at coming up with ideas and not so good about carrying through with them. It’s unclear whether they are saying that certain portions of this idea can be implemented right away through guidance, while others may have to await legislation.”

The FDA also intends to issue detailed draft guidance based on the input received on the discussion paper, using its “current authorities in new ways to keep up with the rapid pace of innovation and ensure the safety of these devices,” Gottlieb said.

While Thompson is concerned about how both efforts – the AI framework and the FDA Software Precertification Program – would be implemented, he also praises the FDA for pursuing innovative regulatory approaches for advanced technologies such as artificial intelligence.

Thompson also added, “I just hope that the agency can carry through with some of these new initiatives all the way to completion. Between the two programs – precertification and this new AI initiative – speaking personally I think I am more excited about the new AI initiative.”

The newly proposed AI framework focuses mainly on adaptive algorithms that continue to evolve and do not require manual updates. Its provisions center on the performance of the algorithm, the manufacturer’s modification plans, and the manufacturer’s ability to manage and control the risks associated with those modifications.

In addition, the FDA would likely review a software’s predetermined change control plan, which would give the agency adequate information about anticipated algorithm changes and the strategies, methodologies, and re-training techniques used to put those changes into place.

Gottlieb also asserted, “Artificial Intelligence has helped transform industries like finance and manufacturing, and I’m confident that these technologies will have a profound and positive impact on health care. I can envision a world where, one day, artificial intelligence can help detect and treat challenging health problems, for example by recognizing the signs of disease well in advance of what we can do today. These tools can provide more time for intervention, identifying effective therapies and ultimately saving lives.”


AI products approved to date generally rely on ‘locked’ algorithms that do not change automatically as new data is gathered; instead, manufacturers modify them at intervals. Gottlieb suggested, however, that relying on periodic manufacturer modifications may delay the benefits of AI that actively learns and improves intervention, timeliness, and outcomes.

A key aim of the discussion paper is to determine which types of modifications to AI- or machine learning-based SaMD could be exempted from premarket submission requirements.

sepStream® has the latest and most advanced range of diagnostic tools and technologies to assist patients. The team is positive about introducing many such new technologies over time.