FDA’s Regulation of CDS Software: Will Physicians Have to Understand the Underlying Algorithms?

Posted 19 April 2018 | By Zachary Brennan 

A key section of the recently passed 21st Century Cures Act explained to the US Food and Drug Administration (FDA) which categories of software should be exempt from medical device regulations.

Most of these newly excluded categories involve relatively harmless billing or scheduling software, software for encouraging a healthy lifestyle, or even electronic health record (EHR) or Medical Device Data System (MDDS) software, as long as they don’t interpret or analyze the data for the purpose of “diagnosis, cure, mitigation, prevention, or treatment of a disease or condition.”

As an FDA official noted at a meeting at Xavier University last May, MDDS software is a good example of a device that went from a high-risk, or Class III, device about five years ago, to now not even being classified as a device by FDA.

But an article published in the American Journal of Law & Medicine last month also delved into another category that the new Cures law excluded from FDA’s purview: a subset of Clinical Decision Support (CDS) software that could present some difficult questions for the agency.

“To escape FDA regulation, the software vendor/manufacturer must intend for the software to make it possible for health care professionals to override its recommendations by explaining its rationale in terms that a clinician could understand, interrogate, and possibly reject,” the authors of the article, Barbara Evans, professor at the University of Houston Law Center and Department of Electrical and Computer Engineering, and Pilar Ossorio, professor at the University of Wisconsin Law School and Morgridge Institute for Research, wrote. “Whether CDS software is subject to FDA regulation potentially turns on the software’s ability to answer the quintessential epistemological question: How do we know?”

Evans explained to Focus: "The question is whether health care professionals actually will be able to understand the basis for the software's recommendations, in situations where the software uses machine learning algorithms. Intending for doctors to be able to understand something, and having doctors actually understand it, can be two different things. The Cures Act does not require proof that the recommendations actually are understandable to physicians, in order to be excluded from FDA oversight. They merely need to be intended to be understandable."  

But she also explained how if physicians find it difficult to understand and review recommendations from machine-learning CDS algorithms, "that ultimately may not be a problem that is within FDA's purview to solve. It's in the nature of a medical practice regulatory issue: Is the physician workforce we have qualified to apply some of the new software tools that are coming? Is additional training required? Will we need a new breed of specialist physicians who are dually trained in medicine and technology?"

For FDA, for physicians who may have to learn how these algorithms work, and for the clinical decision support market, which is projected to reach almost $5 billion by 2021, these are issues that will need to be resolved in real time as the products ready for launch.

Moving forward, the authors question, how will FDA be able to tell when CDS software “explains its recommendations in a way that physicians can understand and critique”? And will FDA be able to tell “when the manufacturer intended for its software to do so? If not, what problems may clinicians face in using CDS software?”

Another part of the problem, as the authors explain, is that the “ground truth” (i.e. how the software functions in the real world) may “be unknowable in the premarket period, before the algorithm moves into wide clinical use.

“Consider CDS software that bases its diagnostic or treatment recommendations on deeply descriptive datasets that incorporate thousands of clinical, genomic, and exposure data points for each individual. For patients who present unique combinations of these data points, there is no ground truth that can be ascertained in advance: nobody like them was ever seen by a doctor before,” Evans and Ossorio write.

According to Gail Javitt, a lawyer for Epstein Becker & Green, FDA generally considers CDS software tools that “do not direct a specific clinical decision to pose less risk than those that will be used as the sole basis for decision making or that provide directive clinical recommendations.”

But those lines are beginning to blur.

As Javitt notes, “For example, in the case of computer-aided detection (CAD) devices used in conjunction with breast imaging, FDA distinguishes between those that are intended solely to direct the clinician’s attention to portions of an image or aspects of radiology device data (CADe) and those that also are intended to assess disease risk, specify disease type, severity, or stage and/or recommend an intervention (CADx). Whereas CADe devices generally are class II (moderate risk), CADx devices are class III (high risk).”

How FDA implements this section of the Cures Act will help determine whether certain promising technologies that harness real-world evidence and data can move into clinical use, and how quickly.

“The crux of effective implementation is for FDA to enunciate standards of transparency CDS software must meet, before it will escape FDA regulation,” Evans and Ossorio write. “Transparency in this context includes algorithmic transparency, physician access to underlying data that algorithms use, and CDS software vendor contracts that allow open, collegial airing of the strengths and weaknesses physicians encounter as they apply CDS in practice settings.”

But companies last month began pushing back on this idea of algorithmic transparency, according to comments on FDA draft guidance.

Device industry group AdvaMed said the concept of transparency in the context of the CDS draft guidance “should remain focused on the information underlying the recommendation within an artificial intelligence and/or machine learning process, rather than the algorithm itself.”

Athenahealth also argued that “the critical element is whether the software makes the information available to the user, not whether it is publicly available,” saying FDA’s approach may “limit the types of information that can be incorporated into CDS, which will in turn decrease the number of CDS tools on the market.”
