Posted 29 August 2017
By Michael Mezher
A team of researchers from Novartis, Oracle Health Sciences and the University of California, San Francisco say the US Food and Drug Administration's (FDA) adverse drug reaction database could be improved by grouping drugs by their chemical structure and automating certain reporting functions.
In a paper appearing in eLife earlier this month, the researchers say such changes could help address several major challenges to interpreting data in FDA's adverse event reporting system (FAERS).
Specifically, the authors say that the ability to automatically map drugs by ingredient could make it easier to identify trends and sift through some of the noise inherent in the open-submission database. The authors also say that introducing automation to the reports themselves could cut down on the number of erroneous or misclassified reports.
FAERS currently contains well over 8.5 million reported adverse drug reactions (ADRs) and is growing at an increasing rate: the number of ADRs reported to FDA rose from fewer than 100,000 per quarter between the late 1990s and the mid-2000s to around 300,000 per quarter in 2015.
The majority of reports to FAERS originate from patients (40%) and healthcare providers (physicians, 25%; other health professionals, 16%; and pharmacists, 5%); the remaining reports come from lawyers (3%) or do not identify the reporter (9%).
According to the authors, those reports match up to just over 7 million individual ADRs after accounting for multiple reports about the same incident and duplicate reports.
Too Many Names
One of the biggest factors that complicates research into FAERS data is the fact that many drug ingredients are sold under multiple names and in different formulations.
To get around this, the authors aggregated drugs by their active ingredient, which they say produced stronger drug-ADR signals than grouping by synonyms alone.
This provided the authors with a list of 2,729 unique active ingredients. Of those, the authors found that a surprising 30% had no reported ADRs and that 90% of the reported ADRs were attributed to just under half the ingredients found in FAERS.
In the case of fluoxetine, the active ingredient in Prozac, the authors found nearly 400 "synonyms" for the drug. Overall, the authors found an average of 16 synonyms per active ingredient in FAERS.
Without a bigger-picture view of ADRs for drugs with numerous synonyms, the authors say, certain side effects may falsely appear to be significant or insignificant. For example, sexual dysfunction, a well-known side effect of fluoxetine, appeared to be statistically insignificant when looking only at reports for Prozac.
But when looking across all aggregated data for fluoxetine, the authors found that sexual dysfunction stood out clearly.
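The aggregation step the authors describe can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual pipeline: the synonym table and report entries below are invented, and a real mapping would be built from a drug vocabulary such as RxNorm.

```python
from collections import Counter

# Hypothetical synonym map: reported drug name -> active ingredient.
# These entries are illustrative only.
SYNONYMS = {
    "prozac": "fluoxetine",
    "fluoxetine hcl": "fluoxetine",
    "fluoxetine hydrochloride": "fluoxetine",
    "sarafem": "fluoxetine",
}

def aggregate_adrs(reports):
    """Count ADRs per active ingredient rather than per reported drug name."""
    counts = Counter()
    for drug_name, reaction in reports:
        ingredient = SYNONYMS.get(drug_name.lower(), drug_name.lower())
        counts[(ingredient, reaction)] += 1
    return counts

# Invented example reports: two brand/formulation names for fluoxetine
# collapse into a single ingredient-level count.
reports = [
    ("Prozac", "sexual dysfunction"),
    ("Fluoxetine HCl", "sexual dysfunction"),
    ("Sarafem", "nausea"),
]
print(aggregate_adrs(reports)[("fluoxetine", "sexual dysfunction")])  # 2
```

Counting at the ingredient level is what lets a signal diluted across hundreds of synonyms, as with fluoxetine's roughly 400 names, become visible in one place.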
The authors also found that reports in FAERS disproportionately tilted towards serious or life-threatening outcomes.
Nearly 13% of the ADRs they identified reported a death, 5% reported a life-threatening event, and 34% reported a hospitalization. But 42% of reported outcomes fell under "other," which the authors say is the only one of FAERS' seven outcome options suited to reporting non-serious outcomes.
"It is a feature of reporting in an open submission database like FAERS that this ratio does not reflect the true balance between fatal and relatively benign drug ADRs, but rather the ratio of the ADRs that are thought to merit reporting," the authors write.
In many cases, the authors say that reporters confused adverse reactions with a drug's indication.
"Approximately 5% of all reports for any drug describe the drug's indication as an adverse event," the authors write.
For instance, some reports identified diabetes as a side effect for the drug rosiglitazone, which is used to treat type 2 diabetes.
The authors suggest that reporter education could play a role in reducing the number of reports that confuse indication and side effect, as the number of such reports began to drop in 2011, when FDA's 2010 final rule on adverse event reporting went into effect.
The authors also found that other factors, including regulatory activity and news coverage, can influence adverse event reporting in ways that make trends difficult to interpret.
In the case of rosiglitazone and another related drug, pioglitazone, the authors observed a statistically significant signal linking both drugs to cardiac events.
But while the authors found that the number of heart-related reports for rosiglitazone was fairly constant over time, such reports for pioglitazone peaked alongside a period of increased scrutiny of rosiglitazone and remained low otherwise.
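The article does not specify which disproportionality statistic the authors used to call a signal significant; one common choice in pharmacovigilance is the proportional reporting ratio (PRR), sketched here with invented counts purely for illustration.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table:
    a = reports of the event with the drug of interest
    b = reports of other events with that drug
    c = reports of the event with all other drugs
    d = reports of other events with all other drugs
    """
    return (a / (a + b)) / (c / (c + d))

# Invented counts: 40 cardiac reports out of 1,000 for the drug,
# versus 2,000 cardiac reports out of 100,000 for all other drugs.
print(round(prr(40, 960, 2000, 98000), 2))  # 2.0
```

A PRR well above 1 suggests the event is reported disproportionately often for that drug, though, as the authors note, spikes driven by news coverage or regulatory scrutiny can inflate such ratios without reflecting any change in the underlying risk.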