Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

Appl Clin Inform

Alfred Sorbello, DO, MPH, US Food and Drug Administration, Center for Drug Evaluation and Research, Office of Translational Sciences, 10903 New Hampshire Avenue, Silver Spring, MD 20993-0002, USA

Published: March 2017

AI Article Synopsis

  • The project focuses on creating a software tool to help FDA reviewers analyze scientific literature for identifying safety risks and adverse effects related to drugs.
  • The prototype uses statistical methods and visual analytics to mine data from PubMed/MEDLINE and was tested by FDA reviewers for usability and effectiveness in real-world scenarios.
  • Feedback from usability tests highlighted the tool's user-friendly design and its ability to generate useful safety signals, while also pointing out areas for improvement, such as search comprehensiveness and integration with existing systems.

Article Abstract

Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool.

Methods: A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use.
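
The abstract does not state which disproportionality statistic or retrieval pipeline the prototype uses, so the sketch below is purely illustrative: it computes a proportional reporting ratio (PRR), one common disproportionality measure, from hypothetical counts of PubMed/MEDLINE citations co-indexed with a drug MeSH term and an adverse-event MeSH term.

```python
# Illustrative only: the paper does not specify its disproportionality
# statistic. This sketch computes a proportional reporting ratio (PRR)
# for one candidate drug-adverse event pair from MeSH co-indexing counts.
# All counts below are hypothetical.

def prr(a: int, b: int, c: int, d: int) -> float:
    """PRR from a 2x2 contingency table of citation counts.

    a: citations indexed with both the drug and the adverse event
    b: citations indexed with the drug but not the adverse event
    c: citations indexed with the adverse event but not the drug
    d: citations indexed with neither term
    """
    # Rate of the adverse event among citations mentioning the drug,
    # relative to its rate among all remaining citations.
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one candidate drug-adverse event pair.
a, b, c, d = 40, 960, 200, 98800
print(f"PRR = {prr(a, b, c, d):.2f}")  # scores well above 1 may flag a signal
```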

Results: All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools.

Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5373771
DOI: http://dx.doi.org/10.4338/ACI-2016-11-RA-0188

Publication Analysis

Top Keywords

usability testing (16)
ade safety (16)
literature reports (12)
software analytical (12)
analytical tool (12)
scientific literature (8)
prototype software (8)
fda regulatory (8)
adverse drug (8)
safety signal (8)
