High-priority drug–drug interactions for use in electronic health records


Research and applications

Shobha Phansalkar,1,2,3 Amrita A Desai,3 Douglas Bell,4,5 Eileen Yoshida,3 John Doole,3 Melissa Czochanski,3 Blackford Middleton,1,2,3 David W Bates1,2,3

1 Division of General Internal Medicine and Primary Care, Brigham and Women’s Hospital, Boston, Massachusetts, USA
2 Harvard Medical School, Boston, Massachusetts, USA
3 Partners HealthCare System, Wellesley, Massachusetts, USA
4 RAND Corporation, Santa Monica, California, USA
5 Department of Medicine, David Geffen School of Medicine at UCLA, Los Angeles, California, USA

Correspondence to: Dr Shobha Phansalkar, Clinical Informatics Research and Development (CIRD), Partners Healthcare System, Inc., 2nd Floor, 93 Worcester Street, PO Box 81905, Wellesley, MA 02481, USA; [email protected]

Received 27 September 2011; Accepted 29 March 2012; Published Online First 26 April 2012

ABSTRACT

Objective: To develop a set of high-severity, clinically significant drug–drug interactions (DDIs) for use in electronic health records (EHRs).

Methods: A panel of experts was convened with the goal of identifying critical DDIs that should be used for generating medication-related decision support alerts in all EHRs. Panelists included medication knowledge base vendors, EHR vendors, in-house knowledge base developers from academic medical centers, and both federal and private agencies involved in the regulation of medication use. Candidate DDIs were assessed by the panel based on the consequence of the interaction, severity levels assigned to them across various medication knowledge bases, availability of therapeutic alternatives, monitoring/management options, predisposing factors, and the probability of the interaction based on the strength of evidence available in the literature.

Results: Of 31 DDIs considered to be high risk, the panel approved a final list of 15 interactions. Panelists agreed that this list represented drugs that are contraindicated for concurrent use, though it does not necessarily represent a complete list of all such interacting drug pairs. For other drug interactions, severity may depend on additional factors, such as patient conditions or timing of co-administration.

Discussion: The panel provided recommendations on the creation, maintenance, and implementation of a central repository of high-severity interactions.

Conclusions: A set of highly clinically significant drug–drug interactions was identified, for which warnings should be generated in all EHRs. The panel highlighted the complexity of issues surrounding the development and implementation of such a list.

INTRODUCTION
Medication-related decision support has the potential to reduce the morbidity and mortality associated with preventable adverse drug events and to improve the quality of patient care.1 2 A majority of electronic health records (EHRs) employ clinical decision support (CDS) using commercially available medication knowledge bases (KBs).2 The benefit of implementing CDS is seldom fully realized, in part due to “alert fatigue”.3 Alert fatigue results when a provider, after receiving too many alerts, ignores and/or overrides them, even clinically significant ones. To address the challenges of alert burden and its impact on EHR adoption, the Office of the National Coordinator for Health Information Technology (ONC) commissioned this effort. The goal of the effort described here was to identify a set of critical interactions that can be implemented in KBs for use in EHRs. A secondary goal was to identify the process and barriers that would be involved in successful implementation of such a list of critical drug–drug interactions (DDIs).

BACKGROUND AND SIGNIFICANCE
Previous studies have empirically evaluated rates of overriding medication-related CDS alerts and found them to range between 33% and 96%.4 5 Studies recommend reducing alert fatigue by lowering the number of alerts presented to clinicians and by increasing alert specificity.3 6 7 Most KBs tier DDIs based on their severity and strength of evidence, but there is little overlap between these KBs even on the most clinically significant DDIs.8–10 Further, local customization of KBs is resource intensive, requires special expertise, and is thus rarely undertaken.11 12 To facilitate this effort, this task order from the ONC focused on identifying high-priority DDIs that could be used as a minimum standard for successful incorporation of such critical DDIs into EHRs.

METHODOLOGY
An expert panel was convened with representatives from diverse stakeholders in the implementation of medication-related decision support in EHRs. The panel assessed a set of candidate high-severity DDIs that should never be concurrently prescribed and that could be used as the minimum standard for inclusion in medication-related decision support programs in EHRs. For the purposes of this discussion, a DDI was defined as a modification of the effect of one drug when administered with another drug not from the same therapeutic class.

J Am Med Inform Assoc 2012;19:735–743. doi:10.1136/amiajnl-2011-000612

Developing a list of highly clinically significant DDIs
Sources of information considered in developing the list of DDIs included: the Partners Healthcare System Medication Knowledge Base (PHS MKB); commercial medication KBs such as Micromedex, First Data Bank (FDB), and http://Drugs.com; and academic research papers written by experts in this domain, for example Malone et al,13 Isaac et al,14 Van der Sijs et al,6 and Hansten and Horn.10 15 We elected to begin the panel discussions using the highest severity DDIs from the PHS MKB because there was substantial variation among sources regarding high-severity interactions. Additionally, Partners’ DDIs have been used extensively enterprise-wide in clinical practice, and their severity assignments drew on many of the above resources.

DDIs in the PHS MKB
The candidate list of DDIs was derived from the medication KB currently employed at PHS. This centralized KB is used to generate CDS for DDIs in clinical practice at two large academic medical centers and at primary care clinics that use the in-house developed EHR. The list was developed over several years of in-house customization, based on feedback from clinical end-users, and is maintained by a team of pharmacists who review the literature and the severity ratings in vendor medication KBs to keep severity assignments current. Further, a content committee periodically reviews these ratings against clinical alert logs to assess whether certain interactions need their severity levels up- or downgraded.

In the PHS MKB, DDIs are documented as drug pairs expressed with their generic names. DDIs are tiered into three levels depending largely on the severity of the interaction. Each level is presented differently and implies different override capabilities. Level 1 consists of the most serious, life-threatening interactions, implemented as “hard stop” alerts that require the clinician either to cancel the order being written or to discontinue the preexisting, interacting medication order. Level 2 DDIs are of moderate severity, and a reason must be provided in order to override the alert. Level 3 alerts are the least serious interactions and are presented as non-interruptive, informational alerts. Of the 3327 DDI pairs in the PHS KB, 195 are Level 1, 1561 are Level 2, and 1572 are Level 3 interactions.
A medication knowledge committee periodically reviews recommendations from end users to modify the rules in the PHS MKB.5 7 Given that the PHS MKB has previously been evaluated for coverage of critical DDIs and is periodically tailored based on provider responses in clinical practice, the Level 1 alerts served as a good starting point for the candidate DDIs to be considered in this discussion.
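As an illustration only, the three-tier scheme described above could be represented roughly as follows. This is a minimal Python sketch, not the actual PHS MKB implementation; the tier names, table structure, and the single drug pair shown are hypothetical.

```python
from enum import IntEnum

class Tier(IntEnum):
    """Severity tiers, mirroring the three PHS MKB levels described above."""
    HARD_STOP = 1        # life-threatening: order must be cancelled or the
                         # interacting medication discontinued
    OVERRIDE_REASON = 2  # moderate: clinician must document an override reason
    INFORMATIONAL = 3    # least serious: non-interruptive information alert

# DDIs stored as unordered generic-name pairs; this entry is illustrative only.
DDI_TABLE = {
    frozenset({"omeprazole", "atazanavir"}): Tier.HARD_STOP,
}

def check_interaction(drug_a: str, drug_b: str):
    """Return the severity tier for a drug pair, or None if no rule matches."""
    return DDI_TABLE.get(frozenset({drug_a.lower(), drug_b.lower()}))
```

Using a `frozenset` key makes the lookup order-independent, reflecting the symmetric nature of an interaction pair.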

Building a starter set of DDIs using the PHS MKB
In order to facilitate the panel process, we extracted the highest severity interactions, or Level 1 DDIs, from the PHS MKB. Two clinical pharmacists with expertise in medication KBs and clinical informatics and one physician with experience in KB engineering and pharmacology reviewed this list. To consolidate the DDIs, ingredient-level pairs were aggregated into appropriate therapeutic, pharmacological, or structural classes. For example, the two Level 1 interactions, (i) omeprazole with atazanavir and (ii) rabeprazole with atazanavir, were converted to a single class-based interaction because both omeprazole and rabeprazole belong to the same pharmacological class, “proton pump inhibitors”. Consideration of pharmacodynamic and pharmacokinetic properties also helped in the derivation of appropriate classes for representing the DDIs. We consulted a variety of MKBs, such as Micromedex, FDB, and http://Drugs.com, and academic research papers written by experts in this domain (eg, Malone et al,13 Isaac et al,14 Van der Sijs et al,6 and Hansten and Horn10 15) to derive the appropriate level of the interaction and membership within a drug class. Using this process, 195 drug–drug pairs were consolidated into a total of 31 interaction pairs: 12 drug–drug (eg, tranylcypromine–procarbazine), 12 drug–class (eg, atazanavir–proton pump inhibitors), and 7 class–class (eg, selective serotonin reuptake inhibitors and monoamine oxidase inhibitors) interactions.
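The class-based consolidation described above can be sketched as a simple aggregation over an ingredient-to-class map. The map below is a hypothetical fragment; actual class membership was derived from the commercial KBs and the cited literature.

```python
# Hypothetical ingredient-to-class fragment; real membership came from the
# consulted KBs (Micromedex, FDB, Drugs.com) and the cited literature.
DRUG_CLASS = {
    "omeprazole": "proton pump inhibitors",
    "rabeprazole": "proton pump inhibitors",
}

def consolidate(ingredient_pairs):
    """Collapse ingredient-level DDI pairs into class-level pairs.

    Drugs without a class mapping are kept at the ingredient level.
    """
    return {
        frozenset({DRUG_CLASS.get(a, a), DRUG_CLASS.get(b, b)})
        for a, b in ingredient_pairs
    }

# The two ingredient pairs from the example collapse into one class-based pair.
level1 = [("omeprazole", "atazanavir"), ("rabeprazole", "atazanavir")]
class_pairs = consolidate(level1)
```

Because the result is a set of unordered pairs, duplicates produced by different ingredients of the same class are merged automatically.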

Expert panel
Twenty-one subject matter experts with experience in the development, maintenance, and implementation of medication-related decision support in EHRs were invited to participate on the panel. Diversity of expertise was important in the selection of the panel so as to include a broad array of perspectives. Clinical experts consisted of both practicing physicians and pharmacists who brought real-world experience to the discussion. We invited experts representing medication KB vendors, EHR vendors, proprietary and in-house KB developers, and academic medical centers. Several KB vendors and EHR vendors had pharmacists and providers on their teams who further contributed to the clinical expertise on the panel. In addition, we invited representatives from federal and private agencies involved in the regulation of medication use, such as the Food and Drug Administration (FDA) and the American Society of Health-System Pharmacists. A more detailed description of the participating institutions and panelists is available in table 1.

Each panelist independently assessed all interactions based on the predicted clinical outcome or consequence of the interaction, the severity levels assigned to them across various medication KBs, availability of therapeutic alternatives, monitoring/management options, predisposing factors, and the probability of the interaction based on the strength of evidence available in the literature. The panel also made suggestions regarding specific drugs that should be considered for either addition or deletion under a specific drug class for each candidate DDI. Two rounds of panel discussions were convened to seek consensus.

Ratings from KB vendors
KB vendors routinely conduct reviews of the evidence in the literature to maintain their product databases and are therefore most up to date with the DDI literature. Three commercial KB vendors, Wolters Kluwer (Medi-Span), FDB, and Cerner Multum, hold a majority of the market share for medication KBs in the USA and participated on the panel. In addition, since the intent of this work was to provide a set of interactions that could be integrated with existing KB solutions, we asked each KB vendor to rate the interactions. Ratings were based on a 9-point scale, with 1 corresponding to “not at all important”, 5 to “equivocal”, and 9 to “extremely important”. KB vendors’ ratings were used to calculate average scores by summing the rating from each vendor and dividing the sum by 3; interactions that scored
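The averaging step described above amounts to a simple mean over the three vendor ratings. In the sketch below, the interaction name and the individual scores are invented for illustration only.

```python
def average_rating(vendor_scores):
    """Average 9-point importance ratings across the three KB vendors."""
    return sum(vendor_scores) / len(vendor_scores)

# Hypothetical ratings (1 = not at all important, 9 = extremely important)
ratings = {"candidate DDI A": [9, 8, 9]}
averages = {ddi: average_rating(scores) for ddi, scores in ratings.items()}
```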
