
Electrocardiogram Interpretation Training and Competency Assessment in Emergency Medicine Residency Programs

Jesse M. Pines, MD, MBA, Debra G. Perina, MD, William J. Brady, MD

Abstract

Objectives: To determine the type of electrocardiogram (ECG) interpretation instruction in emergency medicine (EM) residency programs, the use and perceived value of teaching modalities and resources, and the methods used to assess competency in ECG interpretation.

Methods: An interactive survey instrument was posted on the Internet using SurveySuite, Inc., software and e-mailed to the program directors (PDs) of all 125 Accreditation Council for Graduate Medical Education–approved U.S. EM residency programs. Responses are reported in total numbers and percentages.

Results: Ninety-nine of 125 PDs completed the online survey (response rate, 79.2%). Emergency department instruction (99%), case-based lectures (98%), and didactic lectures (98%) were the modalities most commonly used to teach ECG interpretation, followed by computer-based instruction (34%) and an ECG laboratory (12%). The majority of programs (53%) spent more than eight hours on formal ECG lectures per year, while 11% spent less than three hours. Observation during clinical time (99%), lecture time (76%), and hypothetical cases (57%) were the most common ways to determine competency in reading ECGs, and clinical observation and hypothetical cases were perceived as the most valuable. The most commonly used resource was the personal or departmental ECG file (91%), which also had the highest perceived value. The majority of PDs were comfortable with residents' abilities to read ECGs by the third year (96%) and fourth year (91%) of residency.

Conclusions: These data suggest that EM PDs believe that EM residency is adequately preparing graduates to interpret ECGs. This goal is achieved through a variety of methods.

Key words: ECG interpretation; resident training; skills assessment. ACADEMIC EMERGENCY MEDICINE 2004; 11:982–984.

Emergency physicians are often the only physicians interpreting electrocardiograms (ECGs) at the time acute patient care is provided. A reasonable expectation of emergency medicine (EM) residency training is competence in bedside ECG interpretation. Quality patient care is the ultimate goal; however, missed myocardial infarction also remains the most expensive source of emergency department (ED) malpractice litigation.1 A recent study found that 2% of patients with acute myocardial infarction are mistakenly discharged; however, only two of 889 such patients were discharged erroneously secondary to a misread ECG.2

Teaching ECG interpretation is a fundamental part of EM training programs. However, teaching methods are variable and include bedside teaching, lectures (didactic and case-based), textbooks, computer-based instruction, and dedicated cardiology months. Unlike bedside ultrasonography,3 there are no guidelines for teaching ECG interpretation and determining competency for emergency physicians. The adequacy of ECG interpretation teaching may nonetheless be judged by the existence of a formal curriculum, case-based learning at the bedside, expert guidance in interpretation in real time, and a method of competency assessment. We sought to determine the type of ECG interpretation instruction provided in EM residency programs, the use and perceived value of teaching modalities and resources, and the methods used to assess competency.

From the Emergency Medicine Residency Program (JMP), Department of Emergency Medicine (DGP, WJB), University of Virginia, Charlottesville, VA. Received January 6, 2004; revision received March 10, 2004; accepted March 23, 2004. Address for correspondence and reprints: William J. Brady, MD, Department of Emergency Medicine, P.O. Box 800699, Charlottesville, VA 22908. Fax 434-924-2877; e-mail: [email protected]. virginia.edu. doi:10.1197/j.aem.2004.03.023

METHODS

Study Design. An interactive survey instrument was posted on the Internet using SurveySuite, Inc. (Charlottesville, VA), software (http://intercom.virginia.edu/SurveySuite) and e-mailed to the program directors (PDs) of all 125 Accreditation Council for Graduate Medical Education–approved U.S. EM residency programs. It was e-mailed three times to maximize responses. This study was considered exempt from informed consent by our institutional review board.

Survey Content and Administration. The survey had 14 questions examining ECG training, the number of hours of didactic and clinical instruction, the perceived value of different teaching modalities, and competency assessment (using Likert scales).


Data Analysis. Responses were collected and reported in terms of percentage of total respondents. Data were tabulated and analyzed using Excel (Microsoft Corp., Redmond, WA).
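The analysis described here is simple percentage tabulation. As a hypothetical illustration only (the authors used Excel; the `tabulate` helper below is our own sketch, not part of the study's methods), the same calculation can be expressed as:

```python
from collections import Counter

def tabulate(responses):
    """Tabulate survey answers as percentages of total respondents.

    `responses` is a list of answer strings, one per respondent.
    Returns {answer: percentage of respondents, to one decimal place}.
    """
    counts = Counter(responses)
    total = len(responses)
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

# Response rate as reported in the paper: 99 of 125 PDs completed the survey.
response_rate = round(100 * 99 / 125, 1)  # 79.2%
```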

RESULTS

Ninety-nine of 125 PDs completed the survey (response rate, 79.2%). ED instruction (99%), case-based lectures (98%), didactic lectures (98%), computer-based learning (34%), and laboratories (12%) are used to teach interpretation of ECGs. Other modalities used include dedicated time with cardiologists (coronary care unit months, dedicated ECG months, cardiology ward months, and cardiology conferences) and individual reading. The highest perceived value was attributed to ED instruction, case-based lectures, and didactic lectures, with 96%, 98%, and 95% of PDs rating these as valuable or extremely valuable. By comparison, computer-based instruction and laboratory teaching were found to be valuable or extremely valuable by 58% and 25% of respondents, respectively. Of those programs that used computer-based learning, 70% of respondents found it to be valuable or extremely valuable.

Programs devoted different lengths of time to a formal ECG curriculum: one to three hours (11%), four to seven hours (35%), eight to ten hours (16%), and more than ten hours (37%). Twenty-three percent of PDs were comfortable or very comfortable with the ability of first-year residents to interpret ECGs, compared with 78% for second-year residents, 96% for third-year residents, and 91% for fourth-year residents or greater.

Observation in a clinical setting (96%), observation in a lecture setting (76%), hypothetical case scenarios (57%), informal testing (41%), and formal testing (27%) were the methods used most often to determine ECG competency. Other methods used to determine competency were mock oral board examinations, in-service examination scores, and performance on an ECG rotation. Clinical observation, hypothetical case scenarios, formal testing, and informal testing were rated as valuable or very valuable by 94%, 91%, 76%, and 70% of PDs, respectively, while lecture observation was found to be valuable or very valuable by 58%.
The personal/departmental ECG teaching file was the most commonly used resource (91%), while the Council of Emergency Medicine Residency Directors (CORD) ECG question bank (60%), test-based material (59%), and Internet tools (25%) were also used. Other resources included use of textbooks and computer programs such as the EKG Challenger, departmental files of the cardiology department, and materials from the American College of Cardiology program. The personal/departmental teaching file had the highest perceived value, with 93% finding it valuable or very valuable, compared with the CORD ECG question

bank (66%), test-based material (64%), and Internet-based tools (48%). Respondents' suggestions for improving ECG education included upgrading the CORD ECG question bank so that ECGs are easily downloadable and can be placed in PowerPoint presentations, and developing a formal standardized curriculum for teaching ECG interpretation.

DISCUSSION

The ability to rapidly interpret ECGs during clinical EM decision making is a critical skill. Our results describe the current state of ECG interpretation education and competency assessment in EM residency programs and report PDs' perceptions of the value of different teaching modalities and assessment methods, as well as the competence of residents in ECG interpretation.

Because of the volume of ECGs interpreted by residents, much of the learning experience occurs at the bedside. It makes sense that competence in ECG interpretation would improve with each successive training year, and our findings support this; PDs noted that they were, for the most part, comfortable with the ability of their third- and fourth-year residents to interpret ECGs. A study comparing EM residents and attending cardiologists found only a 13% discrepancy rate in ECG interpretation, and discordant interpretations did not change patient management.4 Similar studies found higher concordance rates between attending emergency physicians and cardiologists, with discordant readings not affecting patient care.5–7 This may suggest that faculty are doing an adequate job of teaching bedside ECG interpretation.

We found significant variability in the way that programs teach ECG interpretation. One way to place ECG training methodologies in an adult learning framework involves Kolb's learning theory.8 The spectrum of ECG education includes concrete experience (learning ECG interpretation at the bedside), reflective observation (attending lectures or observing others interpret ECGs), abstract conceptualization (reading ECG books), and active experimentation (computer-based modules). Clinical instruction in the ED (concrete experience) has the highest perceived value among PDs; in an EM program, this is the setting where residents get the greatest exposure to ECGs.
Lecture-based presentations (didactics and case-based material) are also perceived to be a valuable part of the curriculum, but less so than bedside ED instruction. It was noted that the "hands-on" teaching of ECGs in lectures, typically through the Socratic method, might work well to reinforce bedside teaching. The value of the Kolb framework is to show that any one method may not work for all residents, because adult learners assimilate information differently and learning styles must be tailored to the individual student. The PDs
were not as supportive of computer-based and laboratory-based instruction, with 53% and 28% ranking them as extremely valuable or valuable, respectively. However, 70% of those PDs who used computer-based instruction found it to be valuable. There are large differences among programs in the number of formal curriculum hours spent teaching ECG interpretation: slightly more than half of the responding programs spent more than eight hours per year on formal ECG instruction.

Programs use a variety of methods to determine resident competence in ECG interpretation. More than a fourth of the programs used formal testing, while 41% used informal testing; an example of informal testing is asking residents questions about ECGs at the point of care. Both informal testing and observation in a lecture setting received similar marks from PDs with respect to their value for learning. In contrast, PDs agreed that observation of residents in an ED setting is the most valuable way to determine competency. By our definitions, informal testing and observation in a clinical setting are similar; however, informal testing is active (asking questions), while observation is passive. Observation of resident performance in a real-life environment received the highest marks for determining competency, but it is more subjective than a formal approach. A combination of formal testing and clinical observation may provide the best way to determine competency.

Most of the responding PDs reported using a personal/departmental ECG databank. The CORD ECG question bank was reportedly used by approximately 60% of PDs surveyed, and more than half rated it a valuable resource. Emergency physicians receive training to interpret objective studies such as radiographs, ECGs, and ultrasound studies. Our study shows that EM residents are formally taught ECG interpretation during their training and that competency is verified with assessment tools.
Developing recommendations for minimal education, experience, training, and skills to determine a satisfactory level of competence in ECG interpretation during EM residencies would be valuable.

LIMITATIONS

Study limitations include those inherent to survey research, such as the potential bias of unintended
leading questions. The survey link was e-mailed only to EM PDs; however, because no unique identifiers were included, it was impossible to verify that only PDs responded. The opinions of PDs may be biased and may not reflect the opinions of the program faculty. The fact that 20% of the PDs did not respond to this survey may have resulted in a reporting bias. Future research might attempt to define and prospectively evaluate a formal ECG teaching curriculum.

CONCLUSIONS

There is wide variety in the curricula EM training programs use to teach ECG interpretation. Bedside teaching, clinical instruction in the ED, and lectures (didactics and case-based learning) are the most common teaching modalities and have the highest perceived teaching value. Computer-based learning and laboratories are used less frequently and have lower perceived value; however, computer-based learning is ranked higher among programs that use the resource. Our data suggest that EM PDs believe that residency is adequate preparation to interpret ECGs.

References

1. Rusnak RA, Stair TO, Hansen K, Fastow JS. Litigation against the emergency physician: common features in cases of missed myocardial infarction. Ann Emerg Med. 1989;18:1029–34.
2. Pope JH, Aufderheide TP, Ruthazer R, et al. Missed diagnoses of acute cardiac ischemia in the emergency department. N Engl J Med. 2000;342:1163–70.
3. Counselman FL, Sanders A, Slovis CM, Danzl D, Binder LS, Perina DG. The status of bedside ultrasonography training in emergency medicine residency programs. Acad Emerg Med. 2003;10:37–42.
4. Zappa MJ, Smith M, Li S. How well do emergency physicians interpret ECGs? [abstract]. Ann Emerg Med. 1991;20:463.
5. Kuhn M, Morgan MT, Hoffman JR. Quality assurance in the emergency department: evaluation of the ECG review process. Ann Emerg Med. 1992;21:10–5.
6. Westdorp EJ, Gratton MC, Watson WA. Emergency department interpretation of electrocardiograms. Ann Emerg Med. 1992;21:541–4.
7. Todd KH, Hoffman JR, Morgan MT. Effect of cardiologist ECG review on emergency department practice. Ann Emerg Med. 1996;27:16–21.
8. Telecommunications for Remote Work and Learning: Adult Learning Online. Available at: http://www.cybercorp.net/~tammy/lo/oned2.html. Accessed Mar 8, 2004.