
ORIGINAL RESEARCH

Musculoskeletal Ultrasound Training and Competency Assessment Program for Rheumatology Fellows

Eugene Y. Kissin, MD, Jingbo Niu, DSc, Peter Balint, MD, David Bong, MD, Amy Evangelisto, MD, Janak Goyal, MD, Jay Higgs, MD, Daniel Malone, MD, Midori J. Nishio, MD, Carlos Pineda, MD, Wolfgang A. Schmidt, MD, Ralf G. Thiele, MD, Karina D. Torralba, MD, MACM, Gurjit S. Kaeley, MBBS

Article includes CME test

Received January 8, 2013, from Boston University School of Medicine, Boston, Massachusetts USA (E.Y.K., J.N.); National Institute of Rheumatology and Physiotherapy, Budapest, Hungary (P.B.); Instituto Poal de Reumatología, Barcelona, Spain (D.B.); Arthritis, Rheumatic, and Back Disease Associates, Voorhees, New Jersey USA (A.E.); Raritan Bay Medical Center, Perth Amboy, New Jersey USA (J.G.); Department of Rheumatology, Brooke Army Medical Center, Lackland Air Force Base, San Antonio, Texas USA (J.H.); Excel Orthopedics, Beaver Dam, Wisconsin USA (D.M.); Nishio and Sharma MDs, Walnut Creek, California USA (M.J.N.); Department of Biomedical Research, Instituto Nacional de Rehabilitacion, Mexico City, Mexico (C.P.); Medical Center for Rheumatology Berlin-Buch, Berlin, Germany (W.A.S.); Department of Rheumatology, University of Rochester, Rochester, New York USA (R.G.T.); University of Southern California Keck School of Medicine, Los Angeles, California USA (K.D.T.); and Department of Rheumatology, University of Florida, Jacksonville, Florida USA (G.S.K.). Revision requested February 14, 2013. Revised manuscript accepted for publication March 12, 2013. We thank Drs Sebastian Arriola, Araceli Bernal, Efrén Canul, Mario Chavez, Jurgen Craig-Muller, Aleksander Feoktistov, Moises Gutiérrez, Cristina Hernández-Díaz, Fritz Hofmann, Robert Ike, Rafael Kieval, Paul Monach, Carlos Moya, Raúl Pacheco, Hugo Peña, Pedro Rodriguez, Burton Sack, Jonathan Samuels, Carla Solano, Darren Tabechian, Leobardo Terán, Lucio Ventura, Alvin Wells, and Thomas Wiegemann for help with the creation of the written examination; Drs Rany Al Haj and Paul DeMarco for help with the Angoff procedure; and Biosound Esaote, Diagnostic Instruments, Inc, GE Healthcare, and SonoSite, Inc, for supplying ultrasound equipment and technical expertise for the practical examination.
Funding for this project was provided by an American College of Rheumatology Research and Education Foundation Clinician Scholar Educator Award. The views expressed herein are those of the authors and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the Surgeon General, the US Department of the Army, the US Air Force, the US Department of Defense, or the US Government. Address correspondence to Eugene Y. Kissin, MD, Arthritis Center, Boston University School of Medicine, 72 East Concord St, Evans 506, Boston, MA 02118 USA. E-mail: [email protected] doi:10.7863/ultra.32.10.1735

Objectives—The purpose of this study was to establish standards for musculoskeletal ultrasound competency through knowledge and skills testing using criterion-referenced methods.

Methods—Two groups of rheumatology fellows trained in musculoskeletal ultrasound through a standardized curriculum, which required submission of ultrasound studies for review over 8 months. Both groups then completed written and practical examinations in musculoskeletal ultrasound. Instructors, advanced users, and intermediate users of musculoskeletal ultrasound served as comparison groups. A passing score (competency) was established for the written examination by the Angoff procedure and for the practical examination by the borderline method.

Results—Thirty-eight fellows (19 in each group) took the final examination. Five fellows failed the written examination, and 1 failed the practical examination, whereas none of the advanced users failed. Written examination scores did not differ between the two fellow groups (74% versus 70%; P > .05), were reliable, and were able to discriminate between the intermediate and advanced groups. Practical and written examination results correlated in both groups (first group, r = 0.70; P = .0008; second group, r = 0.59; P = .009).

Conclusions—Criterion-referenced methods were used for the first time to determine fellow musculoskeletal ultrasound competency. The examination used to determine competency was reproducible, was reliable, and could differentiate musculoskeletal ultrasound users with different levels of experience. Most rheumatology fellows completing our program passed the written and practical examinations, suggesting achievement of basic musculoskeletal ultrasound competency.

Key Words—competency; education; examination; musculoskeletal; ultrasound

Use of musculoskeletal ultrasound by rheumatologists has grown rapidly in the United States.1 Musculoskeletal ultrasound has been shown to detect tendon and joint abnormalities earlier and with greater precision than the physical examination and to increase the accuracy of joint and soft tissue injections, making it valuable to the rheumatologist.2 Despite increased use, most fellowship programs have not implemented a standardized curriculum in musculoskeletal ultrasound. Published literature does not reflect agreement on standards for competency testing in musculoskeletal ultrasound for practitioners of any specialty.3

©2013 by the American Institute of Ultrasound in Medicine | J Ultrasound Med 2013; 32:1735–1743 | 0278-4297 | www.aium.org


Kissin et al—Musculoskeletal Ultrasound Competency

Competence is generally defined as the state of having the requisite ability to perform a job properly.4 In criterion sampling, one defines "observable behaviors" related to job performance by a group of individuals who perform a job well,5 and competence is defined relative to these behaviors. The process of ensuring competency in musculoskeletal ultrasound is critical for optimizing patient care as well as containing musculoskeletal ultrasound overuse.6 As a general principle, the assessment of competency must reflect both the knowledge and skill that are required for job performance.7 Applying the Dreyfus model8 to musculoskeletal ultrasound, we propose that the competent sonographer should be able to obtain standard images and interpret images of common pathology, whereas a proficient sonographer should be able to obtain these images in a defined period and interpret pathology and anatomic variants in patients with a wider array of disease processes. Current competency guidelines propose amounts of training based on the opinions of leaders in the field, and these vary considerably from one organization to another, ranging from 500 musculoskeletal ultrasound scans (for nonradiologists) by the American College of Radiology9 to 150 musculoskeletal ultrasound scans by the American Institute of Ultrasound in Medicine.10 However, many experts in the field also understand that the amount of training required for competency varies widely among trainees.3 The main objective of this study was to establish standards for fellow musculoskeletal ultrasound competency through criterion-referenced knowledge and skills testing.

Materials and Methods

Fellow Recruitment
In the 2009 and 2010 academic years, rheumatology fellows in the United States were invited to participate in an 8-month curriculum of musculoskeletal ultrasound training followed by a standardized examination of musculoskeletal ultrasound knowledge and skill. The invitation to participate in the program was sent to all fellowship program directors of rheumatology subspecialty training programs in the country. Inclusion criteria were as follows: (1) access to equipment for musculoskeletal ultrasound imaging; and (2) consent of the applicant's program director for participation. The study was given exempt status by the Boston University Institutional Review Board.

Teaching Curriculum
The curriculum was based on a model previously described by Filippucci et al11 and developed by consensus among 10 faculty members (American rheumatologists with >4 years of experience in musculoskeletal ultrasound and previous experience in teaching musculoskeletal ultrasound workshops). The curriculum consisted of the following: (1) studying standardized scanning and pathology guides specific for rheumatic diseases; (2) reading assigned rheumatic disease literature on musculoskeletal ultrasound; (3) submitting protocol-driven12 ultrasound studies of 8 peripheral joint regions (hand, wrist, elbow, shoulder, hip, knee, ankle, and foot/toe) for remote critique, with an established goal of 45 studies overall and 5 studies per joint region; and (4) attending a 2.5-day, 21-hour musculoskeletal ultrasound workshop with half of the time dedicated to supervised hands-on training. For example, in the hip joint, the trainee was expected to adjust the equipment and produce images of the acetabulum, labrum, femoral head and neck, joint capsule, and iliopsoas muscle in longitudinal and transverse planes anteriorly, as well as to demonstrate the greater trochanter and the insertion of the gluteus medius tendon laterally.

The total number of requested studies was derived from data published by Filippucci et al,11 which showed that among students performing between 1 and 5 musculoskeletal ultrasound studies per week, slightly more than half passed the final examination, whereas all of those submitting fewer studies failed the examination, and almost all of those submitting more passed it. We thus set the number at which fellows would have a reasonable chance of passing the examination while maximizing the likelihood of their reaching the desired threshold of submitted scans.

The curriculum differed between 2008–2009 and 2009–2010 in that feedback on submitted studies was given on just 10% of submissions in 2008–2009 (2009 group) and on 50% of submissions in 2009–2010 (2010 group). This change was made in response to trainee feedback after the first year of the program and was made possible by additional mentor time availability. Feedback aimed to improve 3 features of the submitted images: (1) machine settings (adjustment of depth, frequency, focal zones, gain, pulse repetition frequency, and transducer type used); (2) transducer positioning (rotation, angulation, pressure, and placement over the correct bony landmarks); and (3) image interpretation (normal, abnormal, or equivocal findings). At the completion of the curriculum, fellows' evaluations of the program and of their own skills in musculoskeletal ultrasound were solicited.

Assessment

Multiple-Choice Question Examination
The intent of the multiple-choice question examination
was to assess trainees' knowledge of rheumatologic musculoskeletal ultrasound pathology. Ten faculty members created multiple-choice questions, each consisting of ultrasound images for interpretation, a question stem, and 5 answer choices. The multiple-choice questions covered 8 peripheral joint regions and conditions common in rheumatology practice. By consensus, 107 questions were kept for the examination. These 107 questions were then tested in 2 preliminary groups: 5 international rheumatologists with expertise (average of >6 years of experience) in musculoskeletal ultrasound and 3 rheumatologists who were novices in musculoskeletal ultrasound (no formal training or scanning experience). Questions answered correctly by all 3 novice rheumatologists were excluded because they did not require musculoskeletal ultrasound training to determine the correct answer. Questions answered incorrectly by most of the international rheumatologists were excluded because they may have had unclear wording, had unclear images, or covered esoteric topics leading to wrong answers. Thus, 76 multiple-choice questions remained for the final examination.

The examination was validated in 4 groups of participants: 2009 fellows (n = 19); 2010 fellows (n = 19); postfellowship rheumatologists with 6 months of training, participation in a basic musculoskeletal ultrasound course, and at least 100 ultrasound studies, designated as "intermediate" through an alternative training program13 (intermediate group; n = 7); and rheumatologists with more than 1000 ultrasound studies performed or greater than 2 years of experience in musculoskeletal ultrasound (advanced group; n = 12). These last 2 groups were used to establish criterion validity of the written examination by evaluating its capacity to discriminate between groups of examinees with different levels of musculoskeletal ultrasound experience. The validation process thus included steps to ensure that the examination reflected the knowledge required to perform musculoskeletal ultrasound studies, exclude unclear or esoteric questions, and discriminate between partially and fully trained practitioners.

Practical Examination
The focus of the practical examination was to evaluate the trainees' competency in generating adequate ultrasound images. An 8-station practical examination used the following methods to maximize objectivity and standardization in all 8 normal joint areas. Each fellow rotated through every station, where a healthy volunteer was examined for 15 minutes using a previously taught scanning protocol while being directly observed by a faculty proctor. In addition to the fellows, 2 musculoskeletal ultrasound instructors also rotated through the stations. The station proctor ensured standardization of the protocol and blinded labeling of the obtained images to hide the identity of the scanning physician from the reviewers. To avoid bias from trainee experience with a particular brand of ultrasound machine, trainees adjusted machine settings (M-Turbo [SonoSite, Inc, Bothell, WA], LOGIQ e [GE Healthcare, Waukesha, WI], MyLab 25 [Biosound Esaote, Indianapolis, IN], and MindRay M5 [Diagnostic Instruments, Inc, Sterling Heights, MI]) verbally through the proctor. Following the examination, each ultrasound image was independently graded by 2 instructors in a blinded manner on a predefined 0- to 5-point scale (Figure 1). The grading was predicated on the following image qualities: clarity of the bone contour; avoidance of anisotropy (darkening of the echogenicity of a structure produced when the transducer is angled obliquely to it); transducer positioning, pressure, and orientation relative to structures of interest; and equipment settings. Due to technical equipment problems, the results from the ankle station in 2009 were excluded from analysis.

Setting of the Competency/Passing Score
The Angoff procedure was used to determine the passing score on the multiple-choice examination.14 The passing score was calculated as the average of all subject matter experts' scores for all questions. Pass-fail scores determined by the Angoff procedure are reasonably stable across time and across panels of well-qualified judges.15 For the practical examination, the borderline group method was used to set the passing score, as it was specifically developed for performance-based examinations and is reliable and appropriate for this use.16,17 The average composite checklist score for each practical station that received an average global score of 2 (barely adequate image) was used to establish the borderline passing score.

Statistical Analysis
Median scores are reported. The significance of the program completion rate comparison and the practical examination station failure rates was determined with a χ2 test. The significance of other comparisons between groups was determined with a Wilcoxon 2-sample test. The reliability of the multiple-choice examination was calculated by the Cronbach α coefficient. Correlations between multiple-choice and practical examination results, as well as between examination scores and training characteristics, were determined with the Spearman correlation coefficient. The comparison between fellow and instructor practical examination results was made with a Wilcoxon 2-sample test. All statistical analyses were performed with the SAS version 9.2 statistical software package (SAS Institute Inc, Cary, NC).
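The two standard-setting computations described above reduce to simple averages over judge ratings and over borderline examinees' checklist scores. The sketch below is a minimal illustration of each method; the judge ratings and station scores are hypothetical placeholders, not data from this study.

```python
# Illustrative sketch of the two standard-setting methods described in the
# text (Angoff for the written examination, borderline-group for the
# practical examination). All numbers are hypothetical, not study data.

def angoff_passing_score(judge_ratings):
    """Angoff method: each judge estimates, for every question, the
    probability that a borderline-competent examinee answers it correctly;
    the passing score is the mean over all judges and all questions."""
    ratings = [r for judge in judge_ratings for r in judge]
    return sum(ratings) / len(ratings)

def borderline_passing_score(station_results):
    """Borderline-group method: the passing score is the mean composite
    checklist score of examinees whose global rating was 2
    ('barely adequate image')."""
    borderline = [checklist for checklist, global_rating in station_results
                  if global_rating == 2]
    return sum(borderline) / len(borderline)

# Three hypothetical judges rating four questions each (probability of a
# correct answer expected from a borderline examinee):
judges = [
    [0.60, 0.70, 0.55, 0.65],
    [0.65, 0.75, 0.60, 0.60],
    [0.55, 0.70, 0.50, 0.70],
]
print(f"Angoff passing mark: {angoff_passing_score(judges):.0%}")

# Hypothetical (checklist score, global rating) pairs from one station:
station = [(2.4, 2), (3.1, 3), (2.6, 2), (4.0, 4), (2.9, 2)]
print(f"Borderline passing mark: {borderline_passing_score(station):.2f}")
```

In this toy example, only the three examinees with a global rating of 2 contribute to the borderline passing mark, mirroring how the study anchored the bar to "barely adequate" performances rather than to an arbitrary percentage.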

Results

Multiple-Choice Question Examination Contents
The multiple-choice questions included all joint regions and the topics of normal anatomy, artifacts, synovitis, tendinopathy, effusion, erosion, calcification, gout, bursitis, osteophytes, soft tissue edema, and median nerve entrapment. Sixty-one of the questions related to musculoskeletal ultrasound pathology. Sixty-two fell under the competency designation of "must know," 4 under "should know," and 10 under "could know" according to a study by Brown et al.18 Alternatively, under the musculoskeletal ultrasound training levels as defined by the European League Against Rheumatism, 43 of the questions fell under "basic," 30 under "intermediate," and 3 under "advanced" designations.19

Standard Setting
The Angoff procedure resulted in setting the passing mark at 63% correct for the multiple-choice examination. The borderline method resulted in setting the passing mark at 2.7 overall (0–5 scale) for the practical examination, with the following passing marks for each station: hand, 2.99; wrist, 2.49; elbow, 2.74; shoulder, 3.12; hip, 2.71; knee, 2.52; ankle, 2.27; and foot/toe, 2.78. Thus, a participant could fail a particular station but still pass the practical examination overall as long as the total practical score for all stations was 2.7 or higher.

Fellow Participation

Figure 1. Examples of global grading for an anterior longitudinal scan of the hip (2009 examination). Image deficiencies are as follows: A, acetabulum not shown, femoral neck barely visible, capsule not visible; B, acetabulum not shown, femoral neck indistinct, capsule indistinct; C, capsule indistinct; D, all features shown well, but depth could be decreased slightly. The panels were graded as follows: A, 1 (image not adequate for drawing clinical conclusions); B, 2 (image barely adequate for drawing clinical conclusions); C, 3 (adequate, average image); D, 4 (adequate, good image). There were no studies of the hip graded 5 (adequate, publication-quality image) or 0 (missing). Ac indicates acetabulum; Ca, joint capsule; FH, femoral head; and FN, femoral neck.

A total of 59 fellows from 26 fellowship programs enrolled (a volunteer convenience sample), of whom 27 were first-year fellows, and 23 had previously taken a workshop on musculoskeletal ultrasound. Thirty-eight fellows (64%) representing 23 fellowship programs completed the training period and the final examination, with completion rates of 78% among first-year fellows and 53% among second- and third-year fellows (P < .05, χ2 test). Conversely, first-year fellows constituted 55% of completers and 28% of noncompleters. Precourse workshops in musculoskeletal ultrasound had been taken by 45% of completers and 57% of noncompleters. Fifty-eight percent of course completers were female, whereas 43% of noncompleters were female. The most common reason for dropping out was lack of sufficient time or motivation (13 fellows), followed by lack of functional equipment availability (3 fellows). Of the 38 fellows who took the final examination, 19 (50%) submitted at least 45 studies for review during the program, with an average of 41 studies submitted in both the 2009 and 2010 groups (Table 1).

Table 1. Participant Characteristics and Outcomes

Participants                     Fellows Enrolled   Fellows Graduated   Studies Submitted   Fellows With ≥8 h MSUS CME Before Program   1st-y Fellows   MCQ Results   PrE Results
2009 fellows                     33                 19                  41 ± 9              16                                          15              72 ± 10       3.08 ± 0.18
2010 fellows                     26                 19                  41 ± 16a            7                                           12              70 ± 7a       3.45 ± 0.15b
1st-y rheumatology fellows       27                 21                  42 ± 11             7                                           NA              71 ± 10       3.22 ± 0.24
2nd/3rd-y rheumatology fellows   32                 17                  39 ± 15a            16                                          NA              72 ± 7a       3.32 ± 0.24a
All fellows                      59                 38                  41                  23                                          27              71 ± 8        3.26 ± 0.25

Values are presented as median ± SD where applicable. CME indicates continuing medical education; MCQ, multiple-choice question examination; MSUS, musculoskeletal ultrasound; NA, not applicable; and PrE, practical scanning examination.
aP > .05 (not significant, Wilcoxon 2-sample test).
bP < .0001 versus 2009 fellows.

Feasibility
Implementation of the training and testing procedure required the following time investments from the faculty: (1) Web-based ultrasound study review and feedback generation, 5 to 10 minutes per study; (2) the 21-hour hands-on workshop, 1 faculty member per 5 fellow participants; and (3) the practical examination, 8 faculty members to proctor the examination stations for 1 day. Each fellow required 2 hours for the practical examination; thus, no more than 32 fellows could be tested in 1 day with 8 faculty members.

Multiple-Choice Question Examination Results
The multiple-choice examination was reliable (Cronbach α = .70)20 and discriminated between the intermediate and advanced groups (median, 63% versus 79%; P = .003, Wilcoxon 2-sample test; Figure 2). Three fellows in the 2009 group and 2 fellows in the 2010 group scored at or below the competency passing score (13% failure rate). Four of 7 intermediate-group participants failed the written examination, whereas none of the advanced participants failed. The fellows' median multiple-choice examination score was higher than that of the intermediate group (72% versus 63%; P = .01). Fellows' median multiple-choice examination scores in the 2009 and 2010 groups were not significantly different (74% versus 70%; P > .05), did not differ between first-year and second/third-year rheumatology fellows (71% versus 72%; P > .05), and did not correlate with either the number of preprogram workshop hours of training in musculoskeletal ultrasound or the total number of ultrasound studies submitted for review during the program.

Practical Examination Results
The median practical examination scores were not significantly different between the faculty members and fellows in either year (2009 group, ankle excluded, 3.34 versus 3.07; P = .08; 2010 group, 3.50 versus 3.45; P = .60; Figure 3). Performance on the practical examination correlated with the written examination results in both years (2009 group, r = 0.70; P = .0008; 2010 group, r = 0.59; P = .009). The 2010 group scored significantly higher than the 2009 group on the practical examination (3.45 versus 3.08; P < .0001; Table 1). The practical examination stations had Cronbach α reliabilities ranging from .59 for the wrist to .75 for the foot/toe, with a median of .65. One fellow scored significantly lower than the faculty on the practical examination overall and failed the practical examination based on the borderline method passing score. This fellow passed the multiple-choice examination. On average, fellows scored similarly to experts in the various joint regions (Table 2). There was a suggestion of a correlation between the number of studies submitted and the practical examination results in the 2010 group (Spearman r = 0.55; P = .04) but not in the 2009 group (Spearman r = –0.03; P > .05). The practical examination results did not correlate with the number of precourse workshop hours of training for either group (r = –0.18; P > .05; r = 0.21; P > .05) or with the duration of rheumatology fellowship training (median ± SD, 3.23 ± 0.24 versus 3.35 ± 0.24; P > .05).

Fellow Feedback
An anonymous fellow feedback questionnaire distributed after the final examination was returned by 37 of 38 fellows. Eight of 37 felt comfortable independently scanning all 8 joint regions; 22 of 37 were comfortable scanning 6 or 7 of the 8 regions; and 7 of 37 were comfortable with fewer than 6 of the 8 regions. The regions most frequently listed by fellows as difficult were the ankle (22 of 37), shoulder (14 of 37), and hip (13 of 37). All participants thought that the examination accurately reflected the material taught during the course.
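The Cronbach α coefficients reported for the multiple-choice examination and the practical stations can be computed directly from an examinee-by-item score matrix. Below is a minimal, self-contained sketch of the standard formula on a hypothetical response matrix (1 = correct, 0 = incorrect), not the study's data.

```python
# Cronbach's alpha from an examinee-by-item matrix of scored responses.
# The response matrix below is hypothetical, not data from the study.

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    where k is the number of items; population variances are used."""
    k = len(scores[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [  # rows: examinees; columns: items (1 = correct)
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]
print(f"Cronbach alpha: {cronbach_alpha(responses):.2f}")
```

Higher α indicates that items vary together across examinees; values near .70, like the written examination's, are conventionally taken as acceptable internal consistency for this kind of test.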

Discussion

In this work, criterion-referenced methods were used for the first time to determine fellow musculoskeletal ultrasound competency. The examination used to determine competency was reproducible, was reliable, and could differentiate musculoskeletal ultrasound users with different levels of experience. Most fellows who completed the 8-month program achieved a level of essential musculoskeletal ultrasound knowledge (image interpretation) comparable to that of the advanced musculoskeletal ultrasound group (as assessed by the multiple-choice examination), and their level of basic scanning skill (image generation) for normal anatomy was similar to their mentors' criterion-referenced competency for those tasks (as assessed by the practical examination). The fellows also achieved competency according to the adapted Dreyfus model8 by generating adequate images in a defined period and interpreting heterogeneous pathologic images comparably to proficient reference groups.

Figure 3. Results of the practical examination by year for fellows (n = 19 each) and by faculty experts (n = 2 each). The passing mark of 2.70 was determined by the borderline method.

Figure 2. Results of the multiple-choice question examination for the following groups: 2010 fellows (n = 19), 2009 fellows (n = 19), rheumatologists with intermediate musculoskeletal ultrasound training (n = 7), and rheumatologists with more than 1000 ultrasound studies performed or greater than 2 years of experience in musculoskeletal ultrasound (Advanced; n = 7). The passing mark of 63 was determined by the Angoff procedure.

Table 2. Detailed Results of the Practical Examination: Fellow-Instructor Score Comparison and Fellow Failure Rates

                          2009 (19 Fellows, 2 Instructors)                    2010 (19 Fellows, 2 Instructors)
Station                   Fellow   Instructor   n Failure/n Total, %          Fellow   Instructor   n Failure/n Total, %
Hand                      3.33     3.46         5                             3.83     3.71         0
Wrist                     2.41     2.80         63                            3.15     3.16         5
Elbow                     2.92     3.00         32                            3.35     3.54         5
Shoulder                  3.21     3.58         47                            3.26     3.46         32
Hip                       2.83     3.28         42                            3.60     3.46         0
Knee                      3.08     4.01         0                             3.54     3.79         0
Ankle                     2.17a    2.12a        84a                           3.09     3.35         5
Foot/toe                  3.73     3.25         5                             3.48     3.50         0
Total (excluding ankle)   3.07     3.34         28                            3.47     3.52         6

Median scores are shown. There were no statistical differences between fellow and instructor scores in any of the stations except the knee station in the 2009 group (P < .05, Wilcoxon 2-sample test).
aThe ankle station was excluded from analysis due to technical difficulties with the ankle station in 2009.

Although similar programs have been developed previously in Europe,11,21 this work differed primarily in the attempt to develop and validate an assessment of musculoskeletal ultrasound competency through a process of criterion sampling for both image acquisition and interpretation, as well as in using previously defined and accepted methods for determining passing/competency benchmarks. Furthermore, we developed a testing instrument that can reflect changes in musculoskeletal ultrasound learning, as the examination could discriminate between gradations of experience in musculoskeletal ultrasound. The examination procedure was repeated in 2 successive years with 2 cohorts of rheumatology fellows to ensure reproducibility. The correlation we found between the written (multiple-choice) and practical portions of the examination was stronger than that found in most studies comparing performance on written and practical examinations,22,23 giving the examination criterion validity, as did the finding that 57% of the intermediate group failed the multiple-choice examination, whereas none of the advanced group failed.

The 2009 group received minimal feedback, and the 2010 group received moderate feedback during the course; although these groups performed similarly on the multiple-choice examination, the practical examination score was significantly higher in the group that received more feedback. Furthermore, although there was no correlation between the number of studies submitted and the practical examination results in the 2009 group, there was a trend toward a correlation between the number of studies submitted and the practical examination results in the 2010 group. Although these differences could have been due to other factors, the major distinction between the two groups of fellows was the degree of feedback given to them throughout the program, implying that mentor feedback is more important for performance on the practical examination (skill) than on the written examination (knowledge). Previous studies of the effects of feedback on physicians' clinical performance have been equivocal, but a systematic review of this subject found that most studies did find a positive effect when feedback was provided by an authoritative source over an extended period.24 That review did not attempt to distinguish the effects of feedback on physician knowledge from the effects on physician skill.
Only 1 randomized controlled trial demonstrated improved physician performance in the group allocated to multisource feedback, and that study found an improvement in communication skills.25

Additional strengths include generalizability to the scanning of normal musculoskeletal anatomy, as the practical examination tested patient positioning, transducer manipulation, and machine setting adjustment in all of the joint regions routinely examined by musculoskeletal ultrasound.

Limitations of the study included the lack of detailed information about the local ultrasound mentors at the fellows' respective institutions and the degree of their involvement with fellow training. Also, fellow skill in practical scanning of pathologic conditions was not assessed, and practical examination scores between the fellow groups and the instructors might have been different in this setting.26 However, a standardized practical examination of musculoskeletal pathology that represents a wide enough spectrum of disease to be generalizable and reproducible would not be practical. Our study also suffered from a small sample size: 59 fellows started the program and 38 finished, representing 23 fellowship programs. Relative to the 108 adult rheumatology fellowship programs in the United States27 and the 179 fellowship spots in rheumatology training programs obtained through the National Resident Matching Program in 2011,28 the fellows who completed the ultrasound program and took the examination represent approximately 21% of fellowship programs and 11% of rheumatology fellows in the United States. The program completion rate of 64% was slightly higher than would be expected based on prior reports: In the Belfast musculoskeletal ultrasound course,21 5 of 10 participants completed the course (50% dropout rate), whereas in the study by Filippucci et al,11 only 30% of the initial candidates submitted ultrasound studies for review.

It is also important to note that our results can only be extrapolated to fellows who complete a training program in musculoskeletal ultrasound on a volunteer basis. Fellows who are required to complete such a program may have less interest and motivation and thus may not perform as well on the final examination. Finally, our analysis was limited by the lack of pre–musculoskeletal ultrasound training test results. However, pretesting and posttesting pose an inherent threat to validity through a "response shift bias."

Although the practical and written examination data were fairly well correlated, there were still significant differences in fellow performance between the two parts of the examination, and in fact, the fellow who failed the practical examination would have passed the total examination if it had included only the multiple-choice portion. Although this was only 1 fellow out of 38, the fellows also knew that they would be tested with both written and practical examinations and thus prepared for both.
Had the examination consisted of only a written portion, participants might have expended less effort on learning ultrasound scanning skills, and fewer would have become competent in this component of musculoskeletal ultrasound use. The value of the practical examination also lies in linking the examination more closely to the actual job tasks of performing musculoskeletal ultrasound studies than a multiple-choice written examination can. Although further research is needed to establish the optimal methods for teaching musculoskeletal ultrasound in a competency-based curriculum, this study suggests a method for establishing a standardized, blended teaching program in musculoskeletal ultrasound that rheumatology fellows from across the United States, with or without local mentors, could complete in the context of their other duties over 8 months. This approach contrasts with the 4 months of full-time dedication to all aspects of ultrasound training, taught in 1-month blocks over a 4-year radiology residency program,29 and implies that high-quality musculoskeletal ultrasound training can be accomplished primarily through Web-based instruction augmenting short hands-on workshops, including in other specialties using point-of-care ultrasound where a local mentor may not be available. How musculoskeletal ultrasound practitioners trained in this way perform on certification examinations administered by other agencies remains to be determined. Most importantly, this study demonstrates that the results of a musculoskeletal ultrasound training program can be evaluated with a criterion-referenced examination (borderline method and Angoff procedure) to determine basic musculoskeletal ultrasound competency, and that such testing distinguishes fellows who perform a task similarly to experienced practitioners, the ultimate criterion reference. As our results indicate, individual trainee performance varied significantly on both parts of the examination each year despite standardized teaching methods and training time. Credentialing requirements based on length of training rather than on competency-based milestones may hold back trainees who are ready to be credentialed while credentialing trainees who are not yet ready. Criterion-referenced passing standards should therefore be used by agencies currently undertaking the task of offering certification examinations to establish musculoskeletal ultrasound competency.
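For readers unfamiliar with the two standard-setting methods named above, the following Python sketch illustrates how an Angoff cut score and a borderline-group cut score are typically computed. It is purely illustrative: the function names and all numbers are invented for the example and do not come from this study's examination data.

```python
def angoff_cut_score(judge_estimates):
    """Angoff procedure: each judge estimates, for every item, the
    probability that a minimally competent examinee answers it correctly.
    The cut score is the mean of those estimates across judges, averaged
    over items, returned here as a proportion of the maximum score.

    judge_estimates: one list of per-item probabilities per judge.
    """
    n_judges = len(judge_estimates)
    n_items = len(judge_estimates[0])
    # Average the judges' estimates for each item, then average over items.
    item_means = [
        sum(judge[i] for judge in judge_estimates) / n_judges
        for i in range(n_items)
    ]
    return sum(item_means) / n_items


def borderline_group_cut_score(scores, global_ratings):
    """Borderline group method: examiners assign each examinee a global
    rating (e.g., "pass", "borderline", "fail") alongside a checklist
    score; the cut score is the mean checklist score of the examinees
    rated "borderline".
    """
    borderline = [s for s, r in zip(scores, global_ratings) if r == "borderline"]
    return sum(borderline) / len(borderline)


# Invented example: 2 judges rating a 4-item examination.
print(angoff_cut_score([[0.8, 0.6, 0.9, 0.5],
                        [0.7, 0.5, 0.8, 0.6]]))  # 0.675

# Invented example: 5 examinees with checklist scores and global ratings.
print(borderline_group_cut_score(
    [62, 55, 48, 71, 50],
    ["pass", "borderline", "fail", "pass", "borderline"]))  # 52.5
```

Both methods yield a criterion-referenced passing standard tied to a definition of minimal competence rather than to the performance distribution of the cohort, which is the property the discussion above relies on.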

References

1. Samuels J, Abramson SB, Kaeley GS. The use of musculoskeletal ultrasound by rheumatologists in the United States. Bull NYU Hosp Jt Dis 2010; 68:292–298.
2. American College of Rheumatology Musculoskeletal Ultrasound Task Force. Ultrasound in American rheumatology practice: report of the American College of Rheumatology Musculoskeletal Ultrasound Task Force. Arthritis Care Res (Hoboken) 2010; 62:1206–1219.
3. Brown AK, Roberts TE, O’Connor PJ, Wakefield RJ, Karim Z, Emery P. The development of an evidence-based educational framework to facilitate the training of competent rheumatologist ultrasonographers. Rheumatology (Oxford) 2007; 46:391–397.
4. Competence. Merriam-Webster.com website; 2011. http://www.merriam-webster.com.
5. McClelland DC. Testing for competence rather than for “intelligence.” Am Psychol 1973; 28:1–14.
6. Nazarian LN, Alexander AA. Denying payments for musculoskeletal ultrasound: how did we get here? J Am Coll Radiol 2010; 7:553–556.
7. Browning A, Bugbee A, Mullins M. Certification: A NOCA Handbook. Washington, DC: National Organization for Competency Assurance; 1996.


8. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood) 2002; 21:103–111.
9. American College of Radiology. ACR-SPR-SRU practice guideline for performing and interpreting diagnostic ultrasound examinations. American College of Radiology website; 2011. http://www.acr.org/~/media/ACR/Documents/PGTS/guidelines/US_Performing_Interpreting.pdf.
10. American Institute of Ultrasound in Medicine. Training guidelines for physicians who evaluate and interpret diagnostic musculoskeletal ultrasound examinations. American Institute of Ultrasound in Medicine website; 2013. http://www.aium.org/officialStatements/51.
11. Filippucci E, Meenagh G, Ciapetti A, Iagnocco A, Taggart A, Grassi W. E-learning in ultrasonography: a web-based approach. Ann Rheum Dis 2007; 66:962–965.
12. Backhaus M, Burmester GR, Gerber T, et al. Guidelines for musculoskeletal ultrasound in rheumatology. Ann Rheum Dis 2001; 60:641–649.
13. Pineda C, Filippucci E, Chávez-López M, et al. Ultrasound in rheumatology: the Mexican experience. Clin Exp Rheumatol 2008; 26:929–932.
14. Angoff WH. Proposal for theoretical and applied development in measurement. Appl Meas Educ 1988; 1:215–222.
15. Norcini JJ, Shea JA. The reproducibility of standards over groups and occasions. Appl Meas Educ 1992; 5:63–72.
16. Smee SM, Blackmore DE. Setting standards for an objective structured clinical examination: the borderline group method gains ground on Angoff. Med Educ 2001; 35:1009–1010.
17. Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE: objective structured clinical examinations. Med Educ 2003; 37:132–139.
18. Brown AK, O’Connor PJ, Roberts TE, Wakefield RJ, Karim Z, Emery P. Ultrasonography for rheumatologists: the development of specific competency based educational outcomes. Ann Rheum Dis 2006; 65:629–636.
19. Naredo E, Bijlsma JW, Conaghan PG, et al. Recommendations for the content and conduct of European League Against Rheumatism (EULAR) musculoskeletal ultrasound courses. Ann Rheum Dis 2008; 67:1017–1022.
20. Nunnaly JC. Psychometric Theory. 2nd ed. New York, NY: McGraw-Hill; 1978.
21. Taggart AJ, Wright SA, Ball E, Kane D, Wright G. The Belfast musculoskeletal ultrasound course. Rheumatology (Oxford) 2009; 48:1073–1076.
22. Kramer AW, Jansen JJ, Zuithoff P, et al. Predictive validity of a written knowledge test of skills for an OSCE in postgraduate training for general practice. Med Educ 2002; 36:812–819.
23. Garry RC. Correlation between intelligence tests and written and practical examinations. Br Med J 1930; 2:217.
24. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians’ clinical performance: BEME Guide No. 7. Med Teach 2006; 28:117–128.

J Ultrasound Med 2013; 32:1735–1743

25. Brinkman WB, Geraghty SR, Lanphear BP, et al. Effect of multisource feedback on resident communication skills and professionalism: a randomized controlled trial. Arch Pediatr Adolesc Med 2007; 161:44–49.
26. D’Agostino MA, Maillefert JF, Said-Nahal R, Breban M, Ravaud P, Dougados M. Detection of small joint synovitis by ultrasonography: the learning curve of rheumatologists. Ann Rheum Dis 2004; 63:1284–1287.
27. American College of Rheumatology. Adult and pediatric rheumatology training programs. American College of Rheumatology website; 2013. http://www.rheumatology.org/education/training/programs.asp.
28. National Resident Matching Program website; 2012. http://www.nrmp.org.
29. Roemer FW, van Holsbeeck M, Genant HK. Musculoskeletal ultrasound in rheumatology: a radiologic perspective. Arthritis Rheum 2005; 53:491–493.
