9 Fitzwilliam Street
Milton Keynes
MK3 6DF
[email protected]

07 April 2017

Dr Colin Melville
Director of Education Standards
General Medical Council
Hardman Street
Manchester, M3 3AW

Dear Colin,

RE: UNIVERSITY OF BUCKINGHAM MEDICAL SCHOOL (UBMS) ASSESSMENTS

The GMC's response dated 2 April 2017 has prompted further questions surrounding medical school assessment practices and the law, as well as UBMS assessments. The GMC enforced new assessment standards for medical schools in January 2016, replacing the standards outlined in Tomorrow's Doctors (2009). It is therefore useful to consider the legal implications of medical school assessment practices before and after January 2016. The present correspondence highlights six issues relevant to statutory requirements, GMC inspections and UBMS assessment practices, and concludes with questions to the GMC.

1. Legal Status of University Assessments

The Further and Higher Education Act 1992 suggests that assessments which do not meet the standards of public bodies may be unlawful. S76.1-76.2 states that degrees may be awarded to persons who 'complete an appropriate course of study and satisfy an appropriate assessment', where 'assessment' includes 'examination and test'. S76.6 states that degrees can only be awarded in accordance with 'instruments to or regulating the institution…and the assessments'. These instruments refer to universities' charters and statutes.

Universities' charters and statutes typically require institutions to comply with guidance from public regulating bodies pertaining to assessments. For example, the University of Buckingham's charter (4.k) suggests the university is legally bound to co-operate with public bodies such as the GMC and the Quality Assurance Agency for Higher Education (QAA) regarding the conduct of examinations. University exams which fail to meet the requirements of relevant regulators may therefore be unlawful.

2. Legal Implications of GMC Standards: 2009 to 2015

The GMC's Tomorrow's Doctors (2009) and its accompanying Assessment in Undergraduate Medical Education (2009) provided medical schools with over one hundred assessment standards. These standards were applicable until 31 December 2015, after which the GMC's Promoting Excellence (2015) was enforced. There is reason to believe that medical school assessment practices which were inconsistent with the GMC's standards from 2009 to 2015 were unlawful.


3. Legal Implications of GMC Standards after January 2016

The legal status of medical school assessments following the January 2016 enforcement of Promoting Excellence (2015) is less clear. The comprehensive standards contained within Tomorrow's Doctors (2009) and its accompanying Assessment in Undergraduate Medical Education (2009) were replaced with five standards in the new document (R5.4-R5.8). R5.6 appears most relevant in the context of this correspondence, stating 'Medical schools must set fair, reliable and valid assessments that allow them to decide whether medical students have achieved the learning outcomes required for graduates.' This standard suggests that, from January 2016, medical schools are required to demonstrate the fairness, reliability and validity of their assessments only to their own satisfaction. The GMC's role in enforcing this standard is therefore unclear.

Standard R5.8 of Promoting Excellence (2015) is also noteworthy, stating 'Assessments must be carried out by someone with appropriate expertise in the area being assessed, and who has been appropriately selected, supported and appraised. They are responsible for honestly and effectively assessing the medical student's performance and being able to justify their decision.' It is unclear whether medical schools are required by the GMC to employ assessment personnel who are experts in the science of assessment.

4. Legal Implications of QAA Assessment Standards after January 2016

Evidence suggests that medical schools are legally obligated to comply with the QAA's assessment standards, both before and after January 2016. S8.3 of the QAA's Benchmarking Statement for Medicine[1] states 'Assessment strategies and methods must ensure that the knowledge, understanding, skills and attitudes set out previously are sufficiently covered. Methods must be both valid and reliable. Appropriate procedures for standard setting should be employed. Clinical competence must be rigorously assessed so as to identify those who are not yet fit for practice.'

The QAA's guidance suggests medical schools are required to demonstrate reliability and validity against external criteria such as industry standards. In contrast, the GMC's guidance appears to suggest institutions need only demonstrate reliability and validity to their own satisfaction in deciding whether medical students have achieved the learning outcomes required for graduates. It is therefore plausible that medical schools may comply with the GMC's standards while simultaneously failing to meet the QAA's. Evidence suggests failure to meet the QAA's assessment standards may be unlawful.

[1] http://www.qaa.ac.uk/en/Publications/Documents/Subject-benchmark-statement-Medicine.pdf
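As a technical aside on the reliability requirement invoked in sections 3 and 4: 'reliable' is conventionally quantified with an internal-consistency coefficient such as Cronbach's alpha. The sketch below is illustrative only; neither the GMC nor the QAA prescribes this particular statistic, and the score matrix is invented.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a candidates-by-items score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented example: 5 candidates x 4 dichotomously scored items.
scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")   # ~0.79 for this toy data
```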

5. GMC Inspections Leading to Accreditation

Issues surrounding the GMC's inspection methodology leading to accreditation deserve attention. Two statements from the GMC's correspondence dated 2 April 2017 are highlighted below:

'In the case of Buckingham our visit reports have highlighted areas for improvement and focus. These will be kept under review in subsequent visits.'

'Our programme of quality assurance visits continues until the year of graduation of the first cohort of students. At that point we decide if a school will be added to the list of awarding bodies, therefore meeting the standards listed in Promoting Excellence. It is also then that a final decision will be taken on the overall programme of assessment and assessment processes and whether we consider them to meet our standards.'

These statements suggest the GMC may choose to withhold criticisms of a new medical school's assessment practices until the time of final accreditation. It is therefore unclear whether accreditation may be declined or delayed due to longstanding assessment inconsistencies that were not highlighted by the GMC during routine visits.

6. UBMS Assessment Inconsistencies

A consideration of the GMC's 2015 UBMS inspection report[2] prompts several questions surrounding the GMC's endorsement of this institution's assessment practices. The UBMS 2015 inspection visit was 'part of the GMC's remit to ensure medical schools are complying with the standards and outcomes as set out in Tomorrow's Doctors 2009' and the associated Assessment in Undergraduate Medical Education (2009). The GMC visited UBMS three times in 2015: 27-28 May, 1 December and 17 December. A relevant extract from the GMC's inspection report is quoted below:

'The visit carried out in May 2015 was the fourth visit to Buckingham Medical School (the School) since agreeing to commit resources to begin the process of a multi-year quality assurance review, after the University of Buckingham (the University) applied to set up a new medical school. In this visit we met the first cohort of students who started on the programme in January 2015 for the second time. In addition to this, we visited the School two more times in December 2015, to observe the first summative OSCEs and the consequent Assessment Board. We will continue the rolling cycle of annual quality assurance visits to the School following the first cohort of students through to graduation and their first year of practice.'

This paragraph appears to confirm that these inspections were intended to ensure compliance with the GMC's published requirements, including assessment standards.

[2] http://www.gmc-uk.org/Final_first_year_report_for_Buckingham_.pdf_66142597.pdf

The UBMS Assessment Code of Practice (1.1.4)[3] states that 'the medical school will use internationally recognised methods of standard setting to determine which students are graded satisfactorily for each assessment'. However, a comparison of this UBMS document and the GMC's standards applicable at that time revealed several inconsistencies. These inconsistencies were highlighted in previous correspondence to the GMC, and are summarised in Table 1 below for convenient reference.

Table 1: Summary of queried UBMS assessment practices

| Exam | UBMS Practice | GMC Requirement |
| --- | --- | --- |
| Term Written | Fixed overall pass mark | Standard setting methods should use absolute standards. Medical schools should not employ fixed pass marks. |
| Term OSCE | Station-level pass mark setting method | Medical schools must use evidence from research into best-practice. Systems to set appropriate standards should be in place. Standard setting methods should be justifiable and defensible. |
| Phase 1 Resits | Fixed overall pass mark | Standard setting methods should use absolute standards. Medical schools should not employ fixed pass marks. |
| Phase 1 Resits | Composite written & OSCE scores | Medical schools must use evidence from research into best-practice. Decisions about students should be evidence-based, defensible and consistent. |
| Phase 1 Resits | OSCE pass mark setting | Medical schools must use evidence from research into best-practice. Systems to set appropriate standards should be in place. Standard setting methods should be justifiable and defensible. |
| Phase 1 Resits | Cut-score compensation | Compensatory mechanisms should not allow progression without demonstrating competence in all the outcomes. |
| Phase 1 Resits | Consistency of standards | It is important to ensure that resits are of the same standard as the main examinations. |

While Table 1 suggests UBMS assessment practices were inconsistent with the GMC's requirements in force at the time, there is no mention of any such inconsistencies in the GMC's UBMS inspection report. Yet it is plausible that assessment practices which gained initial GMC approval were unlawful. It is also possible that UBMS's current assessment practices comply with the GMC's standards while simultaneously failing to meet the QAA's assessment requirements outlined in section 4 of this correspondence. UBMS assessment practices from January 2016 onward may therefore be unlawful.

[3] http://medvle.buckingham.ac.uk/
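As a technical aside on the 'station-level pass mark setting method' queried in Table 1: the Borderline Regression method discussed in ENCL 2 conventionally derives each station's cut score from a regression of checklist scores on examiners' global ratings, then combines the station cuts into an exam-level pass mark (the step 2 that UBMS appears to omit). The sketch below is a minimal illustration with invented data, not UBMS's actual algorithm.

```python
import numpy as np

def station_cut(checklist: np.ndarray, global_rating: np.ndarray,
                borderline_grade: float = 2.0) -> float:
    """Step 1: regress checklist scores on examiner global ratings and
    read off the predicted checklist score at the 'borderline' grade."""
    slope, intercept = np.polyfit(global_rating, checklist, deg=1)
    return slope * borderline_grade + intercept

# Invented data for one station: 6 candidates, checklist out of 20,
# global ratings 1 = clear fail .. 4 = clear pass (2 = borderline).
checklist = np.array([8.0, 11.0, 14.0, 16.0, 18.0, 19.0])
ratings   = np.array([1.0,  2.0,  2.0,  3.0,  4.0,  4.0])

station_cuts = [station_cut(checklist, ratings)]  # repeat step 1 per station

# Step 2: exam-level pass mark combines the station cuts (sum or mean).
exam_cut = sum(station_cuts)
print(f"station cut = {station_cuts[0]:.1f}/20, exam cut = {exam_cut:.1f}")
```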

Summary

Evidence suggests that medical schools whose assessments do not meet GMC and QAA standards may be failing to meet their legal obligations. Joint consideration of UBMS's assessment practices and the GMC's responses gives rise to the following questions:






- What is the relationship between GMC and QAA standards and medical schools' legal obligations surrounding assessments?
- Are medical schools whose assessment practices are inconsistent with the GMC's standards acting unlawfully?
- Are medical schools whose assessment practices are inconsistent with the QAA's assessment standards acting unlawfully?
- Does the GMC require medical school assessments to be carried out by someone with appropriate expertise in the science of assessment?
- Why might the GMC choose not to highlight, in the relevant inspection report, assessment practices which appear inconsistent with its published standards?
- Does the GMC reserve the right to decline or delay a new medical school's accreditation due to longstanding assessment inconsistencies which were not highlighted to the institution at the time?
- Were UBMS's planned assessment practices, which gained initial approval, consistent with the GMC's standards outlined in Tomorrow's Doctors (2009) and Assessment in Undergraduate Medical Education (2009)?
- Were UBMS's assessment practices from January to December 2015 consistent with those same standards?
- Are UBMS's assessment practices from January 2016 onward consistent with the GMC's standards?
- Should UBMS's past and present assessment decisions be regarded as fair to students?

A response to each question is required. If the GMC is unable to respond to any of these queries, please provide the contact details of the person or professional body that can.

Best wishes,

Adrian Anthony Husbands
Psychometrician
Bold Metrics

cc:
Professor John Clapham
Professor Stewart Petersen
Professor Jill Schofield
Sir Anthony Seldon
Dr Harry Witchell
Mr Frank Donlon
Mr Douglas Blackstock
Rt Hon Justine Greening MP
Rt Hon Jeremy Hunt MP

Enclosures

ENCL 1

4 April 2017

9 Fitzwilliam Street
Milton Keynes
MK3 6DF
[email protected]

Dear Mr Husbands

University of Buckingham Medical School Assessments

Thank you for your letter of 27 February. The University of Buckingham is subject to a programme of regular monitoring visits, consistent with our approach to supporting new medical schools as they develop. The visiting team draws on expertise across all the relevant domains, including assessment. Our reports are published in full on our website.

We note the points you raise in your letter and will consider them alongside any other data we hold. However, it might be useful to note:







• Our quality assurance process is underpinned by Promoting Excellence: standards for medical education and training (2016). This supersedes the standards in Tomorrow's Doctors (2009). The supplementary guidance Assessment in Undergraduate Medical Education, which forms the basis of the points you make throughout your letter, is additional guidance about how to meet our standards. It does not include new requirements or aim to impose uniformity in undergraduate approaches to assessment.

• Our programme of quality assurance visits continues until the year of graduation of the first cohort of students. At that point we decide if a school will be added to the list of awarding bodies, therefore meeting the standards listed in Promoting Excellence. It is also then that a final decision will be taken on the overall programme of assessment and assessment processes and whether we consider them to meet our standards.

• In the case of Buckingham our visit reports have highlighted areas for improvement and focus. These will be kept under review in subsequent visits.


Given your declared expertise in the area of assessment, I am sure you appreciate the challenge of standard setting and the variety of methods for determining the pass threshold. We continue to work in consultation with others to ensure we keep up to date with current thinking in this field. Any assessment decision must uphold patient safety as its overriding concern, but after that comes a measure of judgement (in the choice of method and the mathematical adjustment of the SEM) when setting the passing threshold, together with consideration of fairness to students.

I trust this reassures you in respect of the University of Buckingham Medical School.

Yours sincerely

Dr Colin Melville
Director of Education and Standards
General Medical Council
Hardman Street, Manchester, M3 3AW
e: [email protected]
t: 0161 923 6772
w: www.gmc-uk.org

cc:

Professor John Clapham
Professor Stewart Petersen
Professor Alistair Alcock
Sir Anthony Seldon
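An aside on the 'mathematical adjustment of SEM' mentioned in the letter above: this usually means shifting a pass mark by a multiple of the standard error of measurement, SEM = SD x sqrt(1 - reliability). The letter does not specify the GMC's preferred formula, so the sketch below, with invented numbers, is only one common reading.

```python
import math

def sem_adjusted_cut(cut: float, sd: float, reliability: float,
                     z: float = 1.0) -> float:
    """Shift a pass mark by z standard errors of measurement,
    where SEM = SD * sqrt(1 - reliability)."""
    sem = sd * math.sqrt(1.0 - reliability)
    return cut + z * sem

# Invented example: raw cut 60%, score SD 8%, reliability 0.84.
# SEM = 8 * sqrt(0.16) = 3.2, so a +1 SEM adjustment gives 63.2%.
print(f"adjusted cut = {sem_adjusted_cut(60.0, 8.0, 0.84):.1f}%")
```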

ENCL 2

9 Fitzwilliam Street
Milton Keynes
MK3 6DF
[email protected]


27 February 2017

Niall Dickson
Chief Executive and Registrar
General Medical Council
Regent's Place
350 Euston Rd
London NW1 3JN

Dear GMC representative,

RE: UNIVERSITY OF BUCKINGHAM MEDICAL SCHOOL (UBMS) ASSESSMENTS

A comparison of the UBMS Code of Practice for Assessment, the GMC's published assessment guidance, and internationally recognised testing standards reveals various inconsistencies. This correspondence explores these issues and concludes with questions regarding the GMC's initial approval and ongoing support of UBMS's assessment practices.

BACKGROUND

The GMC provides objective assessment requirements to medical schools to ensure compliance with regulatory standards. Details of an institution's assessment process must be GMC-approved before the first students are admitted. The GMC is also responsible for monitoring a medical school's assessment practices. The GMC's published assessment requirements are summarised as follows:


• Medical schools must use evidence from research into best-practice to decide how to plan and organise their assessments, and be able to demonstrate how their assessment arrangements meet relevant criteria.[1]
• Decisions about individual students should be evidence-based, defensible and consistent.[2]
• Assessments should be fit for purpose: valid, reliable, generalisable, feasible and fair.[1][2][3]
• Systems to set appropriate standards should be in place.[1]
• Standard setting methods should be justifiable and defensible.[2]
• Standard setting methods should use absolute standards. Medical schools should not employ fixed pass marks.[2] (An illustrative sketch of an absolute method follows this list.)
• Compensatory mechanisms should not allow progression without demonstrating competence in all the outcomes.[1][2]
• It is important to ensure that resits are of the same standard as the main examinations.[2]
• Medical schools must have mechanisms to ensure comparability of standards with other institutions.[1]
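To make the 'absolute standards' requirement above concrete: absolute (criterion-referenced) methods derive the cut score from judgements about a minimally competent candidate rather than fixing a percentage in advance. A minimal modified-Angoff sketch with invented judge data follows; the GMC documents do not mandate this particular method.

```python
import numpy as np

def angoff_cut(judgements: np.ndarray) -> float:
    """Modified Angoff: each judge estimates, per item, the probability
    that a minimally competent candidate answers it correctly; the cut
    score is the expected total score of that borderline candidate."""
    return judgements.mean(axis=0).sum()   # average over judges, sum over items

# Invented panel: 3 judges x 5 items.
judgements = np.array([
    [0.6, 0.7, 0.5, 0.8, 0.4],
    [0.5, 0.8, 0.6, 0.7, 0.5],
    [0.7, 0.6, 0.5, 0.9, 0.4],
])
print(f"cut = {angoff_cut(judgements):.2f} out of 5 items")  # ~3.07
```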

UBMS submitted assessment plans to the GMC before admitting its first students in January 2015. The GMC's 2014[4] and 2015[5] inspections of UBMS's operations revealed no assessment irregularities before or after the first students were enrolled. The following section compares the UBMS Code of Practice for Assessment[6] to the GMC's assessment requirements.

[1] GMC (2009). Tomorrow's Doctors.
[2] GMC (2009). Assessment in Undergraduate Medical Education.
[3] GMC (2015). Promoting Excellence: Standards for Medical Education and Training.
[4] http://www.gmc-uk.org/Buckingham_Medical_School_visit_report_v4.0.pdf_60201766.pdf
[5] http://www.gmc-uk.org/Final_first_year_report_for_Buckingham_.pdf_66142597.pdf
[6] http://medvle.buckingham.ac.uk/



QUERIED UBMS ASSESSMENT PRACTICES

1. Written Exam Standard Setting

UBMS written exam results are ultimately determined by fixed pass marks (CoP 4.1.2). For example, the passing threshold for the ETA3 written assessment is fixed at 16/24 question sets (66.7%). This practice appears to contradict the GMC's requirements that standard setting methods should use absolute standards and that fixed pass marks should not be employed.

2. OSCE Standard Setting

2.1. Station-level Pass Mark Setting

The UBMS CoP (4.2.4) states that OSCE standard setting will be undertaken 'either using Borderline Groups or Borderline Regression'. Published details of Borderline methodologies define a common two-step process: (1) station-level pass mark setting using the scores of borderline students, and (2) exam-level pass mark setting by averaging the station-level pass marks obtained in step one.[7][8] The UBMS practice excludes step 2 and is therefore incompatible with industry-wide definitions of Borderline methodologies. A literature review revealed no published evidence supporting UBMS's OSCE standard setting method. This lack of supporting evidence appears to contradict the GMC's guidance relating to evidence-based and defensible assessment practices.

2.2. Exam-level Pass Mark Setting

UBMS OSCE results are ultimately determined by fixed pass marks (CoP 4.2.5). For example, the minimum passing criterion for the Phase 1 ETA3 OSCE is 10/12 stations (83.3%). This practice appears to contradict the GMC's requirements that standard setting methods should use absolute standards and that fixed pass marks should not be employed.

3. Phase 1 Resit Exams

3.1. Composite Exam Scores

Knowledge and clinical skill items are aggregated to determine Phase 1 resit results (CoP 3.1.1-3.1.2). Industry-wide best practice requires that distinct assessments which are aggregated for decision-making should measure the same underlying skill, as a prerequisite for valid and defensible scores.[9][10] For example, a typical drivers' test requires prospective licence-holders to pass theory and practical (driving) components separately, to ensure mastery of both aspects in the interest of public safety. Similarly, medical students are usually required to demonstrate mastery of medical knowledge and clinical skills separately in the interest of patient safety. This principle appears to be respected in the assessment processes for the corresponding UBMS main exams.

[7] Cizek & Bunch (2007). Standard Setting: A Guide to Establishing and Evaluating Performance Standards on Tests.
[8] Wood et al. (2006). Standard setting in a small scale OSCE: A comparison of the modified borderline-group method and the borderline regression method. Advances in Health Sciences Education, 11(2).
[9] Bobko et al. (2007). The Usefulness of Unit Weights in Creating Composite Scores: A Literature Review, Application to Content Validity, and Meta-Analysis. Organizational Research Methods, 10(4).
[10] Kelly & Chang (2012). Estimation of and Confidence Interval Formation for Reliability Coefficients of Homogeneous Measurement Instruments. Methodology, 8(2).



A literature review revealed no supporting evidence for the score aggregation methodology used to determine UBMS Phase 1 resit exam results. This practice therefore appears to contradict GMC guidance relating to evidence-based and defensible assessments.

3.2. OSCE Station-level Pass Mark Setting

As explained in section 2.1 of this correspondence, a literature review revealed no evidence to support UBMS's OSCE standard setting methodology. This lack of supporting evidence appears to contradict GMC guidance relating to evidence-based and defensible assessment practices.

3.3. Exam-level Pass Mark Setting

Phase 1 resit exams comprise 24 written items and 12 OSCE items, with a fixed passing threshold of 75%, i.e. 27/36 of the total number of test items (CoP 4.1.3). This practice appears to contradict the GMC's requirements that standard setting methods should use absolute standards and that fixed pass marks should not be employed.

3.4. Score Compensation

UBMS's fixed pass mark for the combined written and OSCE resits potentially allows passing students to compensate for weaker clinical skills with stronger knowledge, and vice versa. This is illustrated in Figures A and B.

[Figures A and B omitted: paired bar charts of minimum passing scores (%) for the knowledge and clinical skills components in the main exam versus the resit exam.]

Figure A: Compensating lower clinical skills with higher knowledge in UBMS resits. Resitting students can progress by passing 24/24 written test items (100% of knowledge) and 3/12 OSCE items (25% of clinical skills).

Figure B: Compensating lower knowledge with higher clinical skills in UBMS resits. Resitting students can progress by passing 15/24 written test items (62.5% of knowledge) and 12/12 OSCE items (100% of clinical skills).

This appears to contradict the GMC's requirement that compensatory mechanisms should not allow progression without demonstrating competence in all the outcomes.

3.5. Consistency of Standards

Figures A and B also illustrate the inconsistent application of pass marks between the resit and main exams. A resitting UBMS Phase 1 student appears able to progress by passing fewer knowledge or clinical skill items than in the equivalent main exam. These inconsistencies in pass mark differentials and in component versus aggregate scoring appear to contradict the GMC's requirement that resit and main exams be of the same standard.
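The compensation described in section 3.4 can be checked arithmetically: with a single 27/36 threshold over pooled items, any written/OSCE split summing to at least 27 passes. A short verification of the extremes shown in Figures A and B, using the item counts quoted in section 3.3:

```python
# Item counts and threshold as quoted in section 3.3 (CoP 4.1.3).
WRITTEN_ITEMS, OSCE_ITEMS, THRESHOLD = 24, 12, 27

passing_splits = [(w, o)
                  for w in range(WRITTEN_ITEMS + 1)
                  for o in range(OSCE_ITEMS + 1)
                  if w + o >= THRESHOLD]

# The extremes reproduce Figures A and B:
print(min(passing_splits, key=lambda s: s[1]))  # (24, 3): pass with 25% of OSCE stations
print(min(passing_splits, key=lambda s: s[0]))  # (15, 12): pass with 62.5% of written items
```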



SUMMARY

The UBMS Assessment CoP (1.1.4) states that 'the medical school will use internationally recognised methods of standard setting to determine which students are graded satisfactorily for each assessment'. In contrast, many UBMS assessment practices appear inconsistent with the GMC's published requirements and global assessment standards, as summarised in Table 1 below.

Table 1: Summary of queried UBMS assessment practices

| Exam | UBMS Practice | GMC Requirement |
| --- | --- | --- |
| Term Written | Fixed overall pass mark | Standard setting methods should use absolute standards. Medical schools should not employ fixed pass marks. |
| Term OSCE | Station-level pass mark setting method | Medical schools must use evidence from research into best-practice. Systems to set appropriate standards should be in place. Standard setting methods should be justifiable and defensible. |
| Phase 1 Resits | Fixed overall pass mark | Standard setting methods should use absolute standards. Medical schools should not employ fixed pass marks. |
| Phase 1 Resits | Composite written & OSCE scores | Medical schools must use evidence from research into best-practice. Decisions about students should be evidence-based, defensible and consistent. |
| Phase 1 Resits | OSCE pass mark setting | Medical schools must use evidence from research into best-practice. Systems to set appropriate standards should be in place. Standard setting methods should be justifiable and defensible. |
| Phase 1 Resits | Cut-score compensation | Compensatory mechanisms should not allow progression without demonstrating competence in all the outcomes. |
| Phase 1 Resits | Consistency of standards | It is important to ensure that resits are of the same standard as the main examinations. |

Conclusions and inferences based on test scores can be called into question in the absence of defensible, evidence-based methods. The GMC is therefore asked to expand on the basis on which UBMS assessment practices gained initial GMC approval before the programme commenced, and on whether those practices have continued to meet the GMC's requirements from 2015 to the present day. The GMC is also asked to state whether UBMS's past and present assessment decisions should be regarded as fair to students. I look forward to your response.

Sincerely,

Adrian Anthony Husbands
Psychometrician
Bold Metrics

cc:
Professor John Clapham
Professor Stewart Petersen
Professor Alistair Alcock
Sir Anthony Seldon
