
Electronic data

  • medical_education_manuscript_accepted_version_18oct16

    Rights statement: This is the peer reviewed version of the following article: Taylor, C. A., Gurnell, M., Melville, C. R., Kluth, D. C., Johnson, N. and Wass, V. (2017), Variation in passing standards for graduation-level knowledge items at UK medical schools. Med Educ, 51: 612–620. doi:10.1111/medu.13240, which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1111/medu.13240/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for self-archiving.

    Accepted author manuscript, 253 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: 10.1111/medu.13240


Variation in passing standards for graduation-level knowledge items at UK Medical Schools

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Variation in passing standards for graduation-level knowledge items at UK Medical Schools. / Taylor, Celia; Gurnell, Mark; Melville, Colin Randolph et al.
In: Medical Education, Vol. 51, No. 6, 06.2017, p. 612-620.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Taylor, C, Gurnell, M, Melville, CR, Kluth, D, Johnson, N & Wass, V 2017, 'Variation in passing standards for graduation-level knowledge items at UK Medical Schools', Medical Education, vol. 51, no. 6, pp. 612-620. https://doi.org/10.1111/medu.13240

APA

Taylor, C., Gurnell, M., Melville, C. R., Kluth, D., Johnson, N., & Wass, V. (2017). Variation in passing standards for graduation-level knowledge items at UK Medical Schools. Medical Education, 51(6), 612-620. https://doi.org/10.1111/medu.13240

Vancouver

Taylor C, Gurnell M, Melville CR, Kluth D, Johnson N, Wass V. Variation in passing standards for graduation-level knowledge items at UK Medical Schools. Medical Education. 2017 Jun;51(6):612-620. Epub 2017 Mar 13. doi: 10.1111/medu.13240

Author

Taylor, Celia ; Gurnell, Mark ; Melville, Colin Randolph et al. / Variation in passing standards for graduation-level knowledge items at UK Medical Schools. In: Medical Education. 2017 ; Vol. 51, No. 6. pp. 612-620.

Bibtex

@article{fd1f4e110d844a78bf14ef91e29d37bc,
title = "Variation in passing standards for graduation-level knowledge items at UK Medical Schools",
abstract = "Objectives Given the absence of a common passing standard for students at UK medical schools, this paper compares independently set standards for common {\textquoteleft}one from five{\textquoteright} single-best-answer (multiple-choice) items used in graduation-level applied knowledge examinations and explores potential reasons for any differences. Methods A repeated cross-sectional study was conducted. Participating schools were sent a common set of graduation-level items (55 in 2013–2014; 60 in 2014–2015). Items were selected against a blueprint and subjected to a quality review process. Each school employed its own standard-setting process for the common items. The primary outcome was the passing standard for the common items by each medical school set using the Angoff or Ebel methods. Results Of 31 invited medical schools, 22 participated in 2013–2014 (71%) and 30 (97%) in 2014–2015. Schools used a mean of 49 and 53 common items in 2013–2014 and 2014–2015, respectively, representing around one-third of the items in the examinations in which they were embedded. Data from 19 (61%) and 26 (84%) schools, respectively, met the inclusion criteria for comparison of standards. There were statistically significant differences in the passing standards set by schools in both years (effect sizes (f2): 0.041 in 2013–2014 and 0.218 in 2014–2015; both p < 0.001). The interquartile range of standards was 5.7 percentage points in 2013–2014 and 6.5 percentage points in 2014–2015. There was a positive correlation between the relative standards set by schools in the 2 years (Pearson's r = 0.57, n = 18, p = 0.014). Time allowed per item, method of standard setting and timing of examination in the curriculum did not have a statistically significant impact on standards. Conclusions Independently set standards for common single-best-answer items used in graduation-level examinations vary across UK medical schools. Further work to examine standard-setting processes in more detail is needed to help explain this variability and develop methods to reduce it.",
author = "Celia Taylor and Mark Gurnell and Melville, {Colin Randolph} and David Kluth and Neil Johnson and Val Wass",
note = "This is the peer reviewed version of the following article: Taylor, C. A., Gurnell, M., Melville, C. R., Kluth, D. C., Johnson, N. and Wass, V. (2017), Variation in passing standards for graduation-level knowledge items at UK medical schools. Med Educ, 51: 612–620. doi:10.1111/medu.13240, which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1111/medu.13240/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for self-archiving.",
year = "2017",
month = jun,
doi = "10.1111/medu.13240",
language = "English",
volume = "51",
pages = "612--620",
journal = "Medical Education",
issn = "0308-0110",
publisher = "Wiley-Blackwell",
number = "6",
}

RIS

TY - JOUR

T1 - Variation in passing standards for graduation-level knowledge items at UK Medical Schools

AU - Taylor, Celia

AU - Gurnell, Mark

AU - Melville, Colin Randolph

AU - Kluth, David

AU - Johnson, Neil

AU - Wass, Val

N1 - This is the peer reviewed version of the following article: Taylor, C. A., Gurnell, M., Melville, C. R., Kluth, D. C., Johnson, N. and Wass, V. (2017), Variation in passing standards for graduation-level knowledge items at UK medical schools. Med Educ, 51: 612–620. doi:10.1111/medu.13240, which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1111/medu.13240/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for self-archiving.

PY - 2017/6

Y1 - 2017/6

N2 - Objectives Given the absence of a common passing standard for students at UK medical schools, this paper compares independently set standards for common ‘one from five’ single-best-answer (multiple-choice) items used in graduation-level applied knowledge examinations and explores potential reasons for any differences. Methods A repeated cross-sectional study was conducted. Participating schools were sent a common set of graduation-level items (55 in 2013–2014; 60 in 2014–2015). Items were selected against a blueprint and subjected to a quality review process. Each school employed its own standard-setting process for the common items. The primary outcome was the passing standard for the common items by each medical school set using the Angoff or Ebel methods. Results Of 31 invited medical schools, 22 participated in 2013–2014 (71%) and 30 (97%) in 2014–2015. Schools used a mean of 49 and 53 common items in 2013–2014 and 2014–2015, respectively, representing around one-third of the items in the examinations in which they were embedded. Data from 19 (61%) and 26 (84%) schools, respectively, met the inclusion criteria for comparison of standards. There were statistically significant differences in the passing standards set by schools in both years (effect sizes (f2): 0.041 in 2013–2014 and 0.218 in 2014–2015; both p < 0.001). The interquartile range of standards was 5.7 percentage points in 2013–2014 and 6.5 percentage points in 2014–2015. There was a positive correlation between the relative standards set by schools in the 2 years (Pearson's r = 0.57, n = 18, p = 0.014). Time allowed per item, method of standard setting and timing of examination in the curriculum did not have a statistically significant impact on standards. Conclusions Independently set standards for common single-best-answer items used in graduation-level examinations vary across UK medical schools. Further work to examine standard-setting processes in more detail is needed to help explain this variability and develop methods to reduce it.

AB - Objectives Given the absence of a common passing standard for students at UK medical schools, this paper compares independently set standards for common ‘one from five’ single-best-answer (multiple-choice) items used in graduation-level applied knowledge examinations and explores potential reasons for any differences. Methods A repeated cross-sectional study was conducted. Participating schools were sent a common set of graduation-level items (55 in 2013–2014; 60 in 2014–2015). Items were selected against a blueprint and subjected to a quality review process. Each school employed its own standard-setting process for the common items. The primary outcome was the passing standard for the common items by each medical school set using the Angoff or Ebel methods. Results Of 31 invited medical schools, 22 participated in 2013–2014 (71%) and 30 (97%) in 2014–2015. Schools used a mean of 49 and 53 common items in 2013–2014 and 2014–2015, respectively, representing around one-third of the items in the examinations in which they were embedded. Data from 19 (61%) and 26 (84%) schools, respectively, met the inclusion criteria for comparison of standards. There were statistically significant differences in the passing standards set by schools in both years (effect sizes (f2): 0.041 in 2013–2014 and 0.218 in 2014–2015; both p < 0.001). The interquartile range of standards was 5.7 percentage points in 2013–2014 and 6.5 percentage points in 2014–2015. There was a positive correlation between the relative standards set by schools in the 2 years (Pearson's r = 0.57, n = 18, p = 0.014). Time allowed per item, method of standard setting and timing of examination in the curriculum did not have a statistically significant impact on standards. Conclusions Independently set standards for common single-best-answer items used in graduation-level examinations vary across UK medical schools. Further work to examine standard-setting processes in more detail is needed to help explain this variability and develop methods to reduce it.

U2 - 10.1111/medu.13240

DO - 10.1111/medu.13240

M3 - Journal article

VL - 51

SP - 612

EP - 620

JO - Medical Education

JF - Medical Education

SN - 0308-0110

IS - 6

ER -
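
The abstract reports that each school set its passing standard for the common items using the Angoff or Ebel method. As a rough illustration only, not drawn from the paper's data or procedures, the Python sketch below shows how an Angoff cut score is commonly derived: each judge estimates the probability that a borderline (minimally competent) graduate would answer each item correctly, and the passing standard is the mean of those estimates across items and judges, expressed as a percentage. All ratings and panel sizes here are invented for illustration.

from statistics import mean

# Invented ratings: rows are judges, columns are items; each value is the
# estimated probability that a borderline graduate answers the item correctly.
angoff_ratings = [
    [0.60, 0.75, 0.55, 0.80],  # judge 1
    [0.65, 0.70, 0.50, 0.85],  # judge 2
    [0.55, 0.80, 0.60, 0.75],  # judge 3
]

# Average across judges for each item, then across items.
item_means = [mean(judge[i] for judge in angoff_ratings)
              for i in range(len(angoff_ratings[0]))]
passing_standard = mean(item_means) * 100  # cut score as a percentage mark

print(f"Illustrative Angoff passing standard: {passing_standard:.1f}%")

Under this kind of calculation, differences of a few percentage points between schools' cut scores, such as the interquartile ranges of 5.7 and 6.5 percentage points reported in the abstract, arise directly from differences in judges' item-level estimates.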