Developing, evaluating and validating a scoring rubric for written case reports.

Document Type

Article

Publication Date

2-1-2014

Institution/Department

CORE; Family Medicine

Journal Title

International Journal of Medical Education

MeSH Headings

Clinical Clerkship, Educational Measurement, Family Practice, Humans, Observer Variation, Reproducibility of Results, Students, Medical, Writing

Abstract

OBJECTIVE: The purpose of this study was to evaluate Family Medicine Clerkship students' writing skills using an anchored scoring rubric. In this study, we report on the assessment of a current scoring rubric (SR) used to grade written case description papers (CDP) for medical students, describe the development of a revised SR with examination of scoring consistency among faculty raters, and report on feedback from students regarding SR revisions and written CDP.

METHODS: Five faculty members scored a total of eighty-three written CDP using both the Original SR (OSR) and the Revised SR1 (RSR1) during the 2009-2010 academic year.

RESULTS: Overall, faculty inter-rater reliability increased when the RSR1 was used. Additionally, a subset analysis revealed that the five faculty using the Revised SR2 (RSR2) had a high measure of inter-rater reliability in their scoring of this subset of papers, as measured by the intra-class correlation coefficient (ICC = 0.93, p = 0.001).

CONCLUSIONS: Findings from this research have implications for medical education, by highlighting the importance of the assessment and development of reliable evaluation tools for medical student writing projects.

ISSN

2042-6372

First Page

18

Last Page

23
