Validation of a detailed scoring checklist for use during advanced cardiac life support certification

Simul Healthc. 2012 Aug;7(4):222-35. doi: 10.1097/SIH.0b013e3182590b07.

Abstract

Introduction: Defining valid, reliable, defensible, and generalizable standards for the evaluation of learner performance is a key issue in assessing both baseline competence and mastery in medical education. However, before setting these standards of performance, the reliability of the scores yielded by a grading tool must be assessed. Accordingly, the purpose of this study was to assess the reliability of scores generated from a set of grading checklists used by nonexpert raters during simulations of American Heart Association (AHA) Megacodes.

Methods: The reliability of scores generated from a detailed set of checklists, when used by 4 nonexpert raters, was tested by grading team leader performance in 8 Megacode scenarios. Videos of the scenarios were reviewed and rated by trained faculty facilitators and a group of nonexpert raters, both "continuously" and "with pauses." Grades assigned by 2 content experts served as the reference standard, and 4 nonexpert raters were used to test the reliability of the checklists.

Results: Our results demonstrate that nonexpert raters are able to produce reliable grades when using the checklists under consideration, with excellent intrarater reliability and agreement with the reference standard. The results also show that nonexpert raters can be trained in the proper use of the checklist in a short amount of time, with no discernible learning curve thereafter. Finally, a single trained rater can achieve reliable scoring of team leader performance during AHA Megacodes when using our checklist in a continuous mode, because measures of agreement in total scoring were very strong [Lin's (Biometrics 1989;45:255-268) concordance correlation coefficient, 0.96; intraclass correlation coefficient, 0.97].
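
Of the agreement statistics reported above, Lin's concordance correlation coefficient can be computed directly from paired total scores. The sketch below (Python with NumPy, an assumed choice since the abstract does not name the software used) illustrates the standard calculation; the example scores are hypothetical and are not data from the study.

    import numpy as np

    def lins_ccc(x, y):
        """Lin's concordance correlation coefficient for two sets of paired scores.

        Captures both precision (correlation) and accuracy (closeness to the
        45-degree line), so systematic offsets between raters lower the value.
        """
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        mean_x, mean_y = x.mean(), y.mean()
        var_x, var_y = x.var(), y.var()  # population (1/n) variances, as in Lin (1989)
        cov_xy = np.mean((x - mean_x) * (y - mean_y))
        return 2 * cov_xy / (var_x + var_y + (mean_x - mean_y) ** 2)

    # Hypothetical totals for 8 scenarios: expert reference standard vs. one trained nonexpert rater.
    expert_scores = [88, 92, 75, 81, 95, 70, 84, 90]
    nonexpert_scores = [86, 93, 74, 83, 94, 72, 85, 89]
    print(f"Lin's CCC: {lins_ccc(expert_scores, nonexpert_scores):.3f}")

Unlike Pearson's correlation, this coefficient penalizes a rater who is consistently higher or lower than the reference standard, which makes it a stricter measure of rater agreement.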

Conclusions: We have shown that our checklists can yield reliable scores, are appropriate for use by nonexpert raters, and can be used for continuous assessment of team leader performance during the review of a simulated Megacode. This checklist may be more appropriate for use by advanced cardiac life support instructors during Megacode assessments than the current tools provided by the AHA.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, U.S. Gov't, Non-P.H.S.
  • Validation Study

MeSH terms

  • Advanced Cardiac Life Support / standards*
  • Certification*
  • Checklist*
  • Clinical Competence / standards*
  • Humans
  • Patient Simulation*
  • Reproducibility of Results
  • Software
  • Task Performance and Analysis
  • Video Recording*