Critical analysis of a computer-assisted tutorial on ECG interpretation and its ability to determine competency

Med Teach. 2008;30(2):e41-8. doi: 10.1080/01421590801972471.

Abstract

Background: We developed a computer-based tutorial and a posttest on ECG interpretation for training residents and determining competency.

Methods: Forty residents, six cardiology fellows, and four experienced physicians participated. The tutorial emphasized recognition and understanding of abnormal ECG features. Active learning was promoted by posing questions before each ECG was discussed, and interactivity was supported by providing rapid, in-depth rationales for the correct answers. Responses to questions were recorded and analyzed in depth to determine the quality of the questions, baseline knowledge at different levels of training, and improvement in grades on the posttest. Posttest grades were used to assess improvement and to determine competency.

Results: The questions were found to be challenging, fair, appropriate, and discriminative. This was important because the quality of Socratic questions is critical to the success of interactive programs. The information on strengths and weaknesses in baseline knowledge at different levels of training was used to adapt our training program to the needs of residents. The posttest revealed that the tutorial contributed to marked improvement in feature recognition. Competency testing distinguished residents with outstanding grades from those who needed remediation.
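The abstract does not specify how question quality was quantified, but a standard psychometric approach to judging whether test items are "discriminative" is to compute an item difficulty index (proportion correct) and an upper-lower discrimination index for each question. The sketch below is illustrative only and is not the authors' actual analysis; the function names and the 27% group fraction (a common convention) are assumptions.

```python
from typing import List

def item_difficulty(item_correct: List[bool]) -> float:
    """Item difficulty (p-value): proportion of examinees who
    answered this item correctly."""
    return sum(item_correct) / len(item_correct)

def item_discrimination(item_correct: List[bool],
                        total_scores: List[float],
                        frac: float = 0.27) -> float:
    """Upper-lower discrimination index: proportion correct in the
    top `frac` of examinees (by total score) minus the proportion
    correct in the bottom `frac`. Values near +1 indicate an item
    that separates strong from weak examinees; values near 0 (or
    negative) flag a poorly discriminating item."""
    n = len(total_scores)
    k = max(1, int(n * frac))  # size of each comparison group
    order = sorted(range(n), key=lambda i: total_scores[i])
    lower, upper = order[:k], order[-k:]
    p_upper = sum(item_correct[i] for i in upper) / k
    p_lower = sum(item_correct[i] for i in lower) / k
    return p_upper - p_lower

# Hypothetical example: 10 examinees with total scores 0..9, where
# only the top-scoring half answered this item correctly.
scores = [float(s) for s in range(10)]
correct = [s >= 5 for s in scores]
print(item_difficulty(correct))              # 0.5
print(item_discrimination(correct, scores))  # 1.0
```

An item answered correctly only by high scorers, as in the example, yields a discrimination index of 1.0; an item everyone gets right (or wrong) yields 0.0 and adds no discriminating power to a competency test.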

Conclusions: The strategy for critical evaluation of our computer program could be applied to any computer-based educational program, regardless of topic.

Publication types

  • Evaluation Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Clinical Competence*
  • Computer-Assisted Instruction / methods*
  • Electrocardiography*
  • Humans
  • Internship and Residency
  • Physicians
  • Surveys and Questionnaires
  • United States