
Self-monitoring and its relationship to medical knowledge

Published in Advances in Health Sciences Education

Abstract

In the domain of self-assessment, researchers have begun to draw distinctions between summative self-assessment activities (i.e., making an overall judgment of one’s ability in a particular domain) and self-monitoring processes (i.e., an “in the moment” awareness of whether one has the necessary knowledge or skills to address a specific problem with which one is faced). Indeed, previous research has shown that, when responding to both short-answer and multiple-choice questions, individuals are able to assess the likelihood of answering questions correctly on a moment-by-moment basis, even though they are not able to generate an accurate self-assessment of overall performance on the test. These studies, however, were conducted in the context of low-stakes tests of general “trivia”. The purpose of the present study was to further this line of research by investigating the relationship between self-monitoring and performance in the context of a high-stakes test assessing medical knowledge. Using a recent administration of the Medical Council of Canada Qualifying Examination Part I, we examined three measures intended to capture self-monitoring: (1) the time taken to respond to each question, (2) the number of questions a candidate flagged as needing further consideration, and (3) the likelihood of changing one’s initial answer. Differences in these measures as a function of the accuracy of the candidate’s response were treated as indices of each candidate’s ability to judge his or her likelihood of responding correctly. The three self-monitoring indices were compared for candidates at three different levels of overall performance on the exam. Relative to correct responses, when examinees initially responded incorrectly, they spent more time answering the question, were more likely to flag the question for future consideration, and were more likely to change their answer before committing to a final response. These measures of self-monitoring were modulated by candidate performance, in that high-performing examinees showed greater differences on these indices than did poor-performing examinees. Furthermore, reliability analyses suggest that these difference measures hold promise for reliably differentiating self-monitoring at the level of individuals, at least within a given content area. The results suggest that examinees were self-monitoring their knowledge and skills on a question-by-question basis and altering their behavior appropriately in the moment. High-performing individuals showed stronger evidence of accurate self-monitoring than did low-performing individuals, and the reliability of these measures suggests that they have the potential to differentiate between individuals. How these findings relate to performance in actual clinical settings remains to be seen.
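To make the analysis described above concrete, the sketch below (ours, not the authors’ code) shows one way the three incorrect-minus-correct difference indices could be computed from item-level response data and compared across performance groups. The column names (candidate_id, correct, response_time_s, flagged, changed_answer), the 0/1 coding, and the tertile split are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch, assuming item-level data with one row per candidate-item pair:
#   candidate_id, correct (0/1 for the initial response), response_time_s,
#   flagged (0/1), changed_answer (0/1).
import pandas as pd

def self_monitoring_indices(responses: pd.DataFrame) -> pd.DataFrame:
    """Per-candidate difference indices: mean of each measure on initially
    incorrect items minus its mean on initially correct items."""
    metrics = ["response_time_s", "flagged", "changed_answer"]

    # Mean of each measure per candidate, split by accuracy of the initial response.
    by_accuracy = (
        responses.groupby(["candidate_id", "correct"])[metrics]
        .mean()
        .unstack("correct")  # columns become (measure, 0) and (measure, 1)
    )

    # Incorrect-minus-correct difference; larger values indicate behaviour that
    # shifts more on items the candidate is likely to get wrong.
    diffs = pd.DataFrame(
        {m: by_accuracy[(m, 0)] - by_accuracy[(m, 1)] for m in metrics}
    )

    # Overall performance (proportion of items initially correct), split into
    # tertiles to mirror the low/middle/high comparison described above.
    performance = responses.groupby("candidate_id")["correct"].mean()
    diffs["performance_group"] = pd.qcut(
        performance, 3, labels=["low", "middle", "high"]
    )
    return diffs

# Usage: mean difference index per performance group, e.g.
#   self_monitoring_indices(item_level_df).groupby("performance_group").mean()
```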



Author information


Corresponding author

Correspondence to Meghan M. McConnell.


Cite this article

McConnell, M.M., Regehr, G., Wood, T.J. et al. Self-monitoring and its relationship to medical knowledge. Adv in Health Sci Educ 17, 311–323 (2012). https://doi.org/10.1007/s10459-011-9305-4
