Abstract
Objectives Teaching clinical skills is an important component of educational programmes for medical undergraduates. However, as the interval between course completion and the qualification examination lengthens, students' performance on the skills examination declines. This study established a multisource evaluation system to determine whether formative assessment can enhance the instruction of clinical skills.
Methods Formative assessment was introduced to the entire training course on clinical skills, in which diversified methods were used to observe the performance of students during training. Students in the experimental group received training for clinical skills using formative assessment (class of 2019, n=128), while students in the control group received traditional training without formative assessment (class of 2018, n=123). Both groups participated in the Objective Structured Clinical Examination (OSCE) conducted by Tongji Medical College, and the exam scores were taken as the objective measure of course outcome. After completing the course, all students in the experimental group were instructed to fill in a questionnaire to evaluate their experience in the training programme, as a subjective measure of course outcome.
Results Compared with the control group, students in the experimental group achieved significantly higher practical scores on the four clinical skills tested by the OSCE. The questionnaire results revealed that the majority of students trained using formative assessment methods considered the course helpful for learning, and appreciated both the clinical skills they had gained and the opportunity to give feedback to and receive feedback from the instructors.
Conclusions The findings of this study suggest that formative assessment methods are beneficial for learning clinical skills through simulated teaching, as shown by the improved objective clinical skills evaluated by the structured clinical examination, and the self-reported satisfaction with the learning process.
- education & training (see medical education & training)
- medical education & training
- general medicine (see internal medicine)
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
STRENGTHS AND LIMITATIONS OF THIS STUDY
The study revealed that formative assessment methods are beneficial for learning clinical skills through simulated teaching.
Both structured clinical examination results and self-reported satisfaction with the learning process were evaluated.
The findings point to a more appropriate teaching process for clinical skills.
The sample size might not be large enough to exclude other factors that could have affected the results.
The study compared two consecutive groups, which poses a limitation in the study design.
Introduction
Teaching clinical skills is an important component of educational programmes for medical undergraduates. In China, clinical skills are taught mainly in the form of lectures, with limited effect.1 In order to enhance the instruction of clinical skills, most medical schools in China have attempted to apply innovative teaching methods, such as problem-based learning2 3 and simulated teaching.4 These methods were considered to improve students' performance in examinations and their clinical competence.5–7 However, regardless of the teaching mode, students' performance on skills examinations was observed to suffer when the time gap between the course and the examination was extended.
Research has suggested that assessments may be more important for clinical skills learning than any specific instructional format.8 From a practical standpoint, assessment is critical for learners to develop judgements for learning beyond the task at hand, which helps prepare medical students for future clinical practice.9 Indeed, the effectiveness of teaching is greatly enhanced by formative assessments, namely, assessments that evaluate a student's learning during the teaching process. Formative assessment, which is a developmental assessment based on continuous observation and the detailed recording of a student's learning process, takes place when information on a student's achievement is elicited and used by instructors to make better decisions on the next steps of instruction.10 This is a comprehensive evaluation of a student's performance and achievements in daily learning, and of the reflected emotions, attitudes and learning strategies.11 12 Formative assessment is an important educational activity because it gives learners feedback while their learning is still taking place.13 Wood14 argued that ongoing assessments should be integral to the entire educational enterprise, rather than delayed to the end of a course. Formative assessment contributes to a student's learning, both by influencing the learning process (how students learn) and by affecting the learning outcome (what they learn).15
Central to formative assessment is feedback, which was defined by Black and Wiliam16 as ‘any information provided to the performer of any action about that performance’. Hattie and Timperley conceptualised feedback as a means to reduce the discrepancy between the desired learning goal and the present status, and identified three questions addressed by effective feedback, ‘Where am I going?’, ‘How am I going?’ and ‘Where to next?’, which can be executed at four levels of operation, namely, task, processing, self-regulation and the self as a person.17 This model of feedback implicates both the teacher (the giver of feedback) and student (the receiver of feedback) as integral participants in the feedback process, and postulates that the assessment should provide information to both parties to allow the three feedback questions to be effective in facilitating learning. In line with this model, the outstanding feature of formative assessment is the mutual feedback between the evaluator and evaluatee,18 through which problems or defects in teaching are discovered, allowing both the instructors and students to adjust their performance prior to the subsequent teaching activity.
However, to our knowledge, information is limited on how to best implement formative assessment when teaching clinical skills. Building on studies of sustainable assessment, Boud and Molloy19 proposed that feedback should be repositioned as a central theme of curriculum design, rather than simply as a mechanistic measure in teaching. Rather than passively receiving information, students should assume key roles in driving their own learning by generating, seeking and using feedback. They further proposed elements for a feedback-centric curriculum, including activities that encourage learners’ work production, self-regulation, feedback solicitation and feedback generation in incremental tasks. To this effect, a curriculum that used diverse, continuous feedback-based components to teach clinical skills was designed.
This study aimed to determine whether formative assessment can enhance the teaching of clinical skills. Formative assessment was introduced into a simulated course on clinical skills by establishing a multisource evaluation system. The effects of the formative assessment on students’ learning and instructors’ success in teaching were evaluated, in terms of objective performance in the structured clinical skills examination and subjective perception of the course, as reported in the end-of-course questionnaire. It was hypothesised that students who received the formative assessment in the clinical skill course would be able to perform better in the structured examination, when compared with students who received traditional training without the formative assessment. In addition, it was hypothesised that these students would favourably view the course, in terms of personal satisfaction with their learning experience.
Materials and methods
Patient and public involvement
The patients and members of the public were not involved in the design of this study.
Educational context
A curriculum of clinical skill simulation was offered to fourth-year undergraduates in the 5-year clinical medical programme at Tongji Medical College of Huazhong University of Science and Technology (online supplemental table 1). These undergraduates had completed the study of basic medicine and were about to begin clinical practice. None of these students had received prior training in basic clinical skills. The original purpose of the course was to familiarise students with important basic clinical skills before they entered clinical practice. The curriculum objectives were based on those for training clinical medical students, and on the requirements for the qualification examination of medical practitioners. At the end of the course, the students were required to complete a centralised online examination to demonstrate their theoretical mastery of clinical skills. Approximately 4 months after completing the course, all students were scheduled to undergo the Objective Structured Clinical Examination (OSCE) held by Tongji Medical College. The OSCE stations were designed by experts at Tongji Medical College. Some instructors teach the course regularly, and new instructors receive standardised teaching training before the beginning of each semester.
Course delivery
Before the restructuring, a simulation course for clinical skills was offered in the fall semester (starting in September) every year, and had been successfully run at Union Hospital for 4 years (2014–2018). However, the effectiveness of the course had become unsatisfactory due to the increase in student enrolment and the reduction in student-instructor in-person contact hours. Therefore, in order to improve students' grasp of clinical skills, the course was reviewed and restructured. In September 2018, formative assessment was introduced to the course to determine whether this could improve the efficiency of clinical skills learning. The differences between the traditional and present course are listed in table 1.
Table 1 Summary of the traditional and present curriculum
The new curriculum capitalised on feedback at all stages of learning, in order to allow students to use timely feedback to attain knowledge and improve skills, and foster active feedback generation and solicitation in the process. Diversified methods were used to assess the performance of students during the training, which included online and in-class quizzes, video feedback, report writing and centralised online examinations (online supplemental table 2). Feedback to and from the students was provided during the ongoing course activities, with the intention of increasing the learning quality of students and improving the teaching ability of instructors.
Participants
A total of 128 students were enrolled in the clinical skills simulation course with formative assessment methods in 2019. All students agreed to participate in the study. The course was compulsory for degree requirements, and the teaching plan was explained to the students before the start of the course. There was no coercion in recruitment. Demographics, including gender and age, were collected; there was no significant difference between students in the experimental group and the control group (online supplemental table 3). An attendance register was kept to record student participation, both online and in-class. At the end of the course, the instructor confirmed whether the tasks for the lesson had been satisfactorily completed. In order to exclude any bias due to differences in academic ability, the examination results for basic medicine before entering clinical practice were compared.
Preparation before the in-person course
Online platform for the clinical skills course
A website was constructed for the clinical skills course to assist the students' self-study. The website was designed as a teaching resource database for the clinical skills course, which included courseware and videos, platforms for interactive operation, and online practice and examinations. In addition, an interactive video feedback platform enabled students to submit video assignments after class and obtain feedback from the instructor.
Self-study
The self-study section enabled students to cultivate self-regulation, and to develop skills for seeking and using feedback through the interactive discussion platform. The students were required to study independently through the website for 2 weeks before attending the centralised course session, which included learning the courseware, watching videos, completing the self-practice and discussing with the instructors. During the self-study, the students could post their questions on the interactive platform at any time, and the instructors would answer promptly. After completing the online theoretical learning, the students were instructed to practise their clinical skills via the online interactive simulation system on the website. At the end of each course, an online quiz (approximately 10 questions/course) was conducted to preliminarily evaluate the students' self-learning.
In-person course
Centralised instructions for clinical skills
In-class lectures were removed because the students had completed the self-study. Centralised instruction was conducted, as outlined in online supplemental table 4.
Group practice
Group practice offered opportunities for students to engage in hands-on practice, give and receive feedback related to the clinical training, and exercise self-regulation. One week after the centralised session, the students were divided into several groups (12–16 students per group) for the procedural training of the skills taught in the centralised instruction. Each group was trained separately. The simulated training lasted for 120 min, during which the students were required to practise their clinical skills on models. Meanwhile, two students were randomly selected to film the demonstration of one clinical skill activity. After the training, the students participated in 15 min of centralised video feedback. All students discussed the videos taken in class, and each student immediately completed two reports to summarise the clinical skill procedures and analyse the problems they had encountered in practice. The instructor assessed the reports and released feedback after 1 week.
Tasks after class
Video-aided feedback was used to encourage students' self-reflection on clinical procedures. After each class, each student was instructed to make an appointment to film the entire procedure of the clinical skills trained in the group practice, and to upload the video to the online platform within 2 weeks after the group practice. The instructor graded the uploaded videos on the platform according to a unified scoring rubric, and provided detailed comments for each video. Thus, the students could receive feedback immediately. After completing these tasks, the students took the online examinations.
Examination of theory
The final score of the course was derived from the weighted sum of each assessment. The score was intended to reflect the students' mastery of theory and correct implementation of clinical skills, and to evaluate their performance in practice. The scores were weighted as follows: 10% each for the online quiz, in-class quiz, reports on basic steps, and reports on problem analysis; 20% for the video assignment; and 40% for the online examination.
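As a concrete illustration of this weighting scheme, the minimal sketch below computes a final course score in Python. The component scores are hypothetical values chosen purely for illustration; only the weights are taken from the scheme described above.

```python
# Weighted final course score, following the weighting scheme described above.
# The weights are from the text; the student scores are hypothetical.
weights = {
    "online_quiz": 0.10,
    "in_class_quiz": 0.10,
    "report_basic_steps": 0.10,
    "report_problem_analysis": 0.10,
    "video_assignment": 0.20,
    "online_examination": 0.40,
}

scores = {  # hypothetical component scores on a 0-100 scale
    "online_quiz": 85,
    "in_class_quiz": 90,
    "report_basic_steps": 88,
    "report_problem_analysis": 80,
    "video_assignment": 92,
    "online_examination": 86,
}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%
final_score = sum(weights[k] * scores[k] for k in weights)
print(f"Final course score: {final_score:.1f}")  # 87.1
```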
Evaluation of satisfaction
In order to determine the subjective outcome of the formative assessment methods, an online questionnaire was administered to obtain the students' subjective feedback on the clinical simulation course. The questionnaire was designed using Sojump (http://www.wjx.cn/), and comprised quantitative ratings and qualitative comments. The rating questions covered three main aspects: the student's self-evaluation and improvement, feedback on the formative assessment, and desire for future use. Seven questions were rated on a 5-point Likert scale: 1 point, strongly disagree; 2 points, disagree; 3 points, neutral; 4 points, agree; and 5 points, strongly agree. In addition to the quantitative ratings, the students were invited to make spontaneous, open-ended comments on the best features of the formative assessment, in order to capture student sentiment and retrieve subjective feedback not covered by the quantitative ratings.
The students were instructed to complete the following assignment: ‘Describe your feeling about the formative assessment during your clinical skills training in one or two sentences’. All respondents were instructed to complete the survey online, and their responses were anonymously captured and aggregated in Sojump. Cronbach’s alpha test was used to determine the internal consistency of the responses.
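For reference, Cronbach's alpha for k items is alpha = k/(k-1) x (1 - (sum of item variances)/(variance of total scores)). The sketch below shows one way to compute it in Python; the response matrix is hypothetical, as the study's raw questionnaire data are not reproduced here.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert ratings."""
    k = responses.shape[1]                         # number of items
    item_vars = responses.var(axis=0, ddof=1)      # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (rows: respondents; columns: the 7 questions)
ratings = np.array([
    [4, 5, 4, 4, 5, 4, 5],
    [3, 4, 4, 3, 4, 4, 4],
    [5, 5, 5, 4, 5, 5, 5],
    [4, 4, 3, 4, 4, 3, 4],
    [2, 3, 3, 2, 3, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```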
Objective Structured Clinical Examination
The OSCE performance was used to evaluate the objective outcome of the formative assessment method. Tongji Medical College administers an OSCE for medical students every year, and it is required for students who have practised in a clinic for 1 year. The OSCE comprises 14 workstations, with 6 min allocated for each, separated by 1 min gaps between stations. The examination covers the following topics: consultation, doctor–patient communication, cardiopulmonary auscultation, four items of physical examination and four items of clinical skills. The items for clinical skills included cardiopulmonary resuscitation (CPR), dressing change and suture removal, paediatric physical measurement and Leopold's four manoeuvres. These four items were evaluated based on preparation, patient evaluation, doctor–patient communication, procedures, postoperative treatment, humanistic care and medical ethics, according to the scoring standards of the National Medical Practitioner Examination.
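For orientation, the total examination time per student implied by this format, assuming the 1 min gaps fall only between consecutive stations, is 14 stations × 6 min + 13 gaps × 1 min = 97 min.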
The experimental group consisted of students who experienced formative assessment methods and attended the OSCE in 2019, while the control group comprised students who were traditionally trained and attended the OSCE in 2018. The test results for the four clinical skills assessed in the OSCE in 2018 and 2019 were compared.
Data analysis
Quantitative analysis of the clinical skills test
The quantitative data obtained from the OSCE were analysed using the GraphPad Prism V.8 software package (GraphPad Software, San Diego, California, USA). The data were analysed using an independent samples t-test. The results were presented as mean±SE of the mean, and were considered significantly different at p<0.05 and highly significantly different at p<0.01.
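Although the analysis was performed in GraphPad Prism, an equivalent independent samples t-test can be reproduced in Python. The sketch below uses scipy.stats.ttest_ind on hypothetical score arrays; the study's actual OSCE data are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical OSCE scores for one clinical skill (illustrative values only)
control = np.array([78.2, 81.5, 75.9, 80.1, 79.4, 77.8, 82.0, 76.5])       # traditional training
experimental = np.array([84.1, 86.3, 82.7, 85.5, 83.9, 87.2, 84.8, 85.0])  # formative assessment

t_stat, p_value = stats.ttest_ind(experimental, control)  # independent samples t-test
print(f"control: {control.mean():.1f} ± {stats.sem(control):.1f} (mean ± SE)")
print(f"experimental: {experimental.mean():.1f} ± {stats.sem(experimental):.1f} (mean ± SE)")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # significant at p < 0.05
```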
Evaluation of satisfaction
Two coders (the authors) independently performed the content analysis of the qualitative data obtained from the questionnaire. Thematic analysis of the responses to the open-ended questions was performed using common coding techniques, in which the answers were read, and the main themes were identified within the answers. The keywords obtained from the respondents’ answers were classified as positive, negative or indifferent. Positive responses indicated that the students improved in clinical skills and confidence, and that the course was important and helpful. Negative responses indicated the avoidance of the teaching method, poor outcome or that the course was an unnecessary change from the traditional course. Indifferent answers indicated no preference, lack of sureness or a combination of positive and negative comments. Selected verbatim quotations were extracted for the illustration of key findings.
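As a simplified illustration of this coding scheme, a keyword-based tally might look like the sketch below. The actual coding was performed manually by the two coders; the keyword lists and example responses here are hypothetical.

```python
from collections import Counter

# Hypothetical keyword lists; the real coding was done manually by two coders.
POSITIVE = {"improved", "confident", "helpful", "important", "grateful"}
NEGATIVE = {"unnecessary", "avoid", "reluctant", "poor"}

def code_response(text: str) -> str:
    """Classify a free-text response as positive, negative or indifferent."""
    words = set(text.lower().split())
    pos, neg = bool(words & POSITIVE), bool(words & NEGATIVE)
    if pos and not neg:
        return "positive"
    if neg and not pos:
        return "negative"
    return "indifferent"  # no preference, unsure, or mixed comments

responses = [  # hypothetical example responses
    "The feedback was helpful and I feel confident now",
    "I am not sure whether the formative assessment is necessary",
    "The extra assignments felt unnecessary",
]
print(Counter(code_response(r) for r in responses))
# Counter({'positive': 1, 'indifferent': 1, 'negative': 1})
```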
Results
Participant demographics
A total of 128 students who enrolled in the clinical skills simulation course participated in the study. Among these students, 44% were female and 56% were male, and 98% were above the age of 20. The demographics of the students who enrolled in the course did not significantly differ between the experimental group (present year) and the control group (previous year).
OSCE results
The OSCE results for the four clinical skills were compared for 2018 and 2019. Students who attended the OSCE in 2018 were assigned as the control group, while students who attended the OSCE in 2019 were assigned as the experimental group. The grades for the four clinical skills were significantly higher in the experimental group, when compared with the control group (figure 1, table 2).
Table 2 Scores for the four clinical skills tested in the OSCE in the experimental and control groups (mean±SE of the mean)
Figure 1 Mean scores of the four clinical skills tested in the OSCE. Two groups of medical graduates, trained with traditional and formative assessment methods respectively, underwent the OSCE. The bar graph compares the four clinical skills ((A) CPR; (B) dressing change and suture removal; (C) paediatric physical measurement; (D) Leopold's four manoeuvres); *p<0.05, **p<0.01, ***p<0.001. CPR, cardiopulmonary resuscitation; OSCE, Objective Structured Clinical Examination.
End-of-course survey
Among the 128 students in the study, 122 (95% response rate) participated in the survey at the end of the course. This was the first time these students had enrolled in a course that involved formative assessment methods. The quantitative ratings focused on three main topics: self-evaluation and improvement, feedback and future use of formative assessment. The ratings were positive for all topics (table 3).
Table 3 Ratings of student respondents on their experience with the multisource formative assessment method and clinical skills training (mean±SD*)
The qualitative data captured whether the respondents had a positive, negative or indifferent experience with the formative assessment. Selected verbatim quotations were extracted to illustrate the key findings (figure 2). The majority (81.4%) of students expressed positive sentiment, highlighting the improvement in clinical skills and confidence, and the importance and helpfulness of the course. A typical response was as follows: 'I am grateful for the opportunity to participate in the formative assessment during the training of clinical skills. By receiving feedback from the assessments at different stages, I became confident in my clinical skills, and could not wait to apply these skills in real clinical work'. A smaller proportion of the responses (13.7%) were indifferent, such as: 'I am not sure whether the formative assessment is necessary'. Negative comments (5.7%) indicated avoidance or rejection of the course, the feeling that the formative assessment was unnecessary, or reluctance to take part in the activity.
Figure 2 The percentage of positive, negative and indifferent responses to the open-ended survey question.
Discussion
Highly specialised knowledge, training and continued education are required in clinical medicine. Clinical skill performance is considered a core proficiency that is crucial to professionalism in medical practice, and contributes to successful outcomes in patient care.7 Formative assessment is an effective, developmental mode of instruction that uses students’ performance in learning to guide the upcoming steps of teaching. The goals of this study were to determine whether formative assessment, online resources and feedback can add value to the instruction of clinical skills by promoting student engagement at every stage of the curriculum, and enhancing student performance in practical clinical skills.
Miller's pyramid is one of the most widely used frameworks for evaluating clinical competence; it consists of several tiers that move from cognitive to behavioural performance. The framework starts with factual knowledge (Knows) at the base, followed by applied knowledge (Knows how), before moving to skills performance in a structured environment (Shows how) and ultimately the translation into clinical practice (Does). Formative assessment aligns with Miller's pyramid, using multiple modes of feedback and assessment to facilitate a trajectory from knowing, to knowing how, to showing how, to doing.
Using the OSCE as a progress test can address some of the limitations of workplace-based assessment by providing opportunities for trainees to be directly observed in a standardised and objective manner, and to receive feedback on their performance over time.20 The OSCE results revealed that students who participated in the formative assessment during the clinical skills simulated course (in 2019) achieved significantly higher scores in the clinical skills examination than students who received traditional training (in 2018) (figure 1). This finding suggests that introducing formative assessment methods in clinical skills simulated teaching can improve the clinical performance of medical students. In addition, it was important to analyse the students' perspectives and self-perception of their engagement and learning experience, which were assessed in the end-of-course questionnaire designed to investigate the course's effectiveness and efficiency, and students' satisfaction. The student survey indicated a high level of satisfaction with the newly introduced formative assessment method in the clinical skills simulated teaching. The majority of the students agreed that the formative assessment helped them learn clinical skills and increased their confidence in the clinic.
Motivation to learn
Ideally, a teaching mode should be able to motivate students to learn, in addition to imparting students with knowledge. Duvivier et al suggested that a more active and student-centred approach to clinical skills instruction might be more suitable, when compared with a more traditional approach, such as instructor-centred learning.21 Formative assessment can help students progressively discover their strengths and weaknesses in knowledge, skills, thinking, ability and attitude. Thus, students can be encouraged to think about the areas where they lack knowledge and practice, and initiate solutions to their problems. Furthermore, formative assessment methods compel students to develop a systematic and effective learning plan, and engage in active, rather than passive, learning. More importantly, through formative assessment, students would be more conscious of the learning process, which can help them better improve their abilities.
Feedback
Feedback, which is an outstanding advantage of formative assessment, can make a substantial impact on learning.22 The formative-assessment curriculum designed in this study fully incorporated feedback as the fundamental core of the course framework, as advocated by Boud and Molloy.19 By structuring incremental tasks over time, from online self-study to group clinical practice sessions, the course strung together a sequence of learning activities that enabled continuous, sustainable feedback. After implementing the changes in clinical simulation teaching, the following question was asked: 'Do students use the feedback to improve?' In the formative assessment system constructed in this study, feedback was given immediately in the online and in-class quizzes, report writing and video assignments. This immediate feedback helped the students better understand what they had learnt, highlighted deficiencies from the previous study and training, and resolved problems with the help of the instructors before the students entered the next learning stage.
The video assignment after group practice added depth to the performance evaluation. It is valuable to teachers and students as a self-reflective tool on feedback strategy and content.23 Video feedback provides an avenue for the student and teacher to engage in dialogue, a co-constructed feature that is essential for effective feedback in learning.19 Furthermore, it provides a comprehensive and accurate assessment of the students' clinical skills. As students learn step by step through feedback, their ability to study independently would gradually improve.
The crucial purpose and advantage of feedback practices is to foster active learning, and to cultivate the ability to seek, provide and use feedback, not only in immediate learning tasks, but also in future scenarios beyond the classroom.22 This is particularly beneficial for medical students, who can transfer these capacities for identifying, using and making judgements about feedback to their future clinical practice. The present curriculum was designed with opportunities for eliciting and evaluating feedback throughout the course. Students were encouraged to set their own goals, pace their learning, engage in reciprocal feedback, and reflect on their progress and on the skills that they can carry beyond educational settings. Such acquired capabilities in active learning may explain why students who received the formative assessment curriculum performed better in the OSCE, even though past studies have suggested that performance in clinical skills tests suffers with the passage of time.
In the traditional teaching mode, interactions between students and instructors are very limited. The instructors are available to students only in large classes or group practice, and few students have the opportunity to be personally guided. Therefore, students cannot obtain timely feedback, or may receive no feedback at all, when learning clinical skills. In the present curriculum, by contrast, the increased interaction with students enables instructors to discover common problems in their lesson plans or methods, and the teaching content of subsequent stages can be adjusted accordingly. Overall, the incorporation of formative assessment can improve the quality of education and boost students' enthusiasm to learn.
There were limitations in the study that should be acknowledged. First, two consecutive cohorts were included as the experimental and control groups, which is a limitation of the study design. Although there were no significant differences in demographics between these groups, unaccounted-for confounds could have affected the results. In the future, a randomised controlled experiment could be conducted in a single cohort to verify these findings, in which students are randomised into intervention and control groups and evaluated with a non-summative assessment. Second, the self-reports may be affected by demand characteristics or social desirability bias, in which a student attempts to infer the purpose of the study and answer in a conforming way, or wants to avoid giving negative comments about the subject. This may be mitigated by incorporating other behavioural or physiological measures. Third, further research is needed to explore the time demand of formative assessment. Future studies should examine the time required for using these tools and its impact on learning outcomes.
Conclusion
This study explored the objective and subjective benefits of the formative assessment introduced in the clinical skills simulated course. Several evaluative methods were implemented to enable the formative assessment, and these can be used to effectively monitor the various stages of undergraduate clinical skills curricula. Formative assessment was found to enhance learning enthusiasm, promote active learning and improve clinical practice ability. Nevertheless, further research is required to better understand formative assessment, and to establish an effective evaluation and feedback mechanism for clinical skills teaching.
Data availability statement
Data are available on reasonable request. The datasets used and/or analysed during the study are available from the corresponding author on reasonable request.
Ethics statements
Patient consent for publication
Ethics approval
The experimental protocol was established according to the ethics guidelines of the Declaration of Helsinki, and was approved by the Medical Ethics Committee of Tongji Medical College, Huazhong University of Science and Technology, China (number: 2021-S157). Written informed consent was obtained from each participant before the study was conducted.
Footnotes
XL and GY contributed equally.
Contributors WY: funding acquisition, formal analysis and project administration roles/writing of the original draft; XL: Validation and visualisation; MR: methodology; JG: data curation; MP: software; ZW: data curation; WX: investigation; GY: conceptualisation, formal analysis and writing of the manuscript–review and editing. XL and GY act as guarantors.
Funding The Teaching Research Project was approved by the University (grant number: N/A).
Disclaimer The funding bodies had no role in the design of the study, the collection, analysis, or interpretation of the data, or the writing of the manuscript.
Competing interests None declared.
Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.
Provenance and peer review Not commissioned; externally peer reviewed.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.