Şükrü Aydın Düzgün1, Sezgin Zeren2, Zülfü Bayhan2

1Department of General Surgery, Jimer Hospital, Bursa, Turkey
2Department of General Surgery, Dumlupınar University School of Medicine, Kütahya, Turkey

Abstract

Objective: The objectivity and reliability of examination methods are controversial. We subjected fourth-year medical students to a specially designed verbal exam, which we called the objectively structured verbal examination (OSVE). We aimed to evaluate student feedback on the OSVE as an instrument for assessing surgical knowledge.
Material and Methods: OSVE modules were developed according to the learning goals of the surgical clerkship. Upon finishing the surgery rotation, students took the OSVE as part of their final evaluation. Students’ perception of the OSVE was assessed by their responses to a questionnaire.
Results: Forty-two of 58 students returned completed questionnaires. Seventy-two percent of the students accepted the OSVE as an objective tool, and 86% found that it enabled unbiased evaluation. Overall, most students expressed positive feedback regarding the OSVE.
Conclusion: The feedback received from students showed that the OSVE is a reliable and objective method for assessing their knowledge and indicates that the OSVE merits further development and enhancement.

Keywords: Medical education, exam, clinical reasoning

Introduction

Medical education has changed dramatically over the years. The transfer of hands-on experience from master to pupil has largely been replaced by formal lectures, case studies, and practical applications (1, 2). The knowledge, clinical reasoning, and problem-solving skills that students acquire during surgical clerkship programs can be assessed in several ways. Oral exams with open-ended questions (OEQ), multiple-choice questions (MCQ), and essay writing are commonly used. However, the objectivity and reliability of these examination methods remain controversial (3). We therefore subjected students to a specially designed verbal exam, which we called the objectively structured verbal examination (OSVE).

The OSVE consists of several exam modules; each module addresses a specific surgical problem through a hypothetical case scenario. The main objective of the OSVE is to assess the student’s problem-solving efficiency when confronting different surgical problems. In the present study, students’ perception of the OSVE was determined by their responses to a questionnaire, and its reliability is discussed.

Material and Methods

This study was performed in the Department of Surgery of Dumlupınar University School of Medicine. Ethical approval was obtained from the institution’s ethics committee, and the participants’ consent was obtained as stated in the questionnaire. Our undergraduate medical education program requires a 9-week surgical clerkship in the fourth year. Upon completing their surgery rotations, students took the OSVE as part of the final evaluation of their performance. Briefly, several exam modules were developed according to the objectives of the surgical clerkship. All modules were designed to evaluate students’ understanding, diagnostic approach, and decision-making regarding basic surgical problems such as abdominal pain, breast mass, thyroid nodule, and gastrointestinal bleeding. Each module starts with the complaints of a hypothetical patient with a surgical problem and continues with sections covering differential diagnosis, diagnostic workup, and management. Each section has a list of expected answers, and the examiner ticks each correct answer given by the student. A module’s score is the sum of the points collected from the ticked answers; a minimal sketch of this checklist scoring is given below. An example module is given in Table 1. The students were assigned to examiners; each examiner presented the modules to the student and calculated the total score from the correct answers. After the evaluation period had finished and the students’ scores were announced, the students were given a questionnaire to express their perceptions of the OSVE. The questionnaire consisted of 10 statements, 6 expressing positive opinions and 4 expressing negative ones, which students rated from 1 to 5, as described in Table 2.
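
To make the checklist scoring concrete, the following Python sketch models one module as a scenario with sections of expected answers. It is illustrative only: the section contents and point values are hypothetical, since the paper does not specify how individual answers are weighted.

```python
# Illustrative sketch of an OSVE module as a scored checklist.
# Answer lists and point values are hypothetical examples, not the
# actual exam content described in the paper.

MODULE_ABDOMINAL_PAIN = {
    "scenario": "A patient presents with acute-onset abdominal pain.",
    "sections": {
        "differential_diagnosis": {
            "acute appendicitis": 5,
            "perforated peptic ulcer": 5,
            "acute cholecystitis": 5,
        },
        "diagnostic_workup": {
            "complete blood count": 3,
            "abdominal ultrasound": 3,
        },
        "management": {
            "surgical consultation": 4,
        },
    },
}

def score_module(module, ticked_answers):
    """Sum the points for every expected answer the examiner ticked."""
    total = 0
    for section, expected in module["sections"].items():
        ticked = ticked_answers.get(section, set())
        for answer, points in expected.items():
            if answer in ticked:
                total += points
    return total

# Example: the examiner ticked two differential diagnoses and one test.
ticks = {
    "differential_diagnosis": {"acute appendicitis", "acute cholecystitis"},
    "diagnostic_workup": {"abdominal ultrasound"},
}
print(score_module(MODULE_ABDOMINAL_PAIN, ticks))  # -> 13
```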

Results

A total of 58 students took the OSVE modules, and 42 returned completed questionnaires. Scores of 5 and 4 were accepted as “agreement,” 3 as “not sure,” and 2 and 1 as “disagreement.” Seventy-two percent of the students found the exam objective, and 86% affirmed that it enabled unbiased evaluation. Ninety percent agreed that the exam was well organized, and 85% said that the hypothetical cases were relevant to real life. Thirty percent of the students said they were unfamiliar with this type of exam and had difficulty responding to the questions, whereas 65% expressed the opposite opinion. Thirty percent found the exam stressful. The results are summarized in Table 3, and a sketch of the score aggregation is shown below.
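
As a minimal sketch of the aggregation used above, the following Python snippet collapses 1-to-5 ratings into the three reported categories and converts the counts to percentages. The per-student responses here are invented for illustration; the paper reports results only in aggregate.

```python
from collections import Counter

def summarize_likert(scores):
    """Collapse 1-5 ratings into the three reported categories
    (5 and 4 = agreement, 3 = not sure, 2 and 1 = disagreement)
    and return each category's share as a rounded percentage."""
    buckets = Counter()
    for s in scores:
        if s >= 4:
            buckets["agreement"] += 1
        elif s == 3:
            buckets["not sure"] += 1
        else:
            buckets["disagreement"] += 1
    n = len(scores)
    return {k: round(100 * v / n) for k, v in buckets.items()}

# Hypothetical responses of 42 students to one questionnaire item.
responses = [5] * 20 + [4] * 10 + [3] * 6 + [2] * 4 + [1] * 2
print(summarize_likert(responses))
# -> {'agreement': 71, 'not sure': 14, 'disagreement': 14}
# (shares may not sum to 100 because of rounding)
```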

Discussion

Assessing the clinical reasoning skills of students is an essential part of evaluating the surgical clerkship period (4). There are many ways to evaluate medical students’ knowledge and their ability to solve clinical problems (5). Our surgery department uses three steps to evaluate students’ accomplishments during their surgery clerkship. First, the students maintain a logbook recording the teaching activities in which they participate during the clerkship, such as taking histories, examining patients, and performing basic procedures. Second, they take a specially designed verbal exam, which we called the OSVE. Finally, they take a multiple-choice test.

Learning outcomes describe what a student is expected to know, understand, and/or be able to demonstrate after completing the surgical clerkship (6). As part of the Bologna process, our department defined learning goals for undergraduate surgical education, and we developed several OSVE modules around these goals. Each module was based on a specific surgical problem. For example, the abdominal pain module evaluates the standard approach to a patient presenting with acute-onset abdominal pain: which questions should be asked, which signs are expected, the differential diagnoses, and the required tests.

We surveyed students’ perceptions of the OSVE using a questionnaire. Most of the students agreed that the exam allows unbiased and objective measurement of their knowledge, and most affirmed that the hypothetical cases were consistent with the real-life medical situations they observed during the clerkship. Some students reported experiencing anxiety; however, most found the cases and questions easy to understand. Taken together, most of the students expressed positive feedback regarding the OSVE.

The objectivity and reliability of examination methods are crucial. The traditional methods, MCQ and OEQ, have been used extensively. MCQs provide an objective assessment of knowledge and facilitate the evaluation of large numbers of students. However, MCQs supply written options, may limit students’ creative thinking, and cannot measure bedside clinical problem-solving ability (7). Oral exams with OEQs may evaluate students’ competence in clinical reasoning and problem solving; however, the objectivity of the examiner is arguable, and examiner bias may occur (8). We propose that the OSVE combines two important elements for gauging knowledge. First, students must express their knowledge without being given options; they formulate their answers freely, which requires genuine mental effort to solve the given medical problem. Second, the expected answers are written on the examiner’s sheet; thus, the objectivity of the exam is ensured and examiner bias is prevented.

For the assessment of medical knowledge, structured examinations such as the objectively structured clinical examination (OSCE) and the objectively structured practical examination (OSPE) have previously been developed, used, and reported (9, 10). These methods have been found to be objective, valid, and reliable assessment tools that eliminate examiner bias (11). We developed the OSVE as a modification of the OSCE and OSPE; the major difference is that real patients are not used in the OSVE. Although testing students’ knowledge when facing real patients is extremely valuable, it is not always practical in every setting: the number of students may be a limiting factor, it is not easy to provide enough patients on the exam day, and it may infringe on patients’ rights and cause discomfort for some patients. For these reasons, we spread the testing of our students’ bedside performance throughout the clerkship period, as they confronted real patients during rounds or outpatient visits, and used the OSVE at the end of the period to evaluate their problem-solving abilities.

This study evaluates only the students’ viewpoint on a specific examination method. We do not know exactly how performance in the OSVE extrapolates to a student’s true extent of knowledge. The students’ performance in the national residency entrance exam may provide some data on whether their OSVE scores had any impact on their overall exam success (12); because our students have not yet graduated, we do not have this information. However, written exams cannot evaluate medical graduates’ clinical abilities; they measure only static, factual knowledge (13). Therefore, continuing to use methods such as the OSVE to pursue our determined learning goals seems appropriate.

Conclusion

From the students’ perspective, the OSVE provides a reliable and objective measurement of knowledge. The feedback received through the questionnaire may lead us to use the OSVE more widely, with improvements, in the future, and it is considered valuable for the further development and enhancement of the OSVE.

Cite this paper as: Düzgün ŞA, Zeren S, Bayhan Z. Objectively structured verbal examination to assess surgical clerkship education: an evaluation of students’ perception. Turk J Surg 2018; 34: 9-12.

Ethics Committee Approval

Ethics committee approval was received for this study from the ethics committee of Dumlupınar University School of Medicine.

Peer Review

Externally peer-reviewed.

Author Contributions

Concept - S.A.D.; Design - S.A.D.; Supervision - S.A.D., S.Z., Z.B.; Resource - S.A.D., S.Z., Z.B.; Materials - S.A.D., S.Z., Z.B.; Data Collection and/or Processing - S.A.D., S.Z., Z.B.; Analysis and/or Interpretation - S.A.D., S.Z., Z.B.; Literature Search - S.A.D.; Writing Manuscript - S.A.D.; Critical Reviews - S.Z., Z.B.

Conflict of Interest

No conflict of interest was declared by the authors.

Financial Disclosure

The authors declared that this study has received no financial support.

Acknowledgments

We thank our students for enrolling in the study and for their valuable contributions during the planning and development of this work.

References

  1. Goldstein EA, Maestas RR, Fryer-Edwards K, Wenrich MD, Oelschlager AM, Baernstein A, et al. Professionalism in medical education: an institutional challenge. Acad Med 2006; 81: 871-876.
  2. Norman G. Fifty years of medical education research: waves of migration. Med Educ 2011; 45: 785-791.
  3. Larsen DP, Butler AC, Roediger HL 3rd. Test-enhanced learning in medical education. Med Educ 2008; 42: 959-966.
  4. Elstein AS. Analytic methods and medical education. Problems and prospects. Med Decis Making 1983; 3: 279-284.
  5. Schuwirth L, Cantillon P. The need for outcome measures in medical education. BMJ 2005; 331: 977-978.
  6. van der Vleuten C. Improving medical education. BMJ 1993; 306: 284-285.
  7. Monroe KS. The relationship between assessment methods and self-directed learning readiness in medical education. Int J Med Educ 2016; 7: 75-80.
  8. Dijksterhuis MG, Schuwirth LW, Braat DD, Teunissen PW, Scheele F. A qualitative study on trainees’ and supervisors’ perceptions of assessment for learning in postgraduate medical education. Med Teach 2013; 35: 396-402.
  9. Patricio MF, Juliao M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach 2013; 35: 503-514.
  10. Harden RM. Evolution or revolution and the future of medical education: replacing the oak tree. Med Teach 2000; 22: 435-442.
  11. Petrusa ER. Structuring clinical medical education: a problem specific, performance based framework. Annu Conf Res Med Educ 1981; 20: 175-180.
  12. De Champlain AF, Cuddy MM, Scoles PV, Brown M, Swanson DB, Holtzman K, et al. Progress testing in clinical science education: results of a pilot project between the National Board of Medical Examiners and a US Medical School. Med Teach 2010; 32: 503-508.
  13. Carney PA, Palmer RT, Fuqua Miller M, Thayer EK, Estroff SE, Litzelman DK, et al. Tools to assess behavioral and social science competencies in medical education: a systematic review. Acad Med 2016; 91: 730-742.