Case-Based Teaching: Does the Addition of High-Fidelity Simulation Make a Difference in Medical Students’ Clinical Reasoning Skills?

  • Original research
  • Published in Medical Science Educator

Abstract

Context

Situativity theory posits that learning and the development of clinical reasoning skills are grounded in context. In case-based teaching, this context is created by recreating the clinical environment, either through emulation (for example, with a manikin) or through description alone. In this study, we sought to determine whether students’ clinical reasoning abilities differ after facilitated patient case scenarios conducted with or without a manikin.

Methods

Fourth-year medical students in an internship readiness course were randomized to patient case scenarios without a manikin (control group) or with a manikin (intervention group) for a chest pain session. The control and intervention groups had identical student-led case progression and faculty debriefing objectives. Clinical reasoning skills were assessed after the session using a 64-question script concordance test (SCT), which was developed and piloted prior to administration. Hospitalist and emergency medicine faculty responses to the test items served as the expert standard for scoring.
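For readers unfamiliar with SCT scoring, script concordance tests are commonly scored with an aggregate (partial-credit) method: a student’s response to an item earns credit in proportion to the number of expert panelists who chose that response, normalized so that the modal panel response is worth one point. The Python sketch below illustrates this general method under assumed inputs; the function names, panel size, and response values are illustrative and are not the scoring code used in this study.

```python
from collections import Counter

def score_sct_item(student_answer, panel_answers):
    """Aggregate (partial-credit) scoring for one SCT item.

    Credit = (number of panelists choosing the student's answer)
             / (number of panelists choosing the modal answer),
    so the modal panel answer earns a full point.
    """
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return counts.get(student_answer, 0) / modal_count

def score_sct(student_answers, panel_answers_per_item):
    """Total SCT score: sum of per-item credits across all items."""
    return sum(score_sct_item(ans, panel)
               for ans, panel in zip(student_answers, panel_answers_per_item))

# Hypothetical item rated on a -2..+2 Likert scale by 10 panelists.
panel = [1, 1, 1, 1, 1, 1, 2, 2, 2, 0]
print(score_sct_item(1, panel))   # 1.0 -- modal answer, full credit
print(score_sct_item(2, panel))   # 0.5 -- chosen by 3 of the 6 modal votes
print(score_sct_item(-2, panel))  # 0.0 -- no panelist chose this response
```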

Results

Ninety-six students were randomized to case-based sessions with (n = 48) or without (n = 48) a manikin. Ninety students completed the SCT (with manikin n = 45, without manikin n = 45). A statistically significant difference in mean test performance between the two groups was found (t = 3.059, df = 88, p = .003), with the manikin group achieving higher SCT scores.
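The comparison reported above corresponds to an independent-samples t test with df = 88 (45 students per group). As a minimal sketch, assuming two arrays of SCT scores, such a comparison could be computed as follows; the score values below are placeholders, not the study’s data.

```python
import numpy as np
from scipy import stats

# Placeholder SCT scores for the two groups (45 students each);
# these are illustrative stand-ins, NOT the study's data.
rng = np.random.default_rng(0)
scores_with_manikin = rng.normal(loc=75, scale=8, size=45)
scores_without_manikin = rng.normal(loc=70, scale=8, size=45)

# Independent-samples t test; with equal variances assumed,
# df = n1 + n2 - 2 = 88, matching the reported degrees of freedom.
t_stat, p_value = stats.ttest_ind(scores_with_manikin, scores_without_manikin)
df = len(scores_with_manikin) + len(scores_without_manikin) - 2
print(f"t = {t_stat:.3f}, df = {df}, p = {p_value:.3f}")
```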

Conclusion

Use of a manikin in a simulated patient case discussion significantly improves students’ clinical reasoning skills, as measured by the SCT. These results suggest that using a manikin to simulate a patient scenario situates learning, thereby enhancing skill development.


Acknowledgments

We thank Dr. Meredith Thompson for her significant contribution to this project.

Funding

This project was supported by an internal Educational Fellowship Award at the University of Virginia School of Medicine.

Author information


Contributions

All authors contributed to the study conception and design. Material preparation and data collection were performed by M. Kathryn Mutter, James R. Martindale, Neeral Shah, and Stephen J. Wolf. Data analysis was performed by James R. Martindale and M. Kathryn Mutter. Supervision was provided by James R. Martindale, Stephen J. Wolf, and Maryellen E. Gusic. The first draft of the manuscript was written by M. Kathryn Mutter, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Mary Kathryn Mutter.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

This research project was performed in accordance with the ethical standards as laid down in the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. The University of Virginia Institutional Review Board for Social and Behavioral Sciences declared the study exempt (Reference number 2018006700).

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

ESM 1

(PDF 236 kb)


About this article


Cite this article

Mutter, M.K., Martindale, J.R., Shah, N. et al. Case-Based Teaching: Does the Addition of High-Fidelity Simulation Make a Difference in Medical Students’ Clinical Reasoning Skills? Med Sci Educ 30, 307–313 (2020). https://doi.org/10.1007/s40670-019-00904-0
