
Studying Relationships of Muscle Representations and Levels of Interactivity in a Canine Anatomy VR Environment

  • Conference paper
  • First Online:
HCI International 2019 - Posters (HCII 2019)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1033)


Abstract

Virtual Reality (VR) is at the forefront of modern technology, revolutionizing how activities such as gaming, training simulations, and education are conducted. In anatomy education specifically, students must learn the form, function, and movement of the bones, muscles, muscle tendons, ligaments, and joints of the body. Historically, cadaver dissection has been regarded as the optimal method of study, but it is not always accessible. We created a VR canine thoracic limb application that allows students to learn about musculoskeletal movements while dynamically interacting with anatomical visualizations, with the aim of increasing memory retention in a more immersive and engaging way. Our study considered three major factors: (1) the spatial visualization ability of learners, (2) the visualization style of the muscles, and (3) the level of interactivity of the application. Participants of differing spatial abilities (high and low) studied a virtual thoracic limb in one of two visual conditions (realistic muscles or symbolic muscles) and one of two interactive conditions (interactive manipulation or non-interactive viewing). We tested these conditions against each other to determine which method of muscle representation is most effective for memory retention, and what role interactivity plays in that retention. Before the experiment, we measured students' spatial visualization ability with a mental rotation test to establish a baseline. After the experiment, we interviewed the participants to gather qualitative data about the application's effectiveness and usability. Across 24 user studies, we observed that low spatial visualization users gained an advantage through dynamic visualization learning, performing almost as well as their high spatial visualization counterparts.
Realistic muscles helped participants identify anatomical views more efficiently, and that condition therefore achieved a significantly better average score than the symbolic representation.
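The study design described above is a 2×2 between-subjects comparison (visual style × interactivity), with participants additionally split into high and low spatial ability groups by a pre-test. The sketch below is a hypothetical illustration of how such conditions could be enumerated, balanced across 24 participants, and summarized; the function and variable names are our own and are not from the paper.

```python
import itertools
import random

# The four experimental conditions: visual style x interactivity.
VISUAL_STYLES = ("realistic", "symbolic")
INTERACTIVITY = ("interactive", "non-interactive")
CONDITIONS = list(itertools.product(VISUAL_STYLES, INTERACTIVITY))


def assign_conditions(participants, seed=0):
    """Shuffle participants, then assign conditions round-robin so
    each condition receives an equal share of the sample."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    return {p: CONDITIONS[i % len(CONDITIONS)]
            for i, p in enumerate(shuffled)}


def mean_score(results, **filters):
    """Average retention score over result rows matching the filters,
    e.g. mean_score(rows, visual='realistic') for one factor level."""
    rows = [r for r in results
            if all(r[k] == v for k, v in filters.items())]
    return sum(r["score"] for r in rows) / len(rows)


# Example: 24 participants, as in the reported study,
# yielding 6 participants per condition.
participants = [f"P{i:02d}" for i in range(1, 25)]
assignment = assign_conditions(participants)
```

With 24 participants this yields six per condition, which is the kind of balanced cell size a 2×2 comparison of group means relies on.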



Author information


Correspondence to Ben Heymann, Preston White, or Jinsil Hwaryoung Seo.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Heymann, B., White, P., Seo, J.H. (2019). Studying Relationships of Muscle Representations and Levels of Interactivity in a Canine Anatomy VR Environment. In: Stephanidis, C. (eds) HCI International 2019 - Posters. HCII 2019. Communications in Computer and Information Science, vol 1033. Springer, Cham. https://doi.org/10.1007/978-3-030-23528-4_53


  • DOI: https://doi.org/10.1007/978-3-030-23528-4_53

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-23527-7

  • Online ISBN: 978-3-030-23528-4

  • eBook Packages: Computer Science (R0)
