
A Perceptual Study of the Relationship between Posture and Gesture for Virtual Characters

  • Conference paper, Motion in Games (MIG 2012)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7660)

Abstract

Adding expressive body motions that synchronize appropriately with gestures is a key element in creating a lively, intelligent virtual character. However, little is known about the relationship between body motion and gesture, or about how sensitive people are to the errors produced when gesture and body motion are desynchronized. In this paper, we investigated a motion splicing technique for aligning body motion with gesture and studied people's sensitivity to desynchronized body motion through two experiments. The first experiment was designed to determine whether audio affects people's sensitivity to desynchronization errors and to explore the role of Posture-Gesture Mergers in the transferability of body motion. A motion distance metric for measuring the distance between stylistically varied body motions is proposed and evaluated in the second experiment. The experiments revealed that audio does not affect the recognition rate, that the presence of Posture-Gesture Mergers in the source motion lowers output quality, and that people's sensitivity to motion realism is related to an energy distance metric.
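The abstract relates perceived motion realism to an energy distance between body motions. Below is a minimal sketch of one way such an energy-based distance could be computed, assuming each motion clip is stored as a NumPy array of joint positions of shape (frames, joints, 3) sampled at a fixed frame rate; the function name, the unit-mass assumption, and the default frame rate are illustrative choices, not details taken from the paper.

```python
# Sketch of an energy-style distance between two motion clips.
# Assumes clips are NumPy arrays of shape (frames, joints, 3) holding
# joint positions, both skeletons having the same joint count.
import numpy as np


def energy_distance(clip_a: np.ndarray, clip_b: np.ndarray, fps: float = 30.0) -> float:
    """Compare two clips by their per-joint average kinetic-energy proxy."""

    def mean_energy(clip: np.ndarray) -> np.ndarray:
        # Finite-difference joint velocities (position units per second).
        velocities = np.diff(clip, axis=0) * fps
        # 0.5 * |v|^2 per joint per frame, assuming unit mass for every joint.
        energy = 0.5 * np.sum(velocities ** 2, axis=-1)
        # Average over frames: one energy value per joint.
        return energy.mean(axis=0)

    # Euclidean distance between the two clips' per-joint energy profiles.
    return float(np.linalg.norm(mean_energy(clip_a) - mean_energy(clip_b)))
```

Under this reading, splicing a source body motion whose energy profile is far from that of the gesture clip would be expected to look less natural, which mirrors the kind of relationship the second experiment probes; the paper's actual metric may differ in detail.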

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Luo, P., Neff, M. (2012). A Perceptual Study of the Relationship between Posture and Gesture for Virtual Characters. In: Kallmann, M., Bekris, K. (eds) Motion in Games. MIG 2012. Lecture Notes in Computer Science, vol 7660. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34710-8_24

  • DOI: https://doi.org/10.1007/978-3-642-34710-8_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-34709-2

  • Online ISBN: 978-3-642-34710-8

  • eBook Packages: Computer Science, Computer Science (R0)
