Human-Like Face and Head Mechanism

Reference work entry in Humanoid Robotics: A Reference

Abstract

Nonverbal information plays an important role in face-to-face communication between humans, and facial expressions in particular convey a large amount of information. For robots, likewise, the ability to produce facial expressions is essential for conveying their inner states to humans, and many types of robots with this ability have already been developed.

The human face has 46 mimic muscles, and facial expressions are produced by the movement of the skin driven by these muscles. It is not practical, however, to replicate all of these muscles on a robotic head, mainly because of size limitations on the actuators and mechanisms. Many approaches have therefore tried to configure a limited number of degrees of freedom (DoFs) so as to achieve an expressive robotic head.
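
As a rough sketch of what configuring a limited DoF budget can mean in practice, the example below blends a few FACS-style action-unit intensities into commands for a handful of actuators. The actuator names, the chosen action units, and the coupling weights are purely illustrative assumptions and are not taken from any particular robot discussed in this entry.

```python
import numpy as np

# Hypothetical actuator set for a robotic head with a small DoF budget.
# Names and count are illustrative assumptions, not a specific robot design.
ACTUATORS = ["brow_inner", "brow_outer", "upper_eyelid", "mouth_corner", "jaw"]

# A few FACS-style action units (rows) coupled to the actuator DoFs above
# (columns). Real heads tune such couplings by hand or from measured face
# motion; the weights below are placeholders for illustration only.
ACTION_UNITS = ["AU1_inner_brow_raiser", "AU4_brow_lowerer",
                "AU12_lip_corner_puller", "AU26_jaw_drop"]
AU_TO_DOF = np.array([
    [ 1.0,  0.3, 0.0, 0.0, 0.0],   # AU1: mostly inner brow
    [-0.8, -0.5, 0.2, 0.0, 0.0],   # AU4: brows pulled down, slight eyelid
    [ 0.0,  0.0, 0.0, 1.0, 0.1],   # AU12: mouth corners, touch of jaw
    [ 0.0,  0.0, 0.0, 0.2, 1.0],   # AU26: jaw drop with small mouth coupling
])

def au_to_actuator_commands(intensities: dict) -> dict:
    """Blend action-unit intensities (0..1) into clipped actuator commands (-1..1)."""
    vec = np.array([intensities.get(au, 0.0) for au in ACTION_UNITS])
    commands = np.clip(vec @ AU_TO_DOF, -1.0, 1.0)
    return dict(zip(ACTUATORS, commands.round(2)))

# Example: a smile dominated by lip-corner pull with a slight jaw drop.
print(au_to_actuator_commands({"AU12_lip_corner_puller": 0.9, "AU26_jaw_drop": 0.3}))
```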

Robotic heads adopt one of two styles of facial expression: humanlike or symbolic. A head with humanlike facial expressions carries skin on its surface and produces expressions by deforming it; many approaches have tried to make this deformation as natural as possible. A head with symbolic facial expressions, in contrast, has a rigid surface on which facial parts are moved to show expressions, and researchers have aimed for simple yet expressive mechanisms. In addition, some robots can make facial expressions not seen on human faces, inspired by comics, animation, or animals.
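
To make the contrast between the two styles concrete, the minimal sketch below drives the same expression label either as continuous skin-control-point displacements or as discrete poses of rigid facial parts. All class names, pose tables, and units are hypothetical and serve only to illustrate the distinction.

```python
from abc import ABC, abstractmethod

class ExpressionHead(ABC):
    """Common interface: both styles turn an expression label into motor targets."""
    @abstractmethod
    def show(self, expression: str) -> dict:
        ...

class SkinDeformationHead(ExpressionHead):
    """Humanlike style: continuous displacement (mm) of skin control points."""
    POSES = {"smile": {"cheek_left": 4.0, "cheek_right": 4.0, "lip_corner": 6.0},
             "surprise": {"brow_left": 5.0, "brow_right": 5.0, "jaw": 8.0}}
    def show(self, expression: str) -> dict:
        return self.POSES.get(expression, {})

class RigidPartHead(ExpressionHead):
    """Symbolic style: discrete poses of rigid facial parts on a hard surface."""
    POSES = {"smile": {"eyebrow_angle_deg": 10, "mouth_shape": "arc_up"},
             "surprise": {"eyebrow_angle_deg": 25, "mouth_shape": "circle"}}
    def show(self, expression: str) -> dict:
        return self.POSES.get(expression, {})

for head in (SkinDeformationHead(), RigidPartHead()):
    print(type(head).__name__, head.show("smile"))
```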

Author information

Correspondence to Tatsuhiro Kishi.

Copyright information

© 2019 Springer Nature B.V.

Cite this entry

Kishi, T., Hashimoto, K., Takanishi, A. (2019). Human-Like Face and Head Mechanism. In: Goswami, A., Vadakkepat, P. (eds) Humanoid Robotics: A Reference. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-6046-2_89
