3D Visibility Check in Webots for Human Perspective Taking in Human-Robot Interaction

  • Conference paper
Robot Intelligence Technology and Applications 3

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 345)

Abstract

The rapid development of intelligent robotics suggests that humans and robots will live and work together in shared workspaces in the near future, which makes research on effective human-robot interaction essential for future robotics. The most common interaction scenario is cooperative work, in which a robot should provide appropriate assistance to help a human achieve a goal. The workspace contains several objects, including tools, and the robot must identify the object or tool the human intends to use. Because of obstacles in the environment, the situation may look different from the robot's perspective than from the human's. The robot therefore needs to take the human perspective and simulate the situation from that viewpoint to identify the human-intended object. As a prerequisite for human perspective taking, the robot must first check the visibility of the environment from its own viewpoint. To address this challenge, this paper develops a 3D visibility check method that uses a depth image in Webots. With the developed method, a robot can determine whether each point in the environment is visible or invisible from its current posture, and detect objects when they are visible.
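The core idea described in the abstract, checking whether a 3D point is visible by comparing its projected depth against a depth image, can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a standard pinhole camera model with hypothetical intrinsics (`fx`, `fy`, `cx`, `cy`) and a depth image such as one returned by a Webots range-finder device:

```python
import numpy as np

def is_visible(point_cam, depth_image, fx, fy, cx, cy, tol=0.01):
    """Return True if a 3D point (in camera coordinates, z pointing
    forward) is visible given the camera's depth image.

    A point is visible when it projects inside the image and its depth
    is not greater than the depth recorded at that pixel (i.e. nothing
    closer occludes it), up to a small tolerance `tol` in meters.
    """
    x, y, z = point_cam
    if z <= 0:
        return False  # behind the camera
    # Project the point onto the image plane (pinhole model).
    u = int(round(fx * x / z + cx))
    v = int(round(fy * y / z + cy))
    h, w = depth_image.shape
    if not (0 <= u < w and 0 <= v < h):
        return False  # outside the field of view
    # Visible if the depth buffer at that pixel is not closer than the point.
    return z <= depth_image[v, u] + tol
```

Iterating this test over sampled environment points, once from the robot's pose and once from the estimated human pose, gives the per-point visible/invisible labeling the paper uses for perspective taking.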



Corresponding author

Correspondence to Ji-Hyeong Han.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Han, JH., Kim, JH. (2015). 3D Visibility Check in Webots for Human Perspective Taking in Human-Robot Interaction. In: Kim, JH., Yang, W., Jo, J., Sincak, P., Myung, H. (eds) Robot Intelligence Technology and Applications 3. Advances in Intelligent Systems and Computing, vol 345. Springer, Cham. https://doi.org/10.1007/978-3-319-16841-8_24

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-16841-8_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-16840-1

  • Online ISBN: 978-3-319-16841-8

  • eBook Packages: Engineering (R0)
