
A HCI interface based on hand gestures

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Human–computer interaction (HCI), the study of how people interact with computers, has long been an important and active research field. Although not entirely realistic, the HCI applications depicted in science-fiction films such as Minority Report and Iron Man vividly illustrate the potential of technologies that may soon become available. Compared with traditional keyboard/mouse interfaces, using the hands alone offers a more intuitive and natural way to communicate. Moreover, the growing popularity of ubiquitous computing calls for convenient and portable input devices, making hand-gesture input even more attractive: a smartphone capable of hand-gesture recognition, for example, could compensate for its intrinsically small touch screen or keypad. Rather than relying on data gloves, which capture hand gestures through relatively expensive electronic hardware, we are interested in recognizing the gestures of a bare hand. Existing work has shown that a 2D articulated hand model can be tracked for this purpose. In this paper, we further improve the computational efficiency of such tracking and propose novel interfaces that couple with the hand-tracking system to make it more user-friendly.
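
The abstract only names the underlying technique (tracking a 2D articulated hand model) without detailing it. Purely as a hypothetical illustration of the per-frame processing behind a bare-hand interface, the sketch below uses OpenCV in Python with skin-colour segmentation and convexity-defect counting, a much simpler approach than the paper's articulated-model tracking; all thresholds and helper names here are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch, NOT the paper's method: a rough finger counter based on
# skin-colour segmentation and convexity defects (OpenCV >= 4, Python 3).
import cv2
import numpy as np


def count_extended_fingers(frame_bgr):
    """Return a rough count of extended fingers in a BGR frame (0-5)."""
    # Segment skin-like pixels in YCrCb space (thresholds are illustrative).
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb,
                       np.array((0, 135, 85), np.uint8),
                       np.array((255, 180, 135), np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Keep the largest skin blob and assume it is the hand.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)

    # Convexity defects approximate the valleys between extended fingers.
    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is None or len(hull) < 4:
        return 0
    try:
        defects = cv2.convexityDefects(hand, hull)
    except cv2.error:
        return 0  # hull indices can be non-monotonic on noisy contours
    if defects is None:
        return 0
    deep_valleys = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20)
    return min(deep_valleys + 1, 5)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.putText(frame, f"fingers: {count_extended_fingers(frame)}",
                    (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("hand", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

A real interface would replace this heuristic with a tracked articulated hand model, as the paper does, so that individual finger poses and continuous hand motion can drive the interface rather than a single per-frame finger count.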



Acknowledgments

This work was supported in part by the National Science Council under the Grant NSC 97-2221-E-011-109.

Author information

Corresponding author

Correspondence to Chuan-Kai Yang.


About this article

Cite this article

Yang, CK., Chen, YC. A HCI interface based on hand gestures. SIViP 9, 451–462 (2015). https://doi.org/10.1007/s11760-013-0462-1

