Hand gesture recognition-based non-touch character writing system on a virtual keyboard

Multimedia Tools and Applications

Abstract

The non-touch system is a modern approach to computer-interface technology with the potential to revolutionize human-computer interaction. Such an interface allows the user to input data and interact with a human, machine, or robot in uncontrolled environments, such as medical treatment or industrial settings. However, inputting data and interacting with a machine is challenging under a variety of complications, such as cluttered environments, gesture tracking, and input speed. Many evolving systems, for example aerial handwriting, sign language recognition, and finger-alphabet recognition, require substantial effort to learn every character and incur processing overhead, which reduces classification accuracy. Therefore, this paper proposes a non-touch character writing system that allows users to operate an on-screen virtual keyboard in a safe and hygienic way by recognizing a few hand gestures. The work has two parts: (a) hand gesture recognition and (b) gestural flick input on a virtual keyboard. A user-friendly keyboard interface that uses a flick input method is displayed on the monitor. A deep learning method based on a convolutional neural network (CNN) extracts the features of a gesture. Before feature extraction, color segmentation detects the hand: skin-colored pixels are obtained by converting the input image to HSV (hue, saturation, value) space and applying a threshold mask. Finally, a support vector machine (SVM) classifies the hand gestures more accurately. The user performs non-touch character input through the gestural flick system, entering characters while viewing the virtual keyboard; each input is executed based on the recognized hand gesture. The system is evaluated on the average classification accuracy of hand gestures and character recognition, as well as input accuracy and speed, and is compared with state-of-the-art algorithms. Experimental results show that the proposed system recognizes seven typical gesture functions and inputs characters with 97.93% accuracy, demonstrating its superiority over the state-of-the-art algorithms.
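As a minimal sketch of the pipeline the abstract describes (HSV-based hand segmentation, then CNN feature extraction, then SVM gesture classification), the outline below uses Python with OpenCV, Keras, and scikit-learn. The HSV bounds, network shape, and stand-in data are illustrative assumptions, not values taken from the paper.

```python
# Illustrative sketch only: thresholds, layer sizes, and data are assumptions.
import cv2
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC

def segment_hand(frame_bgr):
    """Binary hand mask via HSV thresholding (assumed skin-tone bounds)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    # Morphological opening suppresses speckle noise from cluttered scenes.
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def build_feature_extractor(input_shape=(64, 64, 1)):
    """Small CNN whose final dense layer serves as the gesture feature vector."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
    ])

# Stand-in camera frame; a real system would read from a video capture device.
frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)
hand_mask = segment_hand(frame)  # binary mask of candidate hand pixels

# Stand-in training data: 70 fake 64x64 masks, labels for 7 gesture classes.
masks = np.random.rand(70, 64, 64, 1).astype("float32")
labels = np.random.randint(0, 7, size=70)

extractor = build_feature_extractor()
features = extractor.predict(masks, verbose=0)    # (70, 128) feature vectors
svm = SVC(kernel="rbf").fit(features, labels)     # multiclass SVM over gestures
print(svm.predict(features[:5]))                  # predicted gesture ids
```

In the actual system, the SVM would be trained on CNN features extracted from real segmented gesture images, with its seven output classes mapped to the virtual keyboard's flick operations.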



Author information

Corresponding author

Correspondence to Jungpil Shin.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Rahim, M.A., Shin, J. & Islam, M.R. Hand gesture recognition-based non-touch character writing system on a virtual keyboard. Multimed Tools Appl 79, 11813–11836 (2020). https://doi.org/10.1007/s11042-019-08448-6

