
Kinect cane: an assistive system for the visually impaired based on the concept of object recognition aid

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

This paper proposes a novel concept for assisting visually impaired individuals in recognizing three-dimensional objects in everyday environments. The concept is realized as a portable system consisting of a white cane, a Microsoft Kinect sensor, a numeric keypad, a tactile feedback device, and other components. Using the Kinect sensor, the system searches for an object that the user instructs it to find and returns the search result to the user via the tactile feedback device. The major advantage of the system is its ability to recognize objects of various classes, such as chairs and staircases, beyond the detectable range of a white cane. Furthermore, the system is designed to return only the minimum information required by the user's instruction, so that the user can obtain the necessary information more efficiently. The system is evaluated through two types of experiments: an object recognition test and a user study. The experimental results indicate that the system is a promising means of helping visually impaired users recognize objects.
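The interaction loop the abstract describes (keypad selects an object class, the system searches depth data for it, and only that result is reported) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the keypad mapping, and the toy staircase detector are all assumptions introduced for illustration.

```python
# Hypothetical sketch of the Kinect-cane interaction loop: the user picks a
# target object class on the numeric keypad, the system scans a depth profile
# for that class, and only the requested result is reported (here as a string
# standing in for the tactile feedback device).

def detect_staircase(depth_column, step_mm=150, tol_mm=40, min_steps=3):
    """Toy detector: a staircase appears as repeated, roughly equal
    depth increments along a vertical scan line (depths in millimeters)."""
    steps = 0
    for near, far in zip(depth_column, depth_column[1:]):
        if abs((far - near) - step_mm) <= tol_mm:
            steps += 1
    return steps >= min_steps

# Hypothetical keypad mapping: each code triggers one class-specific search,
# mirroring the "minimum required information" design of the system.
DETECTORS = {
    "1": ("staircase", detect_staircase),
}

def search(keypad_code, depth_column):
    """Run only the detector the user asked for and report its result."""
    name, detector = DETECTORS[keypad_code]
    found = detector(depth_column)
    return f"{name}: {'found' if found else 'not found'}"
```

For example, `search("1", [1000, 1150, 1300, 1450])` reports a staircase, since the profile rises in three roughly 150 mm steps, while a flat profile does not trigger the detector.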


Figures 1–8.


Acknowledgments

This work was supported in part by the JSPS KAKENHI Grant Number 25560278.

Author information


Correspondence to Hotaka Takizawa.


About this article


Cite this article

Takizawa, H., Yamaguchi, S., Aoyagi, M. et al. Kinect cane: an assistive system for the visually impaired based on the concept of object recognition aid. Pers Ubiquit Comput 19, 955–965 (2015). https://doi.org/10.1007/s00779-015-0841-4



Keywords

Navigation