Abstract
This paper presents the results of a comparative study of user input with a camera-joystick and a manual joystick in a target acquisition task in which neither the targets nor the pointer could be perceived visually. The camera-joystick is an input technique in which each on-screen item is accessible from the center via a predefined vector of head motion. Absolute pointing was implemented with an acceleration factor of 1.7 and a moving average over the 5 most recently detected head positions. The underlying assumption was that, to provide robust input for blind users, the interaction technique has to be based on perceptually well-discriminated human movements, which compose a basic framework of an accessible virtual workspace demanding minimal external auxiliary cues. The target spots, 35 mm in diameter with 60 mm between the centers of adjacent spots, were arranged in a rectangular grid of 5 rows by 5 columns and were captured from a distance of 600 mm. The results show that camera input is a promising technique for non-visual human–computer interaction: the subjects performed more than twice as well in the target acquisition task with the camera-joystick as with the manual joystick. All participants reported that the camera-joystick was a robust and preferable input technique when visual information was not available. Blind interaction techniques could be improved significantly further by allowing user-dependent activation of the navigational cues to better coordinate feedback with exploratory behavior.
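The pointer mapping described above (absolute pointing, an acceleration factor of 1.7, and a moving average over the last 5 detected head positions) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the class and method names, the neutral-center assumption, and the exact form of the mapping are all assumptions introduced here.

```python
from collections import deque


class CameraJoystickPointer:
    """Illustrative sketch (not the paper's code) of the pointing scheme:
    absolute pointing with a constant acceleration factor applied to a
    moving average over the last few detected head positions."""

    def __init__(self, gain=1.7, window=5):
        self.gain = gain                     # acceleration factor (1.7 in the study)
        self.samples = deque(maxlen=window)  # last `window` head positions

    def update(self, head_x, head_y):
        # Smooth raw detections with a moving average to suppress jitter.
        self.samples.append((head_x, head_y))
        n = len(self.samples)
        sx = sum(p[0] for p in self.samples) / n
        sy = sum(p[1] for p in self.samples) / n
        # Absolute mapping: scale the smoothed head offset (measured from an
        # assumed neutral center pose) by the gain to obtain pointer coordinates.
        return self.gain * sx, self.gain * sy
```

For example, with the default gain of 1.7, a single detected head offset of (10, 0) would yield a pointer position of (17.0, 0.0); once the 5-sample window fills, transient detection noise is averaged out before the gain is applied.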
Acknowledgments
This work was supported by the Academy of Finland (grant 107278), and the project MICOLE funded by the European Commission, IST-2003-511592 STP.
Evreinova, T.V., Evreinov, G. & Raisamo, R. A camera-joystick for sound-augmented non-visual navigation and target acquisition: a case study. Univ Access Inf Soc 7, 129–144 (2008). https://doi.org/10.1007/s10209-007-0109-5