People with Motor and Mobility Impairment: Innovative Multimodal Interfaces to Wheelchairs

  • Conference paper
Computers Helping People with Special Needs (ICCHP 2006)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 4061)

Abstract

Standard interfaces offer limited accessibility. Multimodal user interfaces combine various input and output modalities (vision, audition, haptics/touch, gustation, olfaction, etc.) and are a classic research area in Human-Computer Interaction. One advantage of multiple modalities is increased flexibility and usability: the weaknesses of one modality are offset by the strengths of another. For example, on a mobile device with a small screen and keypad, a word may be quite difficult to read or type, yet very easy to say or hear. Such interfaces, in combination with mobile technologies, can have tremendous implications for accessibility and are therefore a potential benefit for people with a wide variety of impairments. Multimodal interfaces must be designed and developed to fit exactly the needs, requirements, abilities and knowledge levels of the targeted end users, and different contexts of use must also be considered. However, to achieve advances in both research and development of such interfaces, it is essential to bring researchers and practitioners from psychology and computer science together.
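To make the fallback idea concrete, the following is a minimal, hypothetical sketch (not taken from the paper) of how a wheelchair controller might arbitrate between modalities: each recogniser reports a command with a confidence value, and a weak channel (e.g., speech in a noisy room) simply loses to a stronger one, so the weakness of one modality is offset by the strength of another. All names here (`MultimodalArbiter`, `Command`, the confidence threshold) are illustrative assumptions, not part of the authors' system.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical illustration: each modality produces a command with a
# confidence score, and an arbiter keeps the most reliable one.

@dataclass
class Command:
    action: str          # e.g. "forward", "stop", "turn_left"
    confidence: float    # 0.0 .. 1.0, as estimated by the recogniser
    modality: str        # "speech", "joystick", "switch", ...

class MultimodalArbiter:
    def __init__(self, min_confidence: float = 0.6):
        self.min_confidence = min_confidence
        self.sources: list[Callable[[], Optional[Command]]] = []

    def add_source(self, poll: Callable[[], Optional[Command]]) -> None:
        """Register a modality; `poll` returns its current command or None."""
        self.sources.append(poll)

    def next_command(self) -> Optional[Command]:
        """Keep the most confident candidate above the threshold;
        a safety-critical 'stop' from any modality always wins."""
        candidates = [c for poll in self.sources if (c := poll()) is not None]
        stops = [c for c in candidates if c.action == "stop"]
        if stops:
            return max(stops, key=lambda c: c.confidence)
        usable = [c for c in candidates if c.confidence >= self.min_confidence]
        return max(usable, key=lambda c: c.confidence) if usable else None

# Example: a noisy speech result loses to a clear joystick deflection.
if __name__ == "__main__":
    arbiter = MultimodalArbiter()
    arbiter.add_source(lambda: Command("turn_left", 0.40, "speech"))   # noisy room
    arbiter.add_source(lambda: Command("forward", 0.95, "joystick"))   # clear input
    print(arbiter.next_command())  # -> Command(action='forward', ...)
```

In such a design, the arbitration threshold and the per-modality confidence estimates are exactly the kind of parameters that would need tuning to the abilities and contexts of use of individual end users.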

Introductory statement:

Today, together for better interfaces of tomorrow!

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Holzinger, A., Nischelwitzer, A.K. (2006). People with Motor and Mobility Impairment: Innovative Multimodal Interfaces to Wheelchairs. In: Miesenberger, K., Klaus, J., Zagler, W.L., Karshmer, A.I. (eds) Computers Helping People with Special Needs. ICCHP 2006. Lecture Notes in Computer Science, vol 4061. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11788713_144

  • DOI: https://doi.org/10.1007/11788713_144

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36020-9

  • Online ISBN: 978-3-540-36021-6

  • eBook Packages: Computer Science, Computer Science (R0)
