
Evaluation of Response Times on a Touch Screen Using Stereo Panned Speech Command Auditory Feedback

  • Conference paper
Speech and Computer (SPECOM 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9811)


Abstract

User interfaces for mobile and handheld devices usually incorporate touch screens. Fast user responses are generally not critical; however, some applications require fast and accurate reactions. Errors and response times depend on many factors, such as the user’s abilities, the type and latency of the feedback from the device, and the sizes of the buttons to press. We conducted an experiment with 17 subjects to test response time and accuracy to different kinds of speech-based auditory stimuli delivered over headphones. Speech signals were spatialized using stereo amplitude panning. Results show significantly better response times for 3 directions than for 5 and for the native language compared to English, as well as more accurate judgements when they were based on the meaning of the speech sounds rather than on their direction.
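
The abstract states that the speech commands were spatialized with stereo amplitude panning. As background, the sketch below shows a common constant-power (sine/cosine) panning law in Python with NumPy; the function name, the ±45° azimuth mapping, and the test tone are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of stereo amplitude panning using the constant-power law.
# All names and parameters here are illustrative; they do not reproduce the
# authors' implementation.
import numpy as np

def pan_stereo(mono, azimuth_deg, max_angle_deg=45.0):
    """Pan a mono signal to a stereo pair with the constant-power (sine/cosine) law.

    azimuth_deg: perceived direction, from -max_angle_deg (full left)
                 to +max_angle_deg (full right).
    Returns an array of shape (len(mono), 2): left and right channels.
    """
    # Map the azimuth to a pan position p in [0, 1]: 0 = left, 0.5 = centre, 1 = right.
    p = (np.clip(azimuth_deg, -max_angle_deg, max_angle_deg) / max_angle_deg + 1.0) / 2.0
    theta = p * np.pi / 2.0          # 0 .. pi/2
    gain_l = np.cos(theta)           # left-channel gain
    gain_r = np.sin(theta)           # right-channel gain; gain_l**2 + gain_r**2 == 1
    return np.stack([mono * gain_l, mono * gain_r], axis=-1)

if __name__ == "__main__":
    # Example: spatialize a 1 kHz test tone to three directions (left, centre, right).
    fs = 44100
    t = np.arange(fs) / fs
    tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
    for az in (-45.0, 0.0, 45.0):
        stereo = pan_stereo(tone, az)
        print(az, stereo.shape, float(stereo[:, 0].max()), float(stereo[:, 1].max()))
```

The constant-power law keeps the summed channel power constant across pan positions, so the perceived loudness of a panned speech command stays roughly the same regardless of its apparent direction.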



Acknowledgement

This project received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 643636 “Sound of Vision”.

Author information


Corresponding author

Correspondence to György Wersényi.



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Nagy, H., Wersényi, G. (2016). Evaluation of Response Times on a Touch Screen Using Stereo Panned Speech Command Auditory Feedback. In: Ronzhin, A., Potapova, R., Németh, G. (eds) Speech and Computer. SPECOM 2016. Lecture Notes in Computer Science (LNAI), vol. 9811. Springer, Cham. https://doi.org/10.1007/978-3-319-43958-7_33


  • DOI: https://doi.org/10.1007/978-3-319-43958-7_33


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-43957-0

  • Online ISBN: 978-3-319-43958-7

  • eBook Packages: Computer Science (R0)
