Multimodal Excitatory Interfaces with Automatic Content Classification

Chapter in The Engineering of Mixed Reality Systems

Part of the book series: Human-Computer Interaction Series (HCIS)

Abstract

We describe a non-visual interface for displaying data on mobile devices, based around active exploration: devices are shaken, revealing the contents rattling around inside. This combines sample-based contact sonification with event-playback vibrotactile feedback in a rich and compelling display that produces an illusion much like balls rattling inside a box. Motion is sensed by accelerometers, directly linking the motions of the user to the feedback they receive in a tightly closed loop. The resulting interface requires no visual attention and can be operated blindly with a single hand: it is reactive rather than disruptive. This interaction style is applied to the display of an SMS inbox. We use language models to extract salient features from text messages automatically. The output of this classification process controls the timbre and physical dynamics of the simulated objects. The interface gives a rapid semantic overview of the contents of an inbox, without compromising privacy or interrupting the user.
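The two pieces of the interaction described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the chapter uses PPM language models for classification, and here `zlib`'s LZ77 compressor stands in as a crude compression-based classifier; the corpora, timbre names, and all parameter values below are hypothetical.

```python
import zlib

# Hypothetical per-category training corpora; a real system would use
# much larger text collections and a proper PPM model.
CORPORA = {
    "personal": b"see you tonight x love you see you soon x miss you x",
    "business": b"meeting at 3pm please confirm invoice attached regards",
}

def classify(message: str) -> str:
    """Assign the category whose corpus the message compresses best against."""
    def extra_bytes(corpus: bytes) -> int:
        base = len(zlib.compress(corpus))
        return len(zlib.compress(corpus + b" " + message.encode())) - base
    return min(CORPORA, key=lambda c: extra_bytes(CORPORA[c]))

class Ball:
    """A simulated object 'rattling' inside a 1-D box."""
    def __init__(self, timbre: str, x: float = 0.5, v: float = 0.0):
        self.timbre, self.x, self.v = timbre, x, v

def step(balls, accel: float, dt: float = 0.01,
         box: float = 1.0, damping: float = 0.995):
    """Advance the simulation one tick; return (timbre, gain) impact events.

    `accel` is the accelerometer reading exciting the box; each wall
    collision becomes a discrete event whose magnitude would drive the
    playback gain of a contact sample and a vibrotactile pulse.
    """
    events = []
    for b in balls:
        b.v = damping * (b.v + accel * dt)   # excite, then damp
        b.x += b.v * dt
        if b.x < 0.0 or b.x > box:           # wall impact
            b.x = min(max(b.x, 0.0), box)
            b.v = -b.v
            events.append((b.timbre, abs(b.v)))
    return events

# Each incoming SMS becomes one ball whose timbre reflects its class:
inbox = ["love you, see you soon x", "please confirm the invoice"]
balls = [Ball({"personal": "wood", "business": "metal"}[classify(m)],
              x=0.2 + 0.1 * i) for i, m in enumerate(inbox)]
```

Shaking the device feeds `accel` into `step` at the simulation rate; the returned events would trigger granular sample playback and haptic pulses, so the user hears and feels one distinctly voiced "object" per message without ever looking at the screen.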

Acknowledgments

The authors are grateful for support from the IST Programme of the European Commission, under PASCAL Network of Excellence, IST 2002-506778; the IRCSET BRG project BRG SC/2003/271 Continuous Gestural Interaction with Mobile devices; HEA project Body Space; EPSRC project EP/E042740/1 and SFI grant 00/PI.1/C067. This publication reflects only the views of the authors. Audio examples and a video are available online at http://www.dcs.gla.ac.uk/~jhw/shoogle/

Author information

Correspondence to John Williamson.

Editor information

Editors and Affiliations

Rights and permissions

Reprints and permissions

Copyright information

© 2010 Springer-Verlag London

About this chapter

Cite this chapter

Williamson, J., Murray-Smith, R. (2010). Multimodal Excitatory Interfaces with Automatic Content Classification. In: Dubois, E., Gray, P., Nigay, L. (eds) The Engineering of Mixed Reality Systems. Human-Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-84882-733-2_12

  • DOI: https://doi.org/10.1007/978-1-84882-733-2_12
  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84882-732-5

  • Online ISBN: 978-1-84882-733-2
