Multi-Device Applications Using the Multimodal Architecture

Multimodal Interaction with W3C Standards

Abstract

Nowadays, users have access to a multitude of devices at home, at work, or carried with them. Given its features (e.g., interaction modalities, screen size), each of these devices might be more suitable for particular users, tasks, and contexts. While having one application installed on several devices is common, these installations mostly work in isolation, not exploiting the possibility of several devices working together to provide a more versatile and richer interaction scenario. We argue that adopting a multimodal interaction (MMI) architecture based on the W3C recommendations, beyond its advantages for the design and development of MMI, provides an elegant approach to multi-device interaction scenarios. In this regard, this chapter conveys our views and research outcomes on this subject, presenting concrete application examples.


Notes

  1. http://www.aal4all.org/

  2. http://www.paelife.eu/

  3. http://www.smartphones4seniors.org/

  4. http://iris-interaction.eu

  5. http://commons.apache.org/proper/commons-scxml/


Acknowledgements

The work presented in this chapter has been partially funded by IEETA Research Unit funding (Incentivo/EEI/UI0127/2014), Marie Curie IAPP project IRIS (ref. 610986, FP7-PEOPLE-2013-IAPP), project PaeLife (AAL-08-1-2001-0001), and QREN projects Smart Phones for Seniors (S4S), AAL4ALL and EMIF—European Medical Information Framework (EU FP7), co-funded by COMPETE and FEDER.

The authors thank all contributors to the W3C MMI recommendations for their insightful and inspiring approaches to MMI.

Author information


Correspondence to Samuel Silva.

Copyright information

© 2017 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Almeida, N., Silva, S., Teixeira, A., Vieira, D. (2017). Multi-Device Applications Using the Multimodal Architecture. In: Dahl, D. (eds) Multimodal Interaction with W3C Standards. Springer, Cham. https://doi.org/10.1007/978-3-319-42816-1_17

  • DOI: https://doi.org/10.1007/978-3-319-42816-1_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42814-7

  • Online ISBN: 978-3-319-42816-1

  • eBook Packages: Engineering, Engineering (R0)
