
Automated Tools for Usability Evaluation: A Systematic Mapping Study

  • Conference paper
Social Computing and Social Media: Design, User Experience and Impact (HCII 2022)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13315))

Abstract

Usability is one of the most critical indicators of the quality of a software product. It refers to the extent to which users can use a software system to achieve specific goals with effectiveness, efficiency, and satisfaction. A usability evaluation is necessary to ensure that a software system is usable, but it has certain drawbacks (e.g., the time and budget required to carry it out). Although these drawbacks can be daunting despite the benefits such evaluations provide, some tools can automate and support usability testing. We conducted a systematic mapping study to identify the tools that support automated usability evaluation, and we identified a total of 15 primary studies. We classified the tools into four categories: measuring usability, supporting usability evaluation, detecting usability problems, and correcting usability problems. We found that the automated usability evaluation of web platforms and mobile devices attracts the most research interest.



Acknowledgment

This work was supported by the Chilean Ministry of Education and the University of Atacama (ATA1899 project).

Author information

Correspondence to John W. Castro.

Appendix A: Primary Studies

This appendix lists the references of the primary studies used for the mapping study described in this paper.

[PS1] Gonçalves, L. F., Vasconcelos, L. G., Munson, E. V., Baldochi, L. A.: Supporting adaptation of web applications to the mobile environment with automated usability evaluation. In: 31st Annual ACM Symposium on Applied Computing (SAC’16), ACM, Pisa, Italy, pp. 787–794 (2016). https://doi.org/10.1145/2851613.2851863.

[PS2] Assila, A., de Oliveira, K. M., Ezzedine, H.: An environment for integrating subjective and objective usability findings based on measures. In: 2016 IEEE Tenth International Conference on Research Challenges in Information Science (RCIS’16), IEEE, Grenoble, France, pp. 1–12 (2016). https://doi.org/10.1109/RCIS.2016.7549320.

[PS3] Grigera, J., Garrido, A., Rivero, J. M., Rossi, G.: Automatic detection of usability smells in web applications. International Journal of Human-Computer Studies 97, 129–148 (2017). https://doi.org/10.1016/j.ijhcs.2016.09.009.

[PS4] Paternò, F., Schiavone, A. G., Conti, A.: Customizable automatic detection of bad usability smells in mobile accessed web applications. In: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI’17), ACM, Vienna, Austria, article 42, pp. 1–11 (2017). https://doi.org/10.1145/3098279.3098558.

[PS5] Grigera, J., Garrido, A., Rossi, G.: Kobold: web usability as a service. In: 2017 32nd IEEE/ACM International Conference on Automated Software Engineering (ASE’17), Urbana, IL, USA, pp. 990–995 (2017). https://doi.org/10.1109/ASE.2017.8115717.

[PS6] Soui, M., Chouchane, M., Gasmi, I., Mkaouer, M. W.: PLAIN: PLugin for predicting the usability of mobile user interface. In: 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP’17) - Vol. 1: GRAPP, Porto, Portugal, pp. 127–136 (2017). https://doi.org/10.5220/0006171201270136.

[PS7] Desolda, G., Gaudino, G., Lanzilotti, R., Federici, S., Cocco, A.: UTAssistant: A web platform supporting usability testing in Italian public administrations. In: 12th Biannual Conference of the Italian SIGCHI Chapter (CHItaly’17), Cagliari, Italy, pp. 138–142 (2017).

[PS8] Federici, S., Mele, M. L., Lanzilotti, R., Desolda, G., Bracalenti, M., Meloni, F., Gaudino, G., Cocco, A., Amendola, M.: UX evaluation design of UTAssistant: A new usability testing support tool for Italian public administrations. In: Kurosu, M. (ed.) Human-Computer Interaction. Theories, Methods, and Human Issues. HCI 2018, pp. 55–67. Lecture Notes in Computer Science, vol. 10901. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-91238-7_5.

[PS9] Federici, S., Mele, M. L., Bracalenti, M., Buttafuoco, A., Lanzilotti, R., Desolda, G.: Bio-behavioral and self-report user experience evaluation of a usability assessment platform (UTAssistant). In: 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP’19) - Vol. 2: HUCAPP, Prague, CZ, pp. 19–27 (2019).

[PS10] Marenkov, J., Robal, T., Kalja, A.: Guideliner: A tool to improve web UI development for better usability. In: 8th International Conference on Web Intelligence, Mining and Semantics (WIMS’18), ACM, Novi Sad, Serbia, article 17, pp. 1–9 (2018). https://doi.org/10.1145/3227609.3227667.

[PS11] Chettaoui, N., Bouhlel, M. S.: I2Evaluator: An aesthetic metric-tool for evaluating the usability of adaptive user interfaces. In: Hassanien, A. E., Shaalan, K., Gaber, T., Tolba, M. F. (eds.) Proceedings of the International Conference on Advanced Intelligent Systems and Informatics. AISI 2017, pp. 374–383. Advances in Intelligent Systems and Computing, vol. 639. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-64861-3_35.

[PS12] Barra, S., Francese, R., Risi, M.: Automating mockup-based usability testing on the mobile device. In: Miani, R., Camargos, L., Zarpelão, B., Rosas, E., Pasquini, R. (eds.) Green, Pervasive, and Cloud Computing. GPC 2019, pp. 128–143. Lecture Notes in Computer Science, vol. 11484. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-19223-5_10.

[PS13] Liu, Z., Chen, C., Wang, J., Huang, Y., Hu, J., Wang, Q.: Owl Eyes: Spotting UI display issues via visual understanding. In: 35th IEEE/ACM International Conference on Automated Software Engineering (ASE’20), ACM, Virtual Event, Australia, pp. 398–409 (2020). https://doi.org/10.1145/3324884.3416547.

[PS14] Bacíková, M., Porubän, J., Sulír, M., Chodarev, S., Steingartner, W., Madeja, M.: Domain usability evaluation. Electronics 10(16), 1–28, article 1963, (2021). https://doi.org/10.3390/electronics10161963.

[PS15] Al-Sakran, H. O., Alsudairi, M. A.: Usability and accessibility assessment of Saudi Arabia mobile e-government websites. IEEE Access 9, 48254–48275 (2021). https://doi.org/10.1109/ACCESS.2021.3068917.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Castro, J.W., Garnica, I., Rojas, L.A. (2022). Automated Tools for Usability Evaluation: A Systematic Mapping Study. In: Meiselwitz, G. (ed.) Social Computing and Social Media: Design, User Experience and Impact. HCII 2022. Lecture Notes in Computer Science, vol. 13315. Springer, Cham. https://doi.org/10.1007/978-3-031-05061-9_3

  • DOI: https://doi.org/10.1007/978-3-031-05061-9_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05060-2

  • Online ISBN: 978-3-031-05061-9

  • eBook Packages: Computer Science (R0)
