An Introduction and Guide to Evaluation of Visualization Techniques Through User Studies

Chapter in: Handbook of Human Centric Visualization

Abstract

The objective of this chapter is to increase awareness of what constitutes a sound scientific approach to evaluation through user studies in visualization, and to provide basic knowledge of current research practice relating to usability and evaluation. It covers the most fundamental and relevant issues to consider during the phases of an evaluation: planning, design, execution, analysis of results, and reporting. It outlines how to achieve high-quality results, points out common mistakes made during the different phases, and explains how they can be avoided. The chapter can be used as a guide when planning an evaluation study. Since the same guidelines apply when reviewing such work, the reader will also learn to better judge the relevance and quality of a publication presenting an evaluation.



Author information

Correspondence to Camilla Forsell.


Copyright information

© 2014 Springer Science+Business Media New York

Cite this chapter

Forsell, C., Cooper, M. (2014). An Introduction and Guide to Evaluation of Visualization Techniques Through User Studies. In: Huang, W. (eds) Handbook of Human Centric Visualization. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-7485-2_11

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4614-7484-5

  • Online ISBN: 978-1-4614-7485-2

  • eBook Packages: Computer Science, Computer Science (R0)
