Abstract
The objective of this chapter is to increase awareness of what constitutes a sound scientific approach to evaluation through user studies in visualization, and to provide basic knowledge of current research practice relating to usability and evaluation. The content covers the most fundamental and relevant issues to consider during each phase of an evaluation: planning, design, execution, analysis of results, and reporting. It outlines how to achieve high-quality results and points out common mistakes in each phase and how to avoid them. The chapter can be used as a guide when planning an evaluation study. Since the same guidelines apply when reviewing such work, the reader will also learn to better judge the relevance and quality of a publication presenting an evaluation.
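The analysis phase mentioned above typically reduces to a significance test paired with an effect-size estimate. As a minimal sketch, not prescribed by the chapter itself and using entirely invented completion-time data, a within-subjects comparison of two visualization techniques might report a paired-samples t statistic together with Cohen's d:

```python
from statistics import mean, stdev

# Hypothetical task-completion times (seconds) for 10 participants,
# each of whom used both techniques (within-subjects design).
# All values are invented purely for illustration.
times_a = [42.1, 39.5, 47.0, 44.2, 41.8, 45.6, 40.3, 43.9, 46.1, 38.7]
times_b = [36.4, 35.0, 41.2, 39.8, 34.9, 40.1, 35.7, 38.2, 39.5, 33.8]

# Paired analysis works on the per-participant differences.
diffs = [a - b for a, b in zip(times_a, times_b)]
n = len(diffs)
d_mean, d_sd = mean(diffs), stdev(diffs)

t = d_mean / (d_sd / n ** 0.5)   # paired-samples t statistic, df = n - 1
cohens_d = d_mean / d_sd         # standardized effect size for paired data

print(f"t({n - 1}) = {t:.2f}, Cohen's d = {cohens_d:.2f}")
```

The t value would then be compared against the critical value for n - 1 degrees of freedom (or a p-value computed with a statistics package); reporting the effect size alongside it is what lets readers judge practical, not just statistical, significance.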
Copyright information
© 2014 Springer Science+Business Media New York
About this chapter
Cite this chapter
Forsell, C., Cooper, M. (2014). An Introduction and Guide to Evaluation of Visualization Techniques Through User Studies. In: Huang, W. (eds) Handbook of Human Centric Visualization. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-7485-2_11
Print ISBN: 978-1-4614-7484-5
Online ISBN: 978-1-4614-7485-2