
Integrating Manual and Automatic Evaluations to Measure Accessibility Barriers

  • Conference paper
Computers Helping People with Special Needs (ICCHP 2012)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7382)


Abstract

Explicit syntax and implicit semantics of Web coding are typically treated as distinct domains when defining metrics for content accessibility. A more realistic picture of barriers and their impact on users with disabilities could be obtained if any quantitative synthesis of the number and size of barriers integrated measurements from automatic checks with human assessments. In this work, we present a metric that evaluates accessibility as a single measure of both syntactic correctness and semantic consistency, according to some general assumptions about the relationships and dependencies between them. WCAG 2.0 guidelines are used to define the boundaries of each single barrier evaluation, both from a syntactic point of view and from a subjective/human one. To assess our metric, data gathered from a large-scale accessibility monitor have been used.
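The abstract describes blending automatic checks and human assessments into a single accessibility measure. The paper's actual formula is not reproduced on this page; the sketch below is a minimal hypothetical illustration, assuming that each WCAG 2.0 success criterion yields an automatic failure rate and a normalized human severity rating, which are combined with an assumed weighting parameter `alpha`:

```python
# Hypothetical sketch: combine automatic and manual evaluations into one
# accessibility score. The structures and weights are illustrative
# assumptions, not the metric defined in the paper.

def barrier_score(auto_failures, auto_checks, human_severity, alpha=0.5):
    """Blend an automatic failure rate with a normalized human severity
    (0 = no barrier, 1 = blocking) for one WCAG 2.0 success criterion."""
    auto_rate = auto_failures / auto_checks if auto_checks else 0.0
    return alpha * auto_rate + (1 - alpha) * human_severity

def page_accessibility(barriers, alpha=0.5):
    """Aggregate per-criterion barrier scores into one page-level measure:
    1.0 means no barriers detected, 0.0 means maximally inaccessible."""
    if not barriers:
        return 1.0
    scores = [barrier_score(f, c, s, alpha) for (f, c, s) in barriers]
    return 1.0 - sum(scores) / len(scores)

# Example: two success criteria evaluated on one page, each given as
# (automatic failures, automatic checks, human severity in [0, 1]).
barriers = [(2, 10, 0.4), (0, 5, 0.1)]
print(round(page_accessibility(barriers), 3))  # → 0.825
```

The linear blend is only one plausible way to integrate the two evaluation channels; the paper instead derives its combination from stated assumptions about the relationships and dependencies between syntactic and semantic barriers.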






Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Salomoni, P., Mirri, S., Muratori, L.A., Battistelli, M. (2012). Integrating Manual and Automatic Evaluations to Measure Accessibility Barriers. In: Miesenberger, K., Karshmer, A., Penaz, P., Zagler, W. (eds) Computers Helping People with Special Needs. ICCHP 2012. Lecture Notes in Computer Science, vol 7382. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31522-0_59

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-31522-0_59

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31521-3

  • Online ISBN: 978-3-642-31522-0

  • eBook Packages: Computer Science, Computer Science (R0)
