How Many Answers Are Enough? Optimal Number of Answers for Q&A Sites

  • Conference paper
Social Informatics (SocInfo 2012)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7710)

Abstract

With the proliferation of the social web, questions about information quality and optimization attract the attention of IS scholars. Question-answering (QA) sites, such as Yahoo! Answers, have the potential to produce good answers, but not all answers are good and not all QA sites are alike. When organizations design and plan the integration of question-answering services on their sites, identifying good answers and optimizing the process become critical. Arguing that ‘given enough answers, all questions are answered successfully,’ this paper identifies the optimal number of posts needed to generate a high-quality answer. Based on a content analysis of informational questions (n=174) and their answers (n=1,023) on Yahoo! Answers, the study found that seven answers per question are ‘enough’ to produce a good answer.
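The paper's threshold of seven answers is an empirical finding from content analysis, not a derivation. Still, a simple probabilistic sketch shows why a small number of answers can be ‘enough’: if each answer is independently good with some probability p, the chance that at least one good answer appears grows quickly with the number of posts. The per-answer quality rate (35%) and the independence assumption below are illustrative assumptions, not figures from the paper.

```python
import math

def answers_needed(p: float, target: float = 0.95) -> int:
    """Smallest n such that 1 - (1 - p)**n >= target, for 0 < p < 1.

    Models each answer as independently 'good' with probability p and
    asks how many answers make at least one good answer likely.
    """
    return math.ceil(math.log(1 - target) / math.log(1 - p))

# Under an assumed (hypothetical) per-answer quality rate of 35%,
# seven answers already push the chance of a good answer above 95%:
print(answers_needed(0.35))  # 7
```

This toy model is only a plausibility check: real answers on a QA site are not independent draws, which is precisely why the paper measures the threshold empirically rather than assuming it.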



Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Fichman, P. (2012). How Many Answers Are Enough? Optimal Number of Answers for Q&A Sites. In: Aberer, K., Flache, A., Jager, W., Liu, L., Tang, J., Guéret, C. (eds) Social Informatics. SocInfo 2012. Lecture Notes in Computer Science, vol 7710. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35386-4_20

  • DOI: https://doi.org/10.1007/978-3-642-35386-4_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-35385-7

  • Online ISBN: 978-3-642-35386-4

  • eBook Packages: Computer Science (R0)
