
An Algorithm of Crowdsourcing Answer Integration Based on Specialty Categories of Workers

  • Conference paper
  • First Online:
Proceedings of the Fifth Euro-China Conference on Intelligent Data Analysis and Applications (ECC 2018)

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 891))


Abstract

The effective integration of crowdsourced answers has become a research hotspot in crowdsourcing quality control. Taking into account the influence of workers' specialty categories on the accuracy of crowdsourced answers, a crowdsourced answer integration algorithm based on the specialty categories of workers (SCAI) is proposed. First, SCAI uses the crowdsourced answer set to determine the difficulty of each task. Second, it calculates the accuracy of each crowdsourced answer, then obtains each worker's specialty category and updates that worker's specialty accuracy. Experiments were conducted on real data sets, comparing SCAI with the classical majority voting method (MV) and the expectation-maximization evaluation algorithm (EM). The results show that the proposed algorithm can effectively improve the accuracy of crowdsourced answers.
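The aggregation idea summarized above can be sketched in a few lines: weight each worker's answer by that worker's historical accuracy in the task's specialty category, then update the per-category accuracy once an integrated answer is fixed. This is a minimal illustration only, not the authors' SCAI implementation; the function names, the 0.5 default accuracy for unseen worker-category pairs, and the running-average update rule are assumptions made for the sketch (task-difficulty estimation is omitted).

```python
from collections import defaultdict

def majority_vote(answers):
    """Classic MV baseline: pick the most frequent answer for a task.
    `answers` is a list of (worker_id, answer) pairs."""
    counts = defaultdict(int)
    for _, answer in answers:
        counts[answer] += 1
    return max(counts, key=counts.get)

def specialty_weighted_vote(answers, specialty_accuracy, task_category):
    """Weight each worker's answer by that worker's accuracy in the
    task's specialty category, defaulting to 0.5 for unseen pairs."""
    scores = defaultdict(float)
    for worker, answer in answers:
        scores[answer] += specialty_accuracy.get((worker, task_category), 0.5)
    return max(scores, key=scores.get)

def update_specialty_accuracy(specialty_accuracy, counts, worker, category, correct):
    """After the integrated (estimated true) answer is chosen, fold the
    new observation into the worker's per-category running accuracy."""
    n = counts.get((worker, category), 0)
    acc = specialty_accuracy.get((worker, category), 0.5)
    specialty_accuracy[(worker, category)] = (acc * n + (1.0 if correct else 0.0)) / (n + 1)
    counts[(worker, category)] = n + 1
```

With three answers for one "math" task, MV follows the raw count, while the specialty-weighted vote can side with a single highly accurate specialist over two low-accuracy workers, which is the effect the abstract attributes to exploiting specialty categories.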



Acknowledgements

This research is supported by the National Natural Science Foundation of China (61373116), the Science and Technology Project of Shaanxi Province of China (Program No. 2016KTZDGY04-01), the International Science and Technology Cooperation Program of the Science and Technology Department of Shaanxi Province of China (Grant No. 2018KW-049), and the Special Scientific Research Program of the Education Department of Shaanxi Province of China (Grant No. 17JK0711).

Author information

Corresponding author
Correspondence to Hong Xia.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Chen, Y., Wang, H., Xia, H., Gao, C., Wang, Z. (2019). An Algorithm of Crowdsourcing Answer Integration Based on Specialty Categories of Workers. In: Krömer, P., Zhang, H., Liang, Y., Pan, JS. (eds) Proceedings of the Fifth Euro-China Conference on Intelligent Data Analysis and Applications. ECC 2018. Advances in Intelligent Systems and Computing, vol 891. Springer, Cham. https://doi.org/10.1007/978-3-030-03766-6_4
