
Improving the Weighted Distribution Estimation for AdaBoost Using a Novel Concurrent Approach

  • Conference paper
  • First Online:
Intelligent Distributed Computing IX

Part of the book series: Studies in Computational Intelligence (SCI, volume 616)

Abstract

AdaBoost is one of the best-known ensemble approaches in the machine learning literature. Several AdaBoost variants that use parallel processing to speed up computation on large datasets have recently been proposed. These approaches try to approximate the classic AdaBoost and thus sacrifice some of its generalization ability. In this work, we instead use concurrent computing to improve the distribution-weight estimation, thereby improving generalization. In each round we train several weak hypotheses in parallel, and we use their weighted ensemble to update the distribution weights for the following boosting rounds. Our results show that in most cases the performance of AdaBoost is improved and that the algorithm converges rapidly. We validate our proposal on four well-known real data sets.
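
The abstract describes the core loop only at a high level: in each boosting round, several weak hypotheses are trained concurrently, and their weighted ensemble drives the update of the distribution weights for the next round. The sketch below is a minimal, hypothetical rendering of that idea in Python, not the authors' exact algorithm: it assumes k = 4 decision stumps per round trained in parallel on weighted resamples (using scikit-learn and joblib, which the paper does not mention), an error-weighted vote to combine them, and the classic AdaBoost exponential re-weighting driven by that combined vote. The names fit_weak and concurrent_adaboost are illustrative.

```python
import numpy as np
from joblib import Parallel, delayed            # assumption: joblib as the parallel backend
from sklearn.tree import DecisionTreeClassifier # assumption: decision stumps as weak learners

def fit_weak(X, y, w, seed):
    """Train one weak hypothesis on a bootstrap resample drawn from
    the current distribution w (resampling, not reweighting)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(y), size=len(y), p=w)
    return DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])

def concurrent_adaboost(X, y, rounds=10, k=4):
    """Hypothetical concurrent AdaBoost; y must take values in {-1, +1}.
    Returns the per-round weak ensembles and their round weights."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # distribution weights D_t
    ensembles, alphas = [], []
    for t in range(rounds):
        # Train k weak hypotheses concurrently on the current distribution.
        weak = Parallel(n_jobs=k)(
            delayed(fit_weak)(X, y, w, seed=t * k + j) for j in range(k)
        )
        # Combine them by an error-weighted vote (an assumption: the paper
        # only says a weighted ensemble drives the update).
        errs = np.array([np.sum(w * (h.predict(X) != y)) for h in weak])
        errs = np.clip(errs, 1e-10, 1 - 1e-10)
        betas = 0.5 * np.log((1 - errs) / errs)
        vote = np.sign(sum(b * h.predict(X) for b, h in zip(betas, weak)))
        vote[vote == 0] = 1
        # Classic AdaBoost exponential update, driven by the ensemble's
        # combined vote instead of a single weak hypothesis.
        eps = np.clip(np.sum(w * (vote != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)
        w = w * np.exp(-alpha * y * vote)
        w /= w.sum()
        ensembles.append((weak, betas))
        alphas.append(alpha)
    return ensembles, alphas
```

Under these assumptions, the final strong classifier would take the sign of the alpha-weighted sum of the per-round combined votes, exactly as in classic AdaBoost; only the source of each round's hypothesis and the resulting weight update differ.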

Acknowledgments

This work was supported by the following research grants: Fondecyt 1110854 and DGIP-UTFSM. The work of C. Moraga was partially supported by the Foundation for the Advancement of Soft Computing, Mieres, Spain.

Author information

Correspondence to Héctor Allende-Cid.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Allende-Cid, H., Valle, C., Moraga, C., Allende, H., Salas, R. (2016). Improving the Weighted Distribution Estimation for AdaBoost Using a Novel Concurrent Approach. In: Novais, P., Camacho, D., Analide, C., El Fallah Seghrouchni, A., Badica, C. (eds) Intelligent Distributed Computing IX. Studies in Computational Intelligence, vol 616. Springer, Cham. https://doi.org/10.1007/978-3-319-25017-5_21

  • DOI: https://doi.org/10.1007/978-3-319-25017-5_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25015-1

  • Online ISBN: 978-3-319-25017-5

  • eBook Packages: Engineering, Engineering (R0)
