Bagging

  • Reference work entry
  • Encyclopedia of Machine Learning and Data Mining

Bagging is an ensemble learning technique whose name is an acronym derived from Bootstrap AGGregatING. Each member of the ensemble is trained on a different dataset, obtained as a bootstrap sample (sampling with replacement) from the original training set. The members' predictions are combined by a uniform average or majority vote. Bagging works best with unstable learners, that is, those whose generalization patterns change substantially under small changes to the training data; it therefore tends not to help stable learners such as linear models. See ensemble learning for more details.
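The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the base learner here is a deliberately simple (and unstable) 1-nearest-neighbour classifier on 1-D data, and the function names, ensemble size, and toy dataset are all choices made for this sketch.

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    # Draw len(data) points with replacement from the original training set.
    return [rng.choice(data) for _ in data]

class OneNN:
    # A simple, unstable base learner: 1-nearest neighbour on 1-D inputs.
    def fit(self, data):
        self.data = data  # list of (x, label) pairs
        return self

    def predict(self, x):
        nearest = min(self.data, key=lambda p: abs(p[0] - x))
        return nearest[1]

def bagging_fit(data, n_models=25, seed=0):
    # Train each ensemble member on its own bootstrap sample.
    rng = random.Random(seed)
    return [OneNN().fit(bootstrap_sample(data, rng)) for _ in range(n_models)]

def bagging_predict(models, x):
    # Combine member predictions by a uniform majority vote.
    votes = Counter(m.predict(x) for m in models)
    return votes.most_common(1)[0][0]

# Toy 1-D dataset: points below 0.5 are class 0, points above are class 1.
train = [(0.1, 0), (0.2, 0), (0.3, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
models = bagging_fit(train)
print(bagging_predict(models, 0.25))  # class 0
print(bagging_predict(models, 0.75))  # class 1
```

Note that an individual bootstrap sample may omit some training points entirely (each point is left out with probability roughly 1/e), so individual members can err; the uniform vote across members is what stabilizes the ensemble.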


Copyright information

© 2017 Springer Science+Business Media New York

Cite this entry

(2017). Bagging. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_925
