
Sparse Boosting with Correlation Based Penalty

  • Conference paper
Advanced Data Mining and Applications (ADMA 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7713)


Abstract

In the high-dimensional setting, the componentwise L2Boosting method has been used to construct sparse models with high prediction accuracy, but it tends to select many ineffective variables. Several sparse boosting methods, such as SparseL2Boosting and Twin Boosting, have been proposed to improve the variable selection of the L2Boosting algorithm. In this paper, we propose a new general sparse boosting method (GSBoosting). We establish relations between GSBoosting and other well-known regularized variable selection methods in the orthogonal linear model, such as the adaptive Lasso and hard thresholding. Simulation results show that GSBoosting performs well in both prediction and variable selection.
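To make the abstract concrete, here is a minimal sketch of the baseline the paper builds on: componentwise L2Boosting for a linear model, which at each step selects the single predictor that most reduces the residual sum of squares and moves its coefficient by a shrunken step. This is a generic illustration of the baseline algorithm, not the paper's GSBoosting or its correlation-based penalty; the step size `nu` and step count are illustrative choices.

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=100, nu=0.1):
    """Componentwise L2Boosting for a linear model.

    Each step fits every column of X to the current residual by simple
    least squares, picks the column giving the largest RSS reduction,
    and updates its coefficient by the shrunken amount nu * coef.
    """
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()
    col_sq = np.sum(X ** 2, axis=0)  # squared column norms
    for _ in range(n_steps):
        coefs = X.T @ resid / col_sq          # per-column LS coefficient on the residual
        gains = coefs ** 2 * col_sq           # RSS reduction each column would achieve
        j = int(np.argmax(gains))             # greedy componentwise selection
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return beta

# Toy data: only the first 2 of 10 predictors are effective.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
beta = componentwise_l2_boost(X, y, n_steps=300, nu=0.1)
```

On this toy problem the fitted `beta` concentrates on the two effective predictors; the small nonzero weights that can accumulate on the noise columns illustrate the tendency to pick up ineffective variables that the sparse boosting variants above are designed to suppress.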



References

  1. Breiman, L.: Arcing classifiers (with discussion). Ann. Statist. 26, 801–849 (1998)

  2. Breiman, L.: Prediction games and arcing algorithms. Neural Computation 11, 1493–1517 (1999)

  3. Breiman, L., Friedman, J.: Estimating optimal transformations for multiple regression and correlation. J. Am. Statist. Ass. 80, 580–598 (1985)

  4. Bühlmann, P.: Boosting for high-dimensional linear models. Ann. Statist. 34, 559–583 (2006)

  5. Bühlmann, P., Hothorn, T.: Twin Boosting: improved feature selection and prediction. Statistics and Computing 20(2), 119–138 (2010)

  6. Bühlmann, P., Hothorn, T.: Boosting algorithms: regularization, prediction and model fitting. Statistical Science 22(4), 477–505 (2007)

  7. Bühlmann, P., Yu, B.: Boosting with the L2 loss: regression and classification. J. Amer. Statist. Assoc. 98, 324–339 (2003)

  8. Bühlmann, P., Yu, B.: Sparse boosting. J. Machine Learning Research 7, 1001–1024 (2006)

  9. Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting (with discussion). Ann. Statist. 28, 337–407 (2000)

  10. Friedman, J.: Greedy function approximation: a gradient boosting machine. Ann. Statist. 29, 1189–1232 (2001)

  11. Frank, I.E., Friedman, J.H.: A statistical view of some chemometrics regression tools. Technometrics 35, 109–148 (1993)

  12. Hurvich, C.M., Tsai, C.L.: Regression and time series model selection in small samples. Biometrika 76, 297–307 (1989)

  13. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Statist. Soc. B 58, 267–288 (1996)

  14. Zou, H.: The adaptive lasso and its oracle properties. J. Am. Statist. Ass. 101, 1418–1429 (2006)



Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhao, J. (2012). Sparse Boosting with Correlation Based Penalty. In: Zhou, S., Zhang, S., Karypis, G. (eds) Advanced Data Mining and Applications. ADMA 2012. Lecture Notes in Computer Science, vol 7713. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35527-1_14


  • DOI: https://doi.org/10.1007/978-3-642-35527-1_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-35526-4

  • Online ISBN: 978-3-642-35527-1

  • eBook Packages: Computer Science (R0)
