
L1 LASSO Modeling and Its Bayesian Inference

  • Conference paper
AI 2008: Advances in Artificial Intelligence (AI 2008)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5360)


Abstract

A new iterative procedure for solving regression problems with the so-called LASSO penalty [1] is proposed by using generative Bayesian modeling and inference. The algorithm produces the anticipated parsimonious or sparse regression models that generalize well on unseen data. The proposed algorithm is quite robust and there is no need to specify any model hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS) is given.
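The paper's full text is not reproduced here, but the kind of procedure the abstract describes can be illustrated through the standard scale-mixture-of-Gaussians view of the Laplace (L1) prior, under which the LASSO solution is reached by iteratively re-solving a ridge system with per-coefficient weights. The following is a generic sketch of that idea, not the authors' exact algorithm; the function name `bayesian_lasso_em`, the penalty parameter `lam`, and the small floor `1e-12` on the weights are illustrative assumptions.

```python
import numpy as np

def bayesian_lasso_em(X, y, lam=1.0, n_iter=100, tol=1e-8):
    """EM-style iteratively reweighted ridge approximation to the
    LASSO solution, via the scale-mixture-of-Gaussians form of the
    Laplace prior (a generic sketch, not the paper's algorithm)."""
    n, d = X.shape
    # Initialize with the ordinary least-squares solution
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        # E-step surrogate: per-coefficient ridge weights 1/|w_j|,
        # floored to avoid division by zero as coefficients shrink
        s = np.abs(w) + 1e-12
        # M-step: solve the reweighted ridge system
        A = X.T @ X + lam * np.diag(1.0 / s)
        w_new = np.linalg.solve(A, X.T @ y)
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    return w
```

At a fixed point the weighted quadratic penalty matches the gradient of an L1 penalty, so coefficients that are irrelevant to the fit are driven toward zero, producing the sparse models the abstract refers to.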



References

  1. Tibshirani, R.: Regression shrinkage and selection via the LASSO. J. Royal Statist. Soc. B 58, 267–288 (1996)
  2. Meinshausen, N.: Relaxed LASSO. Computational Statistics & Data Analysis 52(1), 374–393 (2007)
  3. Figueiredo, M., Nowak, R., Wright, S.: Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems. IEEE Journal of Selected Topics in Signal Processing 1(4), 586–597 (2007)
  4. Hesterberg, T., Choi, N., Meier, L., Fraley, C.: Least angle and L1 regression: A review. Statistics Surveys 2, 61–93 (2008)
  5. Efron, B., Hastie, T., Johnstone, I., Tibshirani, R.: Least angle regression. Annals of Statistics 32, 407–451 (2004)
  6. Vapnik, V.: Statistical Learning Theory. Wiley, New York (1998)
  7. Schölkopf, B., Smola, A.: Learning with Kernels. The MIT Press, Cambridge (2002)
  8. Chen, S.: Local regularization assisted orthogonal least squares regression. Neurocomputing 69, 559–585 (2006)
  9. Drezet, P., Harrison, R.: Support vector machines for system identification. In: Proceedings of UKACC Int. Conf. Control 1998, Swansea, U.K., pp. 688–692 (1998)
  10. Tipping, M.: Sparse Bayesian learning and the relevance vector machine. J. Machine Learning Research 1, 211–244 (2001)
  11. Chen, S., Billings, S., Luo, W.: Orthogonal least squares methods and their application to non-linear system identification. International Journal of Control 50(5), 1873–1896 (1989)
  12. Kruif, B., Vries, T.: Support-vector-based least squares for learning non-linear dynamics. In: Proceedings of 41st IEEE Conference on Decision and Control, Las Vegas, USA, pp. 10–13 (2002)
  13. Gestel, T., Espinoza, M., Suykens, J., Brasseur, C., deMoor, B.: Bayesian input selection for nonlinear regression with LS-SVMs. In: Proceedings of 13th IFAC Symposium on System Identification, Rotterdam, The Netherlands, pp. 27–29 (2003)
  14. Valyon, J., Horváth, G.: A generalized LS-SVM. In: Principe, J., Gile, L., Morgan, N., Wilson, E. (eds.) Proceedings of 13th IFAC Symposium on System Identification, Rotterdam, The Netherlands (2003)
  15. Suykens, J., van Gestel, T., DeBrabanter, J., DeMoor, B.: Least Squares Support Vector Machines. World Scientific, Singapore (2002)
  16. Pontil, M., Mukherjee, S., Girosi, F.: On the noise model of support vector machine regression. A.I. Memo 1651, AI Laboratory, MIT (1998)
  17. Gao, J., Gunn, S., Kandola, J.: Adapting kernels by variational approach in SVM. In: McKay, B., Slaney, J.K. (eds.) Canadian AI 2002. LNCS (LNAI), vol. 2557, pp. 395–406. Springer, Heidelberg (2002)
  18. Gao, J., Xu, R.: Mixture of the robust L1 distributions and its applications. In: Orgun, M.A., Thornton, J. (eds.) AI 2007. LNCS (LNAI), vol. 4830, pp. 26–35. Springer, Heidelberg (2007)
  19. Gao, J.: Robust L1 principal component analysis and its Bayesian variational inference. Neural Computation 20, 555–572 (2008)
  20. Billings, S., Chen, S., Backhouse, R.: The identification of linear and nonlinear models of a turbocharged automotive diesel engine. Mech. Syst. Signal Processing 3(2), 123–142 (1989)


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Gao, J., Antolovich, M., Kwan, P.W. (2008). L1 LASSO Modeling and Its Bayesian Inference. In: Wobcke, W., Zhang, M. (eds) AI 2008: Advances in Artificial Intelligence. AI 2008. Lecture Notes in Computer Science, vol 5360. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-89378-3_31


  • DOI: https://doi.org/10.1007/978-3-540-89378-3_31

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-89377-6

  • Online ISBN: 978-3-540-89378-3

