
Sparse Point Estimation for Bayesian Regression via Simulated Annealing

Conference paper
Pattern Recognition (DAGM/OAGM 2012)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7476)

Abstract

In the context of variable selection in a regression model, the classical Lasso-based optimization approach yields an estimate that is sparse in the regression coefficients but provides no further information about their distribution. A Bayesian approach is more informative, since it gives direct access to the posterior distribution, which is usually summarized by its expectation (not sparse) and variance. In practice, heuristics such as thresholding are then applied to turn these summaries into sparse estimates for variable selection. In this paper, we provide a more principled approach for generating a sparse point estimate in a Bayesian framework. We extend an existing Bayesian framework for sparse regression to generate a MAP estimate using simulated annealing, and we justify this extension by showing that the resulting MAP estimate is itself sparse in the regression coefficients. Experiments on real-world applications, namely splice site detection and diabetes progression, demonstrate the usefulness of the extension.
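
To make the idea concrete, the following is a minimal sketch of the general technique, not the authors' exact algorithm: simulated annealing run on the negative log-posterior of a linear model with a Laplace (Lasso-type) prior, whose minimizer is the sparse MAP estimate. The synthetic data, the penalty weight `lam`, the cooling schedule, the proposal scale, and the zero-snapping proposal step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def neg_log_posterior(beta, X, y, lam):
    """Negative log-posterior (up to a constant) of a linear model with
    unit-variance Gaussian noise and a Laplace (Lasso-type) prior."""
    resid = y - X @ beta
    return 0.5 * resid @ resid + lam * np.abs(beta).sum()


def anneal_map(X, y, lam=5.0, n_iter=20000, step=0.05, T0=1.0):
    """Approximate the sparse MAP estimate by simulated annealing.

    Cooling schedule, proposal scale and zero-snapping rule are
    illustrative choices, not the paper's algorithm."""
    beta = np.zeros(X.shape[1])
    energy = neg_log_posterior(beta, X, y, lam)
    for t in range(1, n_iter + 1):
        T = T0 / np.log(1.0 + t)             # slow logarithmic cooling
        prop = beta + step * rng.standard_normal(beta.shape)
        prop[np.abs(prop) < step / 2] = 0.0  # let the chain propose exact zeros
        e_prop = neg_log_posterior(prop, X, y, lam)
        # Metropolis acceptance at temperature T
        if e_prop <= energy or rng.random() < np.exp((energy - e_prop) / T):
            beta, energy = prop, e_prop
    return beta


# Synthetic check: only the first two of ten coefficients are active.
X = rng.standard_normal((100, 10))
beta_true = np.concatenate([[3.0, -2.0], np.zeros(8)])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(np.round(anneal_map(X, y), 2))
```

On this synthetic problem the annealed chain typically drives the eight inactive coefficients to exactly zero while approximately recovering the two active ones. The zero-snapping in the proposal is only a device that gives the chain positive probability of landing on exact zeros; the paper's contribution, by contrast, is to show that the MAP estimate of its Bayesian framework is itself sparse.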

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Raman, S., Roth, V. (2012). Sparse Point Estimation for Bayesian Regression via Simulated Annealing. In: Pinz, A., Pock, T., Bischof, H., Leberl, F. (eds) Pattern Recognition. DAGM/OAGM 2012. Lecture Notes in Computer Science, vol 7476. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32717-9_32

  • DOI: https://doi.org/10.1007/978-3-642-32717-9_32

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32716-2

  • Online ISBN: 978-3-642-32717-9

  • eBook Packages: Computer Science (R0)
