
Sparse reduced-rank regression with covariance estimation

Statistics and Computing

Abstract

Improving the predictive performance of multiple-response regression over separate linear regressions is a challenging problem. On the one hand, it is desirable to seek model parsimony when facing a large number of parameters. On the other hand, for certain applications it is necessary to take into account the general covariance structure of the errors of the regression model. We assume a reduced-rank regression model and work with the likelihood function with a general error covariance to achieve both objectives. In addition, we propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty, and to estimate the error covariance matrix simultaneously by using a similar penalty on the precision matrix. We develop a numerical algorithm to solve the penalized regression problem. In a simulation study and real data analysis, the new method is compared with two recent methods for multivariate regression and exhibits competitive performance in prediction and variable selection.
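To make the setup concrete, a penalized negative log-likelihood consistent with this description (a sketch only; the precise penalties, parameterization and algorithm are specified in the paper) is

\[
\min_{\mathbf{C},\,\varvec{\varOmega}}\;
\frac{1}{n}\,\mathrm{tr}\!\left[(\mathbf{Y}-\mathbf{X}\mathbf{C})\,\varvec{\varOmega}\,(\mathbf{Y}-\mathbf{X}\mathbf{C})^{\top}\right]
-\log\det\varvec{\varOmega}
+\lambda_{1}\sum_{j=1}^{p}\lVert \mathbf{c}_{j}\rVert_{2}
+\lambda_{2}\sum_{k\ne l}\lvert\omega_{kl}\rvert
\quad\text{subject to}\quad \mathrm{rank}(\mathbf{C})\le r,
\]

where \(\mathbf{Y}\in\mathbb{R}^{n\times q}\) is the response matrix, \(\mathbf{X}\in\mathbb{R}^{n\times p}\) the predictor matrix, \(\mathbf{C}\) the coefficient matrix with rows \(\mathbf{c}_{j}\), \(\varvec{\varOmega}\) the error precision matrix with entries \(\omega_{kl}\), \(r\) the assumed rank, and \(\lambda_{1},\lambda_{2}\ge 0\) tuning parameters. The row-wise group penalty on \(\mathbf{C}\) drives variable selection, while the \(\ell_{1}\) penalty on the off-diagonal entries of \(\varvec{\varOmega}\) encourages a sparse precision matrix. Objectives of this form are typically minimized by alternating between an update of \(\mathbf{C}\) with \(\varvec{\varOmega}\) fixed and a graphical-lasso-type update of \(\varvec{\varOmega}\) with \(\mathbf{C}\) fixed; the specific algorithm used here is developed in the paper.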


Notes

  1. When we applied the exact MRCE algorithm for \(p \ge n\), the R function “mrce” often produced a precision matrix with an extremely large determinant (\(>10^{15}\)), although the sparsity pattern of \(\hat{\varvec{\varOmega}}\) was estimated reasonably well; a minimal way to check for this behaviour is sketched below.
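As an aside, and not part of the MRCE procedure itself: one numerically stable way to flag such exploding estimates is to monitor the log-determinant rather than the determinant. Here is a minimal sketch in Python/NumPy, assuming the estimated precision matrix is available as a dense array (the threshold is illustrative only):

```python
import numpy as np

def precision_determinant_explodes(omega_hat, log_det_threshold=34.5):
    """Return True if det(omega_hat) exceeds roughly 10^15.

    log(10^15) is about 34.5, matching the magnitude reported in the note.
    slogdet works on the log scale, so the check itself cannot overflow.
    """
    sign, logdet = np.linalg.slogdet(omega_hat)
    if sign <= 0:
        # A valid precision matrix estimate should be positive definite.
        raise ValueError("estimate is not positive definite")
    return logdet > log_det_threshold

# Hypothetical usage: the identity is a benign estimate (determinant 1).
print(precision_determinant_explodes(np.eye(10)))  # False
```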

References

  • Breiman, L., Friedman, J.: Predicting multivariate responses in multiple linear regression. J. R. Stat. Soc. B 59, 3–54 (1997)

  • Buchen, T., Wohlrabe, K.: Forecasting with many predictors: is boosting a viable alternative? Econ. Lett. 113(1), 16–18 (2011)

  • Bunea, F., She, Y., Wegkamp, M.H.: Optimal selection of reduced rank estimators of high-dimensional matrices. Ann. Stat. 39(2), 1282–1309 (2011)

  • Bunea, F., She, Y., Wegkamp, M.H.: Joint variable and rank selection for parsimonious estimation of high-dimensional matrices. Ann. Stat. 40(5), 2359–2388 (2012)

  • Chen, K., Chan, K.S., Stenseth, N.C.: Reduced rank stochastic regression with a sparse singular value decomposition. J. R. Stat. Soc. B 74(2), 203–221 (2012)

  • Chen, K., Dong, H., Chan, K.S.: Reduced rank regression via adaptive nuclear norm penalization. Biometrika 100(4), 901–920 (2013)

  • Chen, L., Huang, J.Z.: Sparse reduced-rank regression with covariance estimation. J. Am. Stat. Assoc. 107(500), 1533–1545 (2012)

  • Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9, 432–441 (2007)

  • Izenman, A.J.: Reduced-rank regression for the multivariate linear model. J. Multivar. Anal. 5, 248–264 (1975)

  • Izenman, A.J.: Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. Springer, New York (2008)

  • Mazumder, R., Hastie, T.: The graphical lasso: new insights and alternatives. Electron. J. Stat. 6, 2125–2149 (2012)

  • Negahban, S., Wainwright, M.J.: Estimation of (near) low-rank matrices with noise and high-dimensional scaling. Ann. Stat. 39(2), 1069–1097 (2011)

  • Obozinski, G., Wainwright, M.J., Jordan, M.I.: Support union recovery in high-dimensional multivariate regression. Ann. Stat. 39(1), 1–47 (2011)

  • Peng, J., Zhu, J., Bergamaschi, A., Han, W., Noh, D., Pollack, J.R., Wang, P.: Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer. Ann. Appl. Stat. 4, 53–77 (2010)

  • Reinsel, G.C., Velu, R.P.: Multivariate Reduced-Rank Regression, Theory and Applications. Springer, New York (1998)

  • Rothman, A., Levina, E., Zhu, J.: Sparse multivariate regression with covariance estimation. J. Comput. Gr. Stat. 19(4), 947–962 (2010)

  • Simila, T., Tikka, J.: Input selection and shrinkage in multiresponse linear regression. Comput. Stat. Data Anal. 52, 406–422 (2007)

  • Stock, J.H., Watson, M.W.: An empirical comparison of methods for forecasting using many predictors. Manuscript, Princeton University (2005)

  • Stock, J.H., Watson, M.W.: Forecasting with many predictors. Handbook of Economic Forecasting 1, 515–554 (2006)

  • Turlach, B., Venables, W., Wright, S.: Simultaneous variable selection. Technometrics 47, 350–363 (2005)

  • Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. B 68, 49–67 (2006)

  • Yuan, M., Lin, Y.: Model selection and estimation in the Gaussian graphical model. Biometrika 94(1), 19–35 (2007)

  • Yuan, M., Ekici, A., Lu, Z., Monteiro, R.: Dimension reduction and coefficient estimation in multivariate linear regression. J. R. Stat. Soc. B 69, 329–346 (2007)

Acknowledgments

Huang’s work was partially supported by NSF grant DMS-1208952 and by Award Numbers KUS-CI-016-04 and GRP-CF-2011-19-P-Gao-Huang, made by King Abdullah University of Science and Technology (KAUST).

Author information

Corresponding author

Correspondence to Lisha Chen.

Cite this article

Chen, L., Huang, J.Z. Sparse reduced-rank regression with covariance estimation. Stat Comput 26, 461–470 (2016). https://doi.org/10.1007/s11222-014-9517-6
