On Extended Guttman Condition in High Dimensional Factor Analysis

  • Conference paper
  • In: Quantitative Psychology (IMPS 2017, IMPS 2018)

Abstract

It is well known that factor analysis and principal component analysis often yield similar estimated loading matrices. Guttman (Psychometrika 21:273–285, 1956) identified a condition under which the two matrices are close to each other at the population level. We discuss the matrix version of the Guttman condition for closeness between the two methods. It can be considered an extension of the original Guttman condition in the sense that the matrix version involves not only the diagonal elements but also the off-diagonal elements of the inverses of the covariance matrix and the matrix of unique variances. We also discuss some implications of the extended Guttman condition, including how to obtain approximate estimates of the inverse of the covariance matrix in high dimensions.
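
To make this closeness concrete, the following is a minimal numerical sketch (Python with NumPy; it is not taken from the chapter, and the loading matrix, unique variances, and dimensions are illustrative assumptions). Under the common factor model Sigma = Lambda Lambda' + Psi, it compares Sigma^{-1} with the inverse of the diagonal unique-variance matrix, Psi^{-1}, element by element as the number of variables p grows while the number of factors m stays fixed:

    import numpy as np

    rng = np.random.default_rng(0)

    def max_abs_gap(p, m=3):
        """Largest element-wise gap between Sigma^{-1} and Psi^{-1} under a random factor model."""
        Lambda = rng.uniform(0.4, 0.8, size=(p, m))   # p x m loading matrix (illustrative values)
        psi = rng.uniform(0.3, 0.7, size=p)           # unique variances (illustrative values)
        Sigma = Lambda @ Lambda.T + np.diag(psi)      # model-implied covariance matrix
        Sigma_inv = np.linalg.inv(Sigma)
        Psi_inv = np.diag(1.0 / psi)                  # inverse of the diagonal unique-variance matrix
        return np.abs(Sigma_inv - Psi_inv).max()      # includes diagonal and off-diagonal elements

    for p in (10, 50, 200, 1000):
        print(p, max_abs_gap(p))

In this sketch the gap shrinks as p increases with m fixed, which illustrates, under these assumed settings, the sense in which the inverse of the covariance matrix can be approximated from the unique variances alone in high dimensions.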

References

  • Anderson, T. W. (2003). An introduction to multivariate statistical analysis (3rd ed.). New York: Wiley.

  • Bentler, P. M. (1976). Multistructure statistical model applied to factor analysis. Multivariate Behavioral Research, 11, 3–15.

  • Guttman, L. (1956). Best possible systematic estimates of communalities. Psychometrika, 21, 273–285.

  • Harville, D. A. (1997). Matrix algebra from a statistician’s perspective. New York: Springer.

  • Hayashi, K., & Bentler, P. M. (2000). On the relations among regular, equal unique variances and image factor analysis. Psychometrika, 65, 59–72.

  • Hayashi, K., & Bentler, P. M. (2001). The asymptotic covariance matrix of maximum-likelihood estimates in factor analysis: The case of nearly singular matrix of estimates of unique variances. Linear Algebra and its Applications, 321, 153–173.

  • Krijnen, W. P. (2006). Convergence of estimates of unique variances in factor analysis, based on the inverse sample covariance matrix. Psychometrika, 71, 193–199.

  • Lawley, D. N., & Maxwell, A. E. (1971). Factor analysis as a statistical method (2nd ed.). New York: American Elsevier.

  • Pourahmadi, M. (2013). High-dimensional covariance estimation. New York: Wiley.

  • Schneeweiss, H. (1997). Factors and principal components in the near spherical case. Multivariate Behavioral Research, 32, 375–401.

  • Schneeweiss, H., & Mathes, H. (1995). Factor analysis and principal components. Journal of Multivariate Analysis, 55, 105–124.

  • Tipping, M. E., & Bishop, C. M. (1999). Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B, 61, 611–622.

  • Velicer, W. F., & Jackson, D. N. (1990). Component analysis versus common factor analysis: Some issues in selecting an appropriate procedure. Multivariate Behavioral Research, 25, 1–28.

  • Warton, D. I. (2008). Penalized normal likelihood and ridge regularization of correlation and covariance matrices. Journal of the American Statistical Association, 103, 340–349.

  • Yuan, K.-H., & Chan, W. (2008). Structural equation modeling with near singular covariance matrices. Computational Statistics & Data Analysis, 52, 4842–4858.

  • Yuan, K.-H., & Chan, W. (2016). Structural equation modeling with unknown population distributions: Ridge generalized least squares. Structural Equation Modeling, 23, 163–179.

Acknowledgements

The authors thank Dr. Dylan Molenaar for his comments. Ke-Hai Yuan's work was supported by the National Science Foundation under Grant No. SES-1461355.

Author information

Corresponding author

Correspondence to Kentaro Hayashi.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Hayashi, K., Yuan, K.-H., Jiang, G. (2019). On Extended Guttman Condition in High Dimensional Factor Analysis. In: Wiberg, M., Culpepper, S., Janssen, R., González, J., Molenaar, D. (eds) Quantitative Psychology. IMPS 2017, IMPS 2018. Springer Proceedings in Mathematics & Statistics, vol 265. Springer, Cham. https://doi.org/10.1007/978-3-030-01310-3_20
