
Neighborhood Preserving Projections (NPP): A Novel Linear Dimension Reduction Method

  • Conference paper
Advances in Intelligent Computing (ICIC 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3644)

Abstract

Dimension reduction is a crucial step in pattern recognition and information retrieval tasks for overcoming the curse of dimensionality. In this paper, a novel unsupervised linear dimension reduction method, Neighborhood Preserving Projections (NPP), is proposed. In contrast to traditional linear dimension reduction methods, such as principal component analysis (PCA), the proposed method has a good neighborhood-preserving property. The main idea of NPP is to approximate the classical locally linear embedding (LLE) by introducing a linear transform matrix. The transform matrix is obtained by optimizing a certain objective function. Preliminary experimental results on known manifold data demonstrate the effectiveness of the proposed method.
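The abstract describes a two-step scheme: compute LLE-style local reconstruction weights, then find a linear transform that preserves those weights in the low-dimensional space. The following is a minimal NumPy/SciPy sketch of that idea; the function name, the regularization terms, and the choice of a generalized eigenproblem solver are illustrative assumptions drawn from the standard linearization of LLE, not necessarily the authors' exact objective function.

```python
import numpy as np
from scipy.linalg import eigh


def npp_sketch(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Sketch of a neighborhood-preserving linear projection.

    X : (n_samples, n_features) data matrix.
    Returns A : (n_features, n_components) linear transform matrix,
    so that low-dimensional coordinates are X @ A.
    """
    n = X.shape[0]

    # Step 1: LLE-style reconstruction weights. Each point is written as
    # an affine combination of its k nearest neighbors.
    W = np.zeros((n, n))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(d)[1:n_neighbors + 1]      # skip the point itself
        Z = X[idx] - X[i]                           # neighbors centered at x_i
        C = Z @ Z.T                                 # local Gram matrix
        C += reg * np.trace(C) * np.eye(len(idx))   # regularize (assumption)
        w = np.linalg.solve(C, np.ones(len(idx)))
        W[i, idx] = w / w.sum()                     # weights sum to one

    # Step 2: instead of free low-dimensional coordinates (as in LLE),
    # constrain them to be a linear map y_i = A^T x_i and minimize the
    # same reconstruction cost, giving the generalized eigenproblem
    #   X^T (I - W)^T (I - W) X a = lambda X^T X a.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    lhs = X.T @ M @ X
    rhs = X.T @ X + reg * np.eye(X.shape[1])        # keep rhs positive definite
    _, vecs = eigh(lhs, rhs)
    # Eigenvectors with the smallest eigenvalues best preserve neighborhoods.
    return vecs[:, :n_components]
```

Because the transform is an explicit matrix, new (out-of-sample) points can be mapped by a single matrix product, which is the practical advantage over LLE that the abstract alludes to.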




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pang, Y., Zhang, L., Liu, Z., Yu, N., Li, H. (2005). Neighborhood Preserving Projections (NPP): A Novel Linear Dimension Reduction Method. In: Huang, DS., Zhang, XP., Huang, GB. (eds) Advances in Intelligent Computing. ICIC 2005. Lecture Notes in Computer Science, vol 3644. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11538059_13


  • DOI: https://doi.org/10.1007/11538059_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28226-6

  • Online ISBN: 978-3-540-31902-3
