Abstract

Hessian locally linear embedding (HLLE) achieves a locally linear embedding by minimizing the Hessian functional on the manifold on which the data set resides. The conceptual framework of HLLE may be viewed as a modification of the Laplacian eigenmaps framework. Let H be the observed high-dimensional data, which reside on a low-dimensional manifold M, and let h be the coordinate mapping on M, so that Y = h(H) is a dimensionality reduction (DR) of H. In the Laplacian eigenmaps method, h is found in the numerical null space of the Laplace-Beltrami operator on M, while in Hessian locally linear embedding it is found in the null space of the Hessian functional. Since the HLLE embedding is only locally linear, it works well for data lying on a manifold that need not be convex. Compared with other nonlinear DR methods, such as Isomap, which requires the data to lie on a convex manifold, HLLE can therefore be applied to a wider range of data. The chapter is organized as follows. In Section 13.1, we describe the Hessian locally linear embedding method and its mathematical background. In Section 13.2, the HLLE DR algorithm is introduced. Experiments with the algorithm are presented in Section 13.3.
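
For concreteness, the functional that HLLE minimizes can be written down explicitly. In the standard HLLE formulation, with H_f(m) denoting the Hessian of a smooth function f on M in local tangent coordinates at m, and \| \cdot \|_F the Frobenius norm, the Hessian functional is

    \mathcal{H}(f) = \int_M \| H_f(m) \|_F^2 \, dm .

When M is isometric to an open, connected subset of R^d, the null space of \mathcal{H} is (d + 1)-dimensional, spanned by the constant function and the d isometric coordinate functions; the embedding Y is recovered from this null space.

The following is a minimal runnable sketch of HLLE on a synthetic data set, using scikit-learn's LocallyLinearEmbedding with method="hessian"; the Swiss-roll data, neighborhood size, and other parameter choices are illustrative assumptions, not taken from this chapter.

    # HLLE on a Swiss roll: a 2-D manifold embedded in R^3.
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import LocallyLinearEmbedding

    X, t = make_swiss_roll(n_samples=1500, random_state=0)  # X has shape (1500, 3)

    # method="hessian" selects HLLE. The local Hessian estimator requires
    # n_neighbors > n_components * (n_components + 3) / 2, so a 2-D
    # embedding needs at least 6 neighbors; 12 gives a comfortable margin.
    hlle = LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                                  method="hessian")
    Y = hlle.fit_transform(X)  # Y = h(X): the low-dimensional coordinates

    print(Y.shape)                      # (1500, 2)
    print(hlle.reconstruction_error_)   # residual of the local Hessian fits

Because the Swiss roll is a curved 2-D strip rolled up in R^3, a successful embedding flattens it into a planar region, recovering the two intrinsic coordinates.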

Copyright information

© 2012 Higher Education Press, Beijing and Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Wang, J. (2012). Hessian Locally Linear Embedding. In: Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27497-8_13

  • DOI: https://doi.org/10.1007/978-3-642-27497-8_13
  • Publisher Name: Springer, Berlin, Heidelberg
  • Print ISBN: 978-3-642-27496-1
  • Online ISBN: 978-3-642-27497-8
