
Nonlinear Dimensionality Reduction for Data with Disconnected Neighborhood Graph

Neural Processing Letters

Abstract

Neighborhood-graph-based nonlinear dimensionality reduction algorithms, such as Isomap and LLE, perform well under the assumption that the neighborhood graph is connected. However, for datasets consisting of multiple clusters or lying on multiple manifolds, the neighborhood graph is often disconnected, i.e., it has multiple connected components, and neighborhood-graph-based techniques then fail to capture both the local and global structure of such datasets. In this paper, a new method, called the enhanced neighborhood graph, is proposed to solve this problem. The idea is to add edges to the neighborhood graph adaptively and iteratively until it becomes connected; nonlinear dimensionality reduction can then be performed on the enhanced graph, so that both local and global properties of the data are preserved. Thorough experiments on synthetic and natural datasets are conducted, and the results corroborate that the proposed method significantly improves dimensionality reduction for data with a disconnected neighborhood graph.
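To make the enhancement step concrete, the following Python sketch illustrates the general idea, not the paper's exact algorithm: it builds a symmetric k-nearest-neighbor graph and, while the graph has more than one connected component, adds the single shortest edge joining two different components; the resulting connected graph is then embedded with standard Isomap (graph geodesics followed by classical MDS). The function names enhance_neighborhood_graph and isomap_from_graph, the Euclidean metric, and the closest-pair edge-insertion rule are all illustrative assumptions; the paper's adaptive, iterative rule for choosing edges is more refined.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components, shortest_path


def enhance_neighborhood_graph(X, k=5):
    """Build a symmetric k-NN graph; while it is disconnected, add the
    shortest edge joining two different components (illustrative rule,
    not the paper's exact adaptive scheme)."""
    n = len(X)
    D = cdist(X, X)                            # pairwise Euclidean distances
    W = np.zeros((n, n))                       # 0 means "no edge"
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]   # k nearest neighbors, skipping self
    rows = np.repeat(np.arange(n), k)
    W[rows, nbrs.ravel()] = D[rows, nbrs.ravel()]
    W = np.maximum(W, W.T)                     # symmetrize

    while True:
        n_comp, labels = connected_components(csr_matrix(W), directed=False)
        if n_comp == 1:
            return W
        # globally closest pair of points lying in different components
        inter = labels[:, None] != labels[None, :]
        i, j = np.unravel_index(np.argmin(np.where(inter, D, np.inf)), D.shape)
        W[i, j] = W[j, i] = D[i, j]            # add one bridging edge per pass


def isomap_from_graph(W, d=2):
    """Standard Isomap on a connected graph: geodesic distances
    followed by classical MDS."""
    G = shortest_path(csr_matrix(W), directed=False)   # graph geodesics
    n = len(G)
    H = np.eye(n) - np.ones((n, n)) / n                # centering matrix
    B = -0.5 * H @ (G ** 2) @ H                        # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)                     # ascending eigenvalues
    return vecs[:, -d:] * np.sqrt(np.maximum(vals[-d:], 0))
```

On a dataset made of several well-separated clusters, a plain k-NN graph leaves one component per cluster and the geodesic distances between clusters are infinite; the enhancement loop guarantees a single component, so the geodesic matrix is finite and the embedding can reflect both within-cluster and between-cluster structure.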




Acknowledgements

This work is partially supported by the National Natural Science Foundation of China under Grant No. 61601112, the Fundamental Research Funds for the Central Universities, and the DHU Distinguished Young Professor Program. This work is also partially supported by the National Natural Science Foundation of China under Grant No. 61572156.

Author information


Corresponding author

Correspondence to Tommy W. S. Chow.


About this article


Cite this article

Fan, J., Chow, T.W.S., Zhao, M. et al. Nonlinear Dimensionality Reduction for Data with Disconnected Neighborhood Graph. Neural Process Lett 47, 697–716 (2018). https://doi.org/10.1007/s11063-017-9676-5
