Abstract
Generating useful variables in feature spaces is an important issue throughout neural networks, machine learning, and artificial intelligence, since such variables enable efficient and discriminative computation. In this paper, nearest neighbor relations are proposed for the minimal generation of reduced variables for feature spaces. First, the nearest neighbor relations are shown to be minimal, independent, and inherited in the construction of the feature space. For the analysis, convex cones are built from the nearest neighbor relations, which serve as independent vectors for generating the reduced variables. Then, the edges of the convex cones are compared to discriminate among variables. Finally, feature spaces with reduced variables based on nearest neighbor relations are shown to be useful for real-world document classification.
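The abstract's core construction can be illustrated with a small sketch. Note that this is an assumption-laden toy, not the paper's algorithm: here a "nearest neighbor relation" is taken to be the difference vector between a sample and its nearest neighbor from a *different* class, and the resulting vectors are treated as generators of a convex cone whose edges could then be compared per variable. The function and variable names (`nearest_neighbor_relations`, `sq_dist`) are hypothetical.

```python
# Illustrative sketch, assuming a nearest-neighbor relation is the
# difference vector between a sample and its nearest other-class sample.
# The collected vectors act as generators of a convex cone.

def sq_dist(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_neighbor_relations(samples, labels):
    """For each sample, return (sample - nearest other-class sample)."""
    relations = []
    for x, yx in zip(samples, labels):
        # Candidates are all samples belonging to a different class.
        rivals = [z for z, yz in zip(samples, labels) if yz != yx]
        nn = min(rivals, key=lambda z: sq_dist(x, z))
        relations.append(tuple(xi - ni for xi, ni in zip(x, nn)))
    return relations

# Toy 2-D data: two classes separated along the first coordinate.
X = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
y = [0, 0, 1, 1]
print(nearest_neighbor_relations(X, y))
# → [(-3.0, 0.0), (-2.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
```

In this toy case all difference vectors lie on the first coordinate axis, which hints at how comparing the cone's edge directions could single out the discriminative variable.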
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Ishii, N., Iwata, K., Mukai, N., Odagiri, K., Matsuo, T. (2021). Features Spaces with Reduced Variables Based on Nearest Neighbor Relations and Their Inheritances. In: Rojas, I., Joya, G., Català, A. (eds) Advances in Computational Intelligence. IWANN 2021. Lecture Notes in Computer Science(), vol 12861. Springer, Cham. https://doi.org/10.1007/978-3-030-85030-2_7
DOI: https://doi.org/10.1007/978-3-030-85030-2_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-85029-6
Online ISBN: 978-3-030-85030-2
eBook Packages: Computer Science, Computer Science (R0)