Abstract
Although the extreme learning machine (ELM) is an efficient and effective neural network technique, the traditional ELM approach still has some drawbacks. When the training set contains outlier samples, the trained network usually performs poorly. Likewise, when the trained network suffers from weight noise or node faults, its performance also degrades. This paper studies the ELM technique under multiple imperfections: outlier training samples, weight noise, and node faults. It first identifies a regularization term that handles weight noise and node faults. To handle outlier training samples, the maximum correntropy criterion (MCC) is incorporated into the objective function. A learning algorithm for faulty networks, namely the robust fault-aware ELM algorithm (RFAELM), is then proposed. Simulation results show that the proposed algorithm performs much better than two state-of-the-art robust algorithms.
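The idea sketched in the abstract — a regularized ELM whose output weights are fitted under a correntropy-based loss so that outlier samples are down-weighted — can be illustrated with a minimal NumPy example. This is a hedged sketch only, not the paper's RFAELM algorithm: the kernel width `sigma`, ridge parameter `lam`, and the iteratively reweighted least-squares update are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data with a few injected outliers
x = np.linspace(-1, 1, 100).reshape(-1, 1)
y = np.sin(np.pi * x).ravel()
y[::20] += 5.0  # every 20th sample is an outlier

# Random hidden layer (the ELM part: input weights are fixed, never trained)
n_hidden = 30
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(x @ W + b)  # hidden-layer output matrix

sigma, lam = 1.0, 1e-3  # correntropy kernel width, ridge regularizer

# MCC objective: maximize sum_i exp(-e_i^2 / (2 sigma^2)) - lam ||beta||^2.
# Solved here by iteratively reweighted ridge regression:
# samples with large residuals receive near-zero weights.
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
for _ in range(20):
    e = y - H @ beta
    w = np.exp(-e**2 / (2 * sigma**2))  # correntropy weights in (0, 1]
    Hw = H * w[:, None]
    beta = np.linalg.solve(H.T @ Hw + lam * np.eye(n_hidden), Hw.T @ y)

# Outlier samples end up with negligible influence on beta,
# while clean samples keep weights close to one.
print(w[::20].max(), w[1])
```

Under a squared-error loss the same outliers would pull the fit toward themselves; the Gaussian correntropy kernel instead saturates for large errors, which is what gives MCC its robustness to impulsive noise.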
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Adegoke, M., Xiao, Y., Leung, C.S., Leung, K.W. (2022). A Robust ELM Algorithm for Compensating the Effect of Node Fault and Weight Noise. In: Ghazali, R., Mohd Nawi, N., Deris, M.M., Abawajy, J.H., Arbaiy, N. (eds.) Recent Advances in Soft Computing and Data Mining. SCDM 2022. Lecture Notes in Networks and Systems, vol. 457. Springer, Cham. https://doi.org/10.1007/978-3-031-00828-3_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-00827-6
Online ISBN: 978-3-031-00828-3
eBook Packages: Intelligent Technologies and Robotics (R0)