On robustness of radial basis function network with input perturbation

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this article, we propose a methodology for making a radial basis function network (RBFN) robust to additive and multiplicative input noise. Robustness is achieved by properly selecting the centers and widths of the radial basis function (RBF) units in the hidden layer. First, a set of self-organizing map (SOM) networks is trained for center selection; during training, random Gaussian noise is injected into the samples of each class of the data set. The number of SOM networks equals the number of classes in the data set, and each SOM network is trained separately on the samples of one class. The weight vector associated with a unit in the output layer of the SOM network for a given class is used as the center of an RBF unit for that class. The widths of the RBF units are determined class-wise using the p-nearest-neighbor algorithm. Proper selection of centers and widths makes the RBFN robust to input perturbation and to outliers present in the data set. The weights between the hidden and output layers of the RBFN are obtained by the pseudo-inverse method. To test the robustness of the proposed method under additive and multiplicative noise, ten standard data sets have been used for classification. The proposed method has been compared with three existing methods in which the centers are generated randomly, by the k-means algorithm, and by a SOM network, respectively. Simulation results show the superiority of the proposed method over these methods, and the Wilcoxon signed-rank test confirms that the improvement is statistically significant.
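
For concreteness, the following is a minimal sketch of the pipeline described above, assuming a one-dimensional SOM grid per class, a fixed Gaussian-noise level, and Gaussian RBF units. All function names and parameters (train_som, widths_p_nn, units_per_class, noise_std, etc.) are illustrative assumptions, not the authors' implementation or experimental settings.

```python
import numpy as np

def train_som(X, n_units, n_iter=2000, lr0=0.5, sigma0=None, noise_std=0.05, seed=0):
    """Train a 1-D SOM on noise-injected samples of a single class;
    the learned weight vectors later serve as RBF centers for that class."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    sigma0 = sigma0 if sigma0 is not None else n_units / 2.0
    W = X[rng.choice(len(X), n_units)].astype(float)      # initialize prototypes from data
    for t in range(n_iter):
        x = X[rng.integers(len(X))] + noise_std * rng.standard_normal(d)  # noise injection
        lr = lr0 * np.exp(-t / n_iter)                    # decaying learning rate
        sig = sigma0 * np.exp(-t / n_iter)                # shrinking neighbourhood radius
        bmu = np.argmin(np.linalg.norm(W - x, axis=1))    # best-matching unit
        h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sig ** 2))
        W += lr * h[:, None] * (x - W)                    # neighbourhood update
    return W

def widths_p_nn(centers, p=2):
    """Width of each unit = RMS distance to its p nearest centers of the same class."""
    D = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    D.sort(axis=1)                                        # column 0 is the zero self-distance
    return np.sqrt(np.mean(D[:, 1:p + 1] ** 2, axis=1))

def design_matrix(X, centers, widths):
    """Gaussian RBF activations of all hidden units for all samples."""
    D2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-D2 / (2 * widths[None, :] ** 2))

def fit_rbfn(X, y, units_per_class=5, p=2):
    classes = np.unique(y)
    centers, widths = [], []
    for i, c in enumerate(classes):                       # one SOM per class
        Wc = train_som(X[y == c], units_per_class, seed=i)
        centers.append(Wc)
        widths.append(widths_p_nn(Wc, p))                 # class-wise p-nearest-neighbor widths
    centers, widths = np.vstack(centers), np.concatenate(widths)
    H = design_matrix(X, centers, widths)
    T = (y[:, None] == classes[None, :]).astype(float)    # one-hot targets
    W_out = np.linalg.pinv(H) @ T                         # pseudo-inverse solution
    return centers, widths, W_out, classes

def predict(X, centers, widths, W_out, classes):
    H = design_matrix(X, centers, widths)
    return classes[np.argmax(H @ W_out, axis=1)]
```

In this sketch the per-class SOM prototypes serve directly as RBF centers, the p-nearest-neighbor widths are computed only among centers of the same class, and the output weights are obtained in closed form from the Moore-Penrose pseudo-inverse of the hidden-layer design matrix.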

Author information

Corresponding author

Correspondence to Prasenjit Dey.

Ethics declarations

Conflict of interest

We have made every effort to minimize overlap between this manuscript and previously published articles, including fragments of sentences and technical terms. The authors declare that they have no conflict of interest.

About this article

Cite this article

Dey, P., Gopal, M., Pradhan, P. et al. On robustness of radial basis function network with input perturbation. Neural Comput & Applic 31, 523–537 (2019). https://doi.org/10.1007/s00521-017-3086-5
