
Noise Resistant Training for Extreme Learning Machine

  • Conference paper
Advances in Neural Networks - ISNN 2017 (ISNN 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10262)

Included in the following conference series: International Symposium on Neural Networks (ISNN)

Abstract

The extreme learning machine (ELM) concept provides effective training algorithms for constructing single hidden layer feedforward networks (SHLFNs). However, the conventional ELM algorithms were designed for the noiseless situation only, in which the outputs of the hidden nodes are not contaminated by noise. This paper presents two noise-resistant training algorithms, namely noise-resistant incremental ELM (NRI-ELM) and noise-resistant convex incremental ELM (NRCI-ELM). The noise-resistant ability of NRI-ELM is better than that of the conventional incremental ELM algorithms. To further enhance the noise-resistant ability, the NRCI-ELM algorithm is proposed. The convergence properties of the two proposed noise-resistant algorithms are also presented.
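For context, the conventional incremental ELM (I-ELM) that NRI-ELM improves on grows an SHLFN one hidden node at a time: each new node's input weights and bias are drawn at random, and only its output weight is computed, in closed form, against the current residual error. Below is a minimal sketch of that noiseless baseline in Python/NumPy; the function names, the sigmoid activation, and the uniform random initialization are illustrative assumptions, and the modified output-weight update that NRI-ELM and NRCI-ELM introduce to handle noisy hidden-node outputs is not reproduced here.

import numpy as np

def ielm_train(X, y, max_nodes=50, seed=None):
    # Incremental ELM (I-ELM) sketch: add random sigmoid hidden nodes one
    # at a time, fitting each node's output weight to the current residual.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = y.astype(float).copy()                  # residual error, starts at the target
    nodes = []                                  # (input weights a, bias b, output weight beta)
    for _ in range(max_nodes):
        a = rng.uniform(-1.0, 1.0, size=d)      # random input weights (never trained)
        b = rng.uniform(-1.0, 1.0)              # random bias (never trained)
        h = 1.0 / (1.0 + np.exp(-(X @ a + b)))  # hidden-node output on the training data
        beta = (e @ h) / (h @ h)                # closed-form least-squares output weight
        e -= beta * h                           # shrink the residual
        nodes.append((a, b, beta))
    return nodes

def ielm_predict(nodes, X):
    # Network output: sum of the weighted hidden-node outputs.
    out = np.zeros(X.shape[0])
    for a, b, beta in nodes:
        out += beta / (1.0 + np.exp(-(X @ a + b)))
    return out

# Toy regression run (illustrative only):
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1]
nodes = ielm_train(X, y, max_nodes=100, seed=0)
print(np.mean((ielm_predict(nodes, X) - y) ** 2))  # training MSE falls as nodes are added

In this baseline, the output weight beta is optimal only when the hidden-node output h is observed exactly; the paper's contribution is to replace that step with one that remains well-behaved when h is contaminated by noise.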



Acknowledgment

The work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516).

Author information

Corresponding author

Correspondence to Chi-Sing Leung.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Lui, Y.L., Wong, H.T., Leung, C.-S., Kwong, S. (2017). Noise Resistant Training for Extreme Learning Machine. In: Cong, F., Leung, A., Wei, Q. (eds) Advances in Neural Networks - ISNN 2017. ISNN 2017. Lecture Notes in Computer Science (LNTCS), vol 10262. Springer, Cham. https://doi.org/10.1007/978-3-319-59081-3_31

  • DOI: https://doi.org/10.1007/978-3-319-59081-3_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59080-6

  • Online ISBN: 978-3-319-59081-3

  • eBook Packages: Computer Science, Computer Science (R0)
