
Continuous-State Hopfield Dynamics Based on Implicit Numerical Methods

  • Conference paper
In: Artificial Neural Networks — ICANN 2002 (ICANN 2002)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2415)


Abstract

A novel technique is presented that implements continuous-state Hopfield neural networks on a digital computer. Instead of the usual forward Euler rule, the backward (implicit) Euler method is used. The stability and the Lyapunov function of the proposed discrete model are guaranteed indirectly, even for reasonably large step sizes, because discretization by implicit numerical methods inherits the stability of the continuous-time model. In contrast, the forward Euler method requires a very small step size to guarantee convergence to solutions. The presented technique takes advantage of the extensive research on continuous-time stability, as well as recent results on the dynamical analysis of numerical methods. Moreover, standard numerical methods allow for synchronous activation of the neurons, leading to improved performance. Numerical results are presented that illustrate the validity of this approach when applied to optimization problems.

This work has been partially supported by the Spanish Ministerio de Ciencia y Tecnología (MCYT), Project No. TIC2001-1758.
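
To make the discretization concrete, here is a minimal sketch in Python, assuming the standard continuous Hopfield model du/dt = -u + W tanh(u) + b with symmetric weights. The activation function, the step size h, and the use of SciPy's fsolve to solve the implicit step are illustrative assumptions, not details taken from the paper. The forward Euler update evaluates the right-hand side at the current state, whereas the backward Euler update solves a nonlinear equation for the next state at every iteration, which is what permits the larger step sizes mentioned in the abstract.

import numpy as np
from scipy.optimize import fsolve

# Assumed continuous Hopfield dynamics:  du/dt = -u + W @ tanh(u) + b
def f(u, W, b):
    return -u + W @ np.tanh(u) + b

def forward_euler_step(u, W, b, h):
    # Explicit update: cheap, but needs a very small h to remain stable.
    return u + h * f(u, W, b)

def backward_euler_step(u, W, b, h):
    # Implicit update: solve u_next = u + h * f(u_next) for u_next.
    # Any nonlinear solver works; fsolve with the previous state as the
    # initial guess is just one convenient choice.
    residual = lambda u_next: u_next - u - h * f(u_next, W, b)
    return fsolve(residual, u)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    A = rng.standard_normal((n, n))
    W = (A + A.T) / 2                 # symmetric weights -> Lyapunov function exists
    np.fill_diagonal(W, 0.0)
    b = rng.standard_normal(n)
    u = rng.standard_normal(n)

    h = 0.5                           # a "reasonably large" step, for illustration
    for _ in range(200):
        u = backward_euler_step(u, W, b, h)
    print("final neuron outputs:", np.tanh(u))

With the same step size, the explicit forward_euler_step iteration can oscillate or diverge, which is the behaviour the implicit scheme avoids by inheriting the stability of the continuous-time model.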




Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Atencia, M.A., Joya, G., Sandoval, F. (2002). Continuous-State Hopfield Dynamics Based on Implicit Numerical Methods. In: Dorronsoro, J.R. (eds) Artificial Neural Networks — ICANN 2002. ICANN 2002. Lecture Notes in Computer Science, vol 2415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46084-5_220


  • DOI: https://doi.org/10.1007/3-540-46084-5_220

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44074-1

  • Online ISBN: 978-3-540-46084-8

