A Unifying View of Gradient Calculations and Learning for Locally Recurrent Neural Networks

  • Conference paper
Neural Nets WIRN VIETRI-97

Part of the book series: Perspectives in Neural Computing

Abstract

This paper presents a critical review of gradient-based training methods for recurrent neural networks, including Back Propagation Through Time (BPTT), Real Time Recurrent Learning (RTRL) and several learning algorithms specific to particular locally recurrent architectures. The survey reveals the need for a unifying view of the specific procedures proposed for networks with local feedback, one that takes into account the general framework of recurrent network learning defined by BPTT and RTRL. A learning method for local feedback networks is therefore proposed that combines the best feature of BPTT, namely its low complexity, with the best feature of RTRL, namely its on-line operation, and that includes as special cases several previously proposed algorithms, such as Temporal Back Propagation, Back Propagation for Sequences, the Back-Tsoi algorithm and others. In its general version, the new training method allows efficient and accurate on-line gradient calculation, and it compares favourably with the previous algorithms in stability, speed/complexity trade-off and accuracy.

This research was supported by the Italian MURST.
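The paper itself gives no code, but the on-line recursive-gradient idea behind algorithms such as Back and Tsoi's for IIR synapses can be illustrated with a minimal sketch. This is not the authors' unified algorithm; it shows only the RTRL-style sensitivity recursion for a single IIR synapse y[t] = Σ_k b[k]·x[t−k] + Σ_m a[m]·y[t−m], where each weight's derivative is updated on-line at every time step. The class and method names are hypothetical.

```python
class IIRSynapse:
    """One IIR synapse with on-line (RTRL-style) gradient recursion."""

    def __init__(self, b, a):
        self.b = list(b)               # feedforward (FIR) taps b[0..K-1]
        self.a = list(a)               # feedback (IIR) taps a[1..M], stored 0-based
        self.x_hist = [0.0] * len(b)   # [x[t], x[t-1], ...]
        self.y_hist = [0.0] * len(a)   # [y[t-1], y[t-2], ...]
        # past sensitivities dy[t-m]/db[k] and dy[t-m]/da[j]
        self.sb = [[0.0] * len(a) for _ in b]
        self.sa = [[0.0] * len(a) for _ in a]

    def step(self, x):
        self.x_hist = [x] + self.x_hist[:-1]
        # filter output: y[t] = sum_k b[k] x[t-k] + sum_m a[m] y[t-m]
        y = sum(bk * xk for bk, xk in zip(self.b, self.x_hist))
        y += sum(am * ym for am, ym in zip(self.a, self.y_hist))
        # dy[t]/db[k] = x[t-k] + sum_m a[m] * dy[t-m]/db[k]
        gb = [xk + sum(am * s for am, s in zip(self.a, sk))
              for xk, sk in zip(self.x_hist, self.sb)]
        # dy[t]/da[m] = y[t-m] + sum_j a[j] * dy[t-j]/da[m]
        ga = [ym + sum(aj * s for aj, s in zip(self.a, sm))
              for ym, sm in zip(self.y_hist, self.sa)]
        # shift sensitivity and output histories forward one step
        self.sb = [[g] + sk[:-1] for g, sk in zip(gb, self.sb)]
        self.sa = [[g] + sm[:-1] for g, sm in zip(ga, self.sa)]
        self.y_hist = [y] + self.y_hist[:-1]
        return y, gb, ga
```

The recursion is exact for fixed weights, so the returned gradients can be checked against finite differences; its per-step cost is constant in sequence length, which is the property that makes the gradient available on-line, in contrast to BPTT's backward pass over the stored trajectory.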


References

  1. R.J. Williams, J. Peng. An efficient gradient-based algorithm for on-line training of recurrent network trajectories. Neural Computation 2: 490–501, 1990.

  2. R.J. Williams, D. Zipser. A learning algorithm for continually running fully recurrent neural networks. Neural Computation 1: 270–280, 1989.

  3. A.D. Back, A.C. Tsoi. FIR and IIR synapses, a new neural network architecture for time series modelling. Neural Computation 3: 375–385, 1991.

  4. A.C. Tsoi, A.D. Back. Locally recurrent globally feedforward networks: a critical review of architectures. IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 229–239, March 1994.

  5. P.J. Werbos. Beyond regression: New tools for prediction and analysis in the behavioural sciences. Ph.D. dissertation, Committee on Appl. Math., Harvard Univ., Cambridge, MA, Nov. 1974.

  6. P.J. Werbos. Backpropagation through time: what it does and how to do it. Proceedings of the IEEE, special issue on neural networks, vol. 78, no. 10, pp. 1550–1560, October 1990.

  7. E.A. Wan. Temporal backpropagation for FIR neural networks. Proceedings of the International Joint Conference on Neural Networks, 1: 575–580, 1990.

  8. A. Waibel, T. Hanazawa, G. Hinton, K. Shikano, K.J. Lang. Phoneme recognition using time-delay neural networks. IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 37, no. 3, March 1989.

  9. N. Benvenuto, F. Piazza, A. Uncini. Comparison of four learning algorithms for multilayer perceptron with FIR synapses. Proceedings of the IEEE International Conference on Neural Networks, 1994.

  10. J.J. Shynk. Adaptive IIR filtering. IEEE ASSP Magazine, April 1989.

  11. P. Campolucci, F. Piazza, A. Uncini. On-line learning algorithms for neural networks with IIR synapses. Proc. of the IEEE International Conference on Neural Networks, Perth, Nov. 1995.

  12. P. Campolucci, A. Uncini, F. Piazza. Causal Back Propagation Through Time for locally recurrent neural networks. Proc. of the IEEE International Symposium on Circuits and Systems, Atlanta, May 1996.

  13. Y. Bengio, R. De Mori, M. Gori. Learning the dynamic nature of speech with back-propagation for sequences. Pattern Recognition Letters, 13: 375–385, 1992.

  14. P. Frasconi, M. Gori, G. Soda. Local feedback multilayered networks. Neural Computation 4: 120–130, 1992.

  15. B.A. Pearlmutter. Gradient calculations for dynamic recurrent neural networks: a survey. IEEE Transactions on Neural Networks, vol. 6, no. 5, September 1995.

  16. B. Srinivasan, U.R. Prasad, N.J. Rao. Backpropagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models. IEEE Transactions on Neural Networks, pp. 213–228, March 1994.

  17. E.A. Wan, F. Beaufays. Diagrammatic derivation of gradient algorithms for neural networks. Neural Computation 8: 182–201, 1996.

  18. K.S. Narendra, K. Parthasarathy. Identification and control of dynamical systems using neural networks. IEEE Transactions on Neural Networks, vol. 1, pp. 4–27, March 1990.

  19. C.-C. Ku, K.Y. Lee. Diagonal recurrent neural networks for dynamic systems control. IEEE Transactions on Neural Networks, vol. 6, no. 1, January 1995.

  20. F. Beaufays, E. Wan. Relating real-time backpropagation and backpropagation-through-time: an application of flow graph interreciprocity. Neural Computation 6: 296–306, 1994.

  21. A.D. Back, E. Wan, S. Lawrence, A.C. Tsoi. A unifying view of some training algorithms for multilayer perceptrons with FIR filter synapses. Proc. IEEE Workshop on Neural Networks for Signal Processing, pp. 146–154, 1994.

  22. B.A. Pearlmutter. Two new learning procedures for recurrent networks. Neural Networks Rev., vol. 3, no. 3, pp. 99–101, 1990.

  23. J.L. Elman. Finding structure in time. Cognitive Science 14: 179–211, 1990.

  24. M.C. Mozer. A focused back-propagation algorithm for temporal pattern recognition. Tech. Rep. CRG-TR-88-3, University of Toronto, 1988; also Complex Systems 3: 349–381, 1989.

  25. R.R. Leighton, B.C. Conrath. The autoregressive backpropagation algorithm. Proc. International Joint Conference on Neural Networks, pp. 369–377, 1991.

  26. L.B. Almeida. A learning rule for asynchronous perceptrons with feedback in a combinatorial environment. Proc. International Conference on Neural Networks, vol. 2, pp. 609–618, 1987.

  27. R.J. Williams, D. Zipser. Gradient-based learning algorithms for recurrent networks and their computational complexity. In Backpropagation: Theory, Architectures and Applications, Y. Chauvin and D.E. Rumelhart, Eds. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

  28. T. Uchiyama, K. Shimohara, Y. Tokunaga. A modified leaky integrator network for temporal pattern recognition. Proc. of the International Joint Conference on Neural Networks, vol. 1, pp. 469–475, 1989.


Copyright information

© 1998 Springer-Verlag London Limited

About this paper

Cite this paper

Campolucci, P., Uncini, A., Piazza, F. (1998). A Unifying View of Gradient Calculations and Learning for Locally Recurrent Neural Networks. In: Marinaro, M., Tagliaferri, R. (eds) Neural Nets WIRN VIETRI-97. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-1520-5_3

  • DOI: https://doi.org/10.1007/978-1-4471-1520-5_3

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-1522-9

  • Online ISBN: 978-1-4471-1520-5
