
Comparing implementations of Radial Basis Function Neural Networks on three parallel machines

  • Conference paper
From Natural to Artificial Neural Computation (IWANN 1995)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 930)


Abstract

In this paper we compare implementations of Radial Basis Function (RBF) neural networks on three parallel neuro-computers: the DRA machine (1D), the SMART machine (1D), and the MANTRA machine (2D). RBF networks can be used as probability density function estimators in a classification framework. The amount of computation required to simulate such networks grows rapidly with the size of the learning database; because RBF networks are highly parallel by nature, parallel architectures are ideal candidates for these simulations. We compare the three architectures on the basis of an efficiency measure, and conclude by outlining the different algorithmic constraints imposed by the particularities of each of the three architectures. We also discuss the I/O limitations for real-time classification. Finally, we compare the different machines on two real-database examples.

Part of this work has been funded by the ESPRIT-BRA project number 6891, ELENA-Nerves2, supported by the Commission of the European Communities (DG XIII).
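The classification scheme summarized in the abstract (each training sample acting as one Gaussian RBF unit, with per-class kernel density estimates compared to pick the winning class) can be sketched as follows. This is an illustrative Parzen/PNN-style sketch, not the paper's actual parallel implementation; the function name and the fixed width `sigma` are assumptions for the example.

```python
import numpy as np

def rbf_pdf_classify(X_train, y_train, X_test, sigma=0.5):
    """Classify test points via per-class Gaussian kernel density estimates.

    Every training sample contributes one RBF unit; the class whose
    estimated density is highest at a test point wins (a Parzen/PNN scheme).
    Illustrative sketch only -- not the paper's parallel implementation.
    """
    classes = np.unique(y_train)
    scores = np.empty((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        centers = X_train[y_train == c]  # RBF centers = class-c training samples
        # Squared distances between every test point and every center
        d2 = ((X_test[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        # Mean Gaussian activation = density estimate up to a constant factor
        scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
    return classes[scores.argmax(axis=1)]
```

The per-unit Gaussian activations are independent, which is what makes the network attractive for the 1D and 2D systolic mappings the paper compares: each processing element can evaluate one (or a block of) RBF unit(s) while data is streamed past it.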



Editor information

José Mira, Francisco Sandoval

Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Maria, N., Guérin-Dugué, A., Moreno, J.M., Blayo, F. (1995). Comparing implementations of Radial Basis Function Neural Networks on three parallel machines. In: Mira, J., Sandoval, F. (eds) From Natural to Artificial Neural Computation. IWANN 1995. Lecture Notes in Computer Science, vol 930. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-59497-3_249


  • DOI: https://doi.org/10.1007/3-540-59497-3_249

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-59497-0

  • Online ISBN: 978-3-540-49288-7

  • eBook Packages: Springer Book Archive
