SNNS (Stuttgart Neural Network Simulator)

  • Chapter in: Neural Network Simulation Environments

Abstract

We describe SNNS, a neural network simulator for Unix workstations developed at the University of Stuttgart, Germany. Our network simulation environment is a tool to generate, train, test, and visualize artificial neural networks. The simulator consists of three major components: a simulator kernel that operates on the internal representation of the neural networks, a graphical user interface based on X-Windows to interactively create, modify, and visualize neural nets, and a compiler that generates large neural networks from a high-level network description language.
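
The kernel/interface/compiler split described above can be illustrated with a minimal, hypothetical sketch in Python. Everything below (the Kernel class, create_unit, create_link, propagate, generate_layered_net, the [2, 3, 1] layer description) is an illustrative assumption and is not the SNNS kernel interface or its network description language; the sketch only shows the idea of a small kernel that owns the internal network representation and a separate generator that expands a compact, high-level description into units and links.

import math
import random

class Kernel:
    """Toy simulator kernel: owns the internal network representation
    (units, biases, links) and propagates activations through it."""

    def __init__(self):
        self.activations = []   # one activation value per unit
        self.biases = []        # one bias per unit
        self.in_links = []      # per unit: list of (source_unit, weight)

    def create_unit(self):
        self.activations.append(0.0)
        self.biases.append(0.0)
        self.in_links.append([])
        return len(self.activations) - 1

    def create_link(self, source, target, weight):
        self.in_links[target].append((source, weight))

    def propagate(self, inputs, input_ids, output_ids, order):
        # Clamp the input units, then update the remaining units in topological order.
        for uid, x in zip(input_ids, inputs):
            self.activations[uid] = x
        for uid in order:
            net = self.biases[uid] + sum(self.activations[s] * w
                                         for s, w in self.in_links[uid])
            self.activations[uid] = 1.0 / (1.0 + math.exp(-net))  # logistic unit
        return [self.activations[uid] for uid in output_ids]

def generate_layered_net(kernel, layer_sizes, seed=0):
    """Stand-in for a network-description compiler: expands a compact layered
    description such as [2, 3, 1] into fully connected units and links."""
    rng = random.Random(seed)
    layers = [[kernel.create_unit() for _ in range(n)] for n in layer_sizes]
    for lower, upper in zip(layers, layers[1:]):
        for target in upper:
            for source in lower:
                kernel.create_link(source, target, rng.uniform(-0.5, 0.5))
    return layers

if __name__ == "__main__":
    k = Kernel()
    layers = generate_layered_net(k, [2, 3, 1])
    update_order = [u for layer in layers[1:] for u in layer]
    print(k.propagate([1.0, 0.0], layers[0], layers[-1], update_order))

The X-Windows front end and the training procedures of the real simulator lie outside this toy sketch; it is meant only to make the separation between the kernel's internal representation and the high-level network generation concrete.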

Copyright information

© 1994 Springer Science+Business Media New York

About this chapter

Cite this chapter

Zell, A. et al. (1994). SNNS (Stuttgart Neural Network Simulator). In: Skrzypek, J. (eds) Neural Network Simulation Environments. The Kluwer International Series in Engineering and Computer Science, vol 254. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-2736-7_9

  • DOI: https://doi.org/10.1007/978-1-4615-2736-7_9

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-6180-0

  • Online ISBN: 978-1-4615-2736-7

  • eBook Packages: Springer Book Archive
