
Asymptotic Law of Likelihood Ratio for Multilayer Perceptron Models

  • Conference paper
Advances in Neural Networks - ISNN 2008 (ISNN 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5263)


Abstract

We consider regression models involving multilayer perceptrons (MLPs) with one hidden layer and Gaussian noise. The data are assumed to be generated by a true MLP model, and the MLP parameters are estimated by maximizing the likelihood of the model. When the number of hidden units of the model is over-estimated, the Fisher information matrix of the model is singular, and the asymptotic behavior of the likelihood ratio (LR) statistic is unknown, or the statistic can be divergent if the set of possible parameters is too large. This paper deals with this case and gives the exact asymptotic law of the LR statistic. Namely, if the parameters of the MLP lie in a suitable compact set, we show that the LR statistic converges to the maximum of the square of a Gaussian process indexed by a class of limit score functions.
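The setting of the abstract can be illustrated numerically. The sketch below (our own illustration, not the authors' code; all names and values are hypothetical) simulates data from a true one-hidden-unit MLP with Gaussian noise, then fits both a correctly sized and an over-parameterized two-unit model by least squares. With the noise variance profiled out of the Gaussian likelihood, the LR statistic reduces to LR_n = n log(SSE_small / SSE_large):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def mlp(params, x, h):
    # One-hidden-layer MLP: f(x) = b + sum_k beta_k * tanh(w_k * x + c_k)
    b, beta, w, c = params[0], params[1:1 + h], params[1 + h:1 + 2 * h], params[1 + 2 * h:]
    return b + np.tanh(np.outer(x, w) + c) @ beta

# Simulate data from a "true" one-hidden-unit MLP with Gaussian noise
n = 200
x = rng.uniform(-2.0, 2.0, n)
theta_true = np.array([0.5, 1.0, 2.0, -0.3])        # b, beta1, w1, c1
y = mlp(theta_true, x, 1) + rng.normal(0.0, 0.3, n)

def sse(params, h):
    # Residual sum of squares: with sigma^2 profiled out, maximizing
    # the Gaussian likelihood is equivalent to minimizing the SSE.
    r = y - mlp(params, x, h)
    return r @ r

def fit(h, extra_starts=()):
    # Multistart Nelder-Mead, since the loss surface is non-convex
    starts = [rng.normal(0.0, 1.0, 1 + 3 * h) for _ in range(5)] + list(extra_starts)
    best = None
    for s in starts:
        res = minimize(sse, s, args=(h,), method="Nelder-Mead",
                       options={"maxiter": 4000, "fatol": 1e-10})
        if best is None or res.fun < best.fun:
            best = res
    return best

fit1 = fit(1)
b, be, w, c = fit1.x
# Embed the h=1 optimum in the h=2 parameter space (beta2 = 0), so the
# larger model's fit can only improve and the LR statistic is >= 0.
embed = np.array([b, be, 0.0, w, 0.5, c, 0.0])
fit2 = fit(2, extra_starts=[embed])

# Profiled Gaussian log-likelihood gives LR_n = n * log(SSE_1 / SSE_2)
lr = n * np.log(fit1.fun / fit2.fun)
print(f"LR statistic (1 vs 2 hidden units): {lr:.3f}")
```

Repeating this over many simulated data sets would trace out the sampling distribution whose limit the paper characterizes; the compactness assumption on the parameter set is what keeps that limit finite despite the singular Fisher information.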




Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rynkiewicz, J. (2008). Asymptotic Law of Likelihood Ratio for Multilayer Perceptron Models. In: Sun, F., Zhang, J., Tan, Y., Cao, J., Yu, W. (eds) Advances in Neural Networks - ISNN 2008. ISNN 2008. Lecture Notes in Computer Science, vol 5263. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87732-5_21


  • DOI: https://doi.org/10.1007/978-3-540-87732-5_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87731-8

  • Online ISBN: 978-3-540-87732-5

  • eBook Packages: Computer Science, Computer Science (R0)
