Abstract
We consider the regression model (3.2), where the errors \(\varepsilon_{i}\) are independently distributed, \(\varepsilon_{i}\) having the p.d.f. \(\bar{\varphi}_{x_{i}}(\cdot)\).
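As a toy illustration (not taken from the chapter): in a regression model \(y_i = \eta(x_i,\theta) + \varepsilon_i\) with independent Gaussian errors of known variance, maximum likelihood estimation of \(\theta\) reduces to least squares. The one-parameter response \(\eta(x,\theta)=e^{-\theta x}\), the sample sizes, and the grid-search minimizer below are all assumptions made for the sketch, standing in for a Newton-type iteration.

```python
import math
import random

# Hypothetical model: eta(x, theta) = exp(-theta * x), errors ~ N(0, sigma^2).
random.seed(0)
theta_true, sigma = 0.7, 0.05
xs = [0.1 * i for i in range(1, 41)]
ys = [math.exp(-theta_true * x) + random.gauss(0.0, sigma) for x in xs]

def neg_log_lik(theta):
    # Up to an additive constant, -log L(theta) is the weighted sum of squares,
    # so the ML estimator coincides with the LS estimator here.
    return sum((y - math.exp(-theta * x)) ** 2 for x, y in zip(xs, ys)) / (2 * sigma ** 2)

# Crude grid search over theta in (0, 2]; a stand-in for a proper optimizer.
grid = [0.01 * k for k in range(1, 201)]
theta_hat = min(grid, key=neg_log_lik)
print(round(theta_hat, 2))
```

With the fixed seed, the estimate lands close to the true value 0.7, as the asymptotic theory of the chapter leads one to expect for this sample size.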
Notes
1. A more standard condition, used in more general situations than regression models, is that the support of the density of the observations should not depend on the value \(\theta\) of the parameters in the model generating these observations.
2. See Sect. 4.6 for a brief discussion of the application of maximum likelihood estimation to dynamical systems, for which the independence assumption does not hold.
3. \(\mu_{x}\) and \(\varphi_{x,\theta}(\cdot)\) only need to satisfy the condition (4.34).
4.
5. That is, \(\sqrt{N}(\hat{\theta}_{1}^{N} - \bar{\theta})\) is bounded in probability; see page 33.
6. Typically, this means that unknown initial values have been replaced by zero in the dynamical system; the ML method is then called conditional ML, where "conditional" refers to this choice of initial values.
7. Many methods exist; they go by different names (recursive ML, recursive pseudo-linear regression, recursive generalized LS, extended LS…) depending on the type of model to which they are applied and on the type of approximations used in the implementation of the Newton step.
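The conditional-ML and recursive ideas of notes 6 and 7 can be sketched on the simplest dynamical model. This is an illustrative assumption, not the chapter's construction: for a Gaussian AR(1) process, ML conditional on the initial value reduces to least squares, and the estimate can be updated recursively as observations arrive.

```python
import random

# Assumed toy model: y_t = a * y_{t-1} + e_t with e_t ~ N(0, 1).
random.seed(1)
a_true = 0.6
y = [0.0]  # unknown initial value replaced by zero -> "conditional" ML
for _ in range(500):
    y.append(a_true * y[-1] + random.gauss(0.0, 1.0))

# Recursive conditional LS: a_hat is refreshed after each new observation,
# a one-dimensional stand-in for the recursive Newton-type schemes above.
s_xy = s_xx = 0.0
a_hat = 0.0
for t in range(1, len(y)):
    s_xy += y[t - 1] * y[t]
    s_xx += y[t - 1] ** 2
    if s_xx > 0.0:
        a_hat = s_xy / s_xx
print(round(a_hat, 2))
```

In higher dimensions the scalar ratio becomes a matrix recursion, and the various recursive methods listed above differ in how they approximate that Newton step.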
© 2013 Springer Science+Business Media New York
Pronzato, L., Pázman, A. (2013). Asymptotic Properties of M, ML, and Maximum A Posteriori Estimators. In: Design of Experiments in Nonlinear Models. Lecture Notes in Statistics, vol 212. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-6363-4_4
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-6362-7
Online ISBN: 978-1-4614-6363-4
eBook Packages: Mathematics and Statistics (R0)