Part of the book series: Lecture Notes in Economics and Mathematical Systems (LNE, volume 497)

Abstract

For both consistency and asymptotic normality of the GMM estimator it is not necessary to assume that \(\hat \theta \) precisely minimizes the GMM objective function (2.1.6). Andrews (1997) points out that for Theorem 2 (consistency) \(\hat \theta \) is required to be within \(o_p (1)\) of the global minimum, and for Theorem 3 (asymptotic normality) it is required to be within \(o_p (n^{-1/2})\), where \(X_n = o_p (a_n)\) conveniently abbreviates \(\operatorname{plim} X_n / a_n = 0\) (cf. Amemiya, 1985, p. 89). The estimator \(\hat \theta \) is usually obtained by iterative numerical optimization methods such as the Newton-Raphson algorithm (cf. Amemiya, 1985, ch. 4.4). Starting from some value in the parameter space, this procedure produces a sequence of estimates \(\tilde \theta _j \) \((j = 0, 1, 2, \ldots)\) which, ideally, converges to the global minimum of the objective function. A typical Newton-Raphson iteration for the minimization problem (2.1.6) has the form

$$ \tilde \theta _{j + 1} = \tilde \theta _j - \left[ \left( \tfrac{1}{n}\sum\limits_{i = 1}^n G\left( Z_i ,\tilde \theta _j \right) \right)^\prime \hat W \left( \tfrac{1}{n}\sum\limits_{i = 1}^n G\left( Z_i ,\tilde \theta _j \right) \right) \right]^{ - 1} \times \left( \tfrac{1}{n}\sum\limits_{i = 1}^n G\left( Z_i ,\tilde \theta _j \right) \right)^\prime \hat W \left( \tfrac{1}{n}\sum\limits_{i = 1}^n g\left( Z_i ,\tilde \theta _j \right) \right) $$
(4.1.1)
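
Iteration (4.1.1) translates directly into code. The following is a minimal sketch in Python/NumPy, assuming that \(g(Z_i, \theta)\) denotes the r × 1 moment function, \(G(Z_i, \theta)\) its r × q Jacobian with respect to \(\theta\), and \(\hat W\) the estimated weighting matrix; the function and variable names are illustrative and not taken from the text.

```python
import numpy as np

def gmm_objective(theta, Z, g, W_hat):
    """GMM objective (2.1.6): g_bar' W_hat g_bar with g_bar = (1/n) sum_i g(Z_i, theta)."""
    g_bar = np.mean([g(z, theta) for z in Z], axis=0)
    return float(g_bar @ W_hat @ g_bar)

def newton_step(theta, Z, g, G, W_hat):
    """One iteration of (4.1.1)."""
    g_bar = np.mean([g(z, theta) for z in Z], axis=0)   # (1/n) sum_i g(Z_i, theta), r-vector
    G_bar = np.mean([G(z, theta) for z in Z], axis=0)   # (1/n) sum_i G(Z_i, theta), r x q
    # theta_{j+1} = theta_j - [G_bar' W G_bar]^{-1} G_bar' W g_bar
    return theta - np.linalg.solve(G_bar.T @ W_hat @ G_bar, G_bar.T @ W_hat @ g_bar)

def iterate(theta0, Z, g, G, W_hat, tol=1e-8, max_iter=100):
    """Repeat (4.1.1) until the change in the parameter vector is negligible."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        theta_new = newton_step(theta, Z, g, G, W_hat)
        if np.max(np.abs(theta_new - theta)) < tol:
            return theta_new
        theta = theta_new
    return theta
```

When the moment functions are linear in \(\theta\) (for instance, linear instrumental variables moments), a single step of (4.1.1) already reaches the minimum of the then quadratic objective; for nonlinear models the step is repeated until the iterates settle down.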

This algorithm is guaranteed to converge to the global minimum if the objective function is convex, which, however, is the exception rather than the rule for the nonlinear models encountered in microeconometric applications, as discussed in the previous chapter. Otherwise the iteration routine may get stuck in a local minimum, which renders the parameter estimators inconsistent and alters their asymptotic distribution. To circumvent this problem, Andrews (1997) proposes an optimization algorithm that guarantees consistency and asymptotic normality of the resulting GMM estimators provided that r > q holds. Andrews’ method is described in detail in the next section.
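
As a purely illustrative diagnostic for this failure mode (and not Andrews’ 1997 algorithm, which is the subject of the next section), one can run the iteration sketched above from several starting values and compare the resulting objective values; if different starts end at different stationary points, the routine has been trapped in a local minimum at least once. The helper below reuses the hypothetical `iterate` and `gmm_objective` functions from the previous sketch.

```python
def best_of_starts(starting_values, Z, g, G, W_hat):
    """Run iteration (4.1.1) from several user-chosen starting values and keep the
    candidate with the smallest objective value; a crude guard against local minima,
    not a consistency guarantee."""
    candidates = [iterate(theta0, Z, g, G, W_hat) for theta0 in starting_values]
    return min(candidates, key=lambda th: gmm_objective(th, Z, g, W_hat))
```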

Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Inkmann, J. (2001). Computation of GMM Estimators. In: Conditional Moment Estimation of Nonlinear Equation Systems. Lecture Notes in Economics and Mathematical Systems, vol 497. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-56571-7_4

  • DOI: https://doi.org/10.1007/978-3-642-56571-7_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-41207-6

  • Online ISBN: 978-3-642-56571-7
