
Bootstrap Methods: Another Look at the Jackknife

  • Chapter
Breakthroughs in Statistics

Part of the book series: Springer Series in Statistics (PSS)

Abstract

We discuss the following problem: given a random sample \(X = (X_1, X_2, \ldots, X_n)\) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F) on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case \(R(X, F) = \theta(\hat F) - \theta(F)\), θ some parameter of interest.) A general method, called the “bootstrap”, is introduced and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
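The bootstrap idea summarized above can be illustrated with a minimal Monte Carlo sketch (not taken from the chapter; the function name, replication count, and data are illustrative assumptions): resample the observed data with replacement from the empirical distribution \(\hat F\), recompute the statistic of interest on each resample, and use the spread of the bootstrap replicates to estimate the sampling variance, e.g. for the sample median.

```python
import numpy as np

def bootstrap_variance(x, statistic=np.median, n_boot=2000, seed=None):
    """Monte Carlo bootstrap estimate of the variance of a statistic.

    Each bootstrap sample is drawn with replacement from the observed data,
    i.e. an i.i.d. sample from the empirical distribution F-hat. The variance
    of the recomputed statistic across replications estimates its sampling
    variance under F.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(x, size=n, replace=True)  # resample from F-hat
        reps[b] = statistic(sample)
    return reps.var(ddof=1)

# Illustrative data (not from the chapter): variance of the sample median
data = np.random.default_rng(0).standard_normal(25)
print(bootstrap_variance(data, n_boot=5000, seed=1))
```

The jackknife, by contrast, recomputes the statistic on the n leave-one-out samples rather than on random resamples; the chapter shows this amounts to a linear approximation of the bootstrap calculation above.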





Copyright information

© 1992 Springer-Verlag New York, Inc.

About this chapter

Cite this chapter

Efron, B. (1992). Bootstrap Methods: Another Look at the Jackknife. In: Kotz, S., Johnson, N.L. (eds) Breakthroughs in Statistics. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-4380-9_41


  • DOI: https://doi.org/10.1007/978-1-4612-4380-9_41

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-94039-7

  • Online ISBN: 978-1-4612-4380-9

  • eBook Packages: Springer Book Archive
