Abstract
This paper consists of three sections. The first section gives an overview of the basic information functions, their interpretations, and dynamic information measures that have been recently developed for lifetime distributions. The second section summarizes the information features of univariate Pareto distributions, tabulates transformations of a Pareto random variable under which information measures of numerous distributions can be obtained, and gives a few characterizations of the generalized Pareto distribution. The final section summarizes information measures for order statistics and tabulates expressions for the Shannon entropies of order statistics for a wide range of distributions.
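As an illustration of the kind of closed-form result surveyed in the second section, the Shannon (differential) entropy of a Pareto Type I distribution with shape α and scale x_m is H = ln(x_m/α) + 1/α + 1. The sketch below (function names are ours, not from the chapter) checks this formula against a Monte Carlo estimate obtained by inverse-transform sampling:

```python
import math
import random

def pareto_entropy_closed(alpha, xm):
    # Standard closed form for the Shannon differential entropy of
    # Pareto Type I with shape alpha and scale xm:
    #   H = ln(xm / alpha) + 1/alpha + 1
    return math.log(xm / alpha) + 1.0 / alpha + 1.0

def pareto_entropy_mc(alpha, xm, n=200_000, seed=0):
    # Monte Carlo estimate of H = E[-ln f(X)], X ~ Pareto(alpha, xm).
    # Inverse-transform sampling: X = xm * U**(-1/alpha) for U ~ Uniform(0,1).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = xm * rng.random() ** (-1.0 / alpha)
        # log-density of Pareto Type I: ln f(x) = ln(alpha) + alpha*ln(xm) - (alpha+1)*ln(x)
        log_f = math.log(alpha) + alpha * math.log(xm) - (alpha + 1.0) * math.log(x)
        total -= log_f
    return total / n
```

For example, with α = 2 and x_m = 1 the closed form gives H = ln(1/2) + 1.5 ≈ 0.807, and the Monte Carlo estimate agrees to two decimal places.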
© 2006 Birkhäuser Boston
Asadi, M., Ebrahimi, N., Hamedani, G.G., Soofi, E.S. (2006). Information Measures for Pareto Distributions and Order Statistics. In: Balakrishnan, N., Sarabia, J.M., Castillo, E. (eds) Advances in Distribution Theory, Order Statistics, and Inference. Statistics for Industry and Technology. Birkhäuser Boston. https://doi.org/10.1007/0-8176-4487-3_13
Print ISBN: 978-0-8176-4361-4
Online ISBN: 978-0-8176-4487-1