Abstract
We investigate an algorithm that finds the point of an affine subspace in the positive orthant closest to a given point with respect to the Kullback–Leibler divergence. The problem is solved by means of the classical Darroch–Ratcliff algorithm (see [1]), while we use ideas from information geometry as founded by Chentsov (see [2]) and Csiszár (see [3]). The main theorem of the present work proves the convergence of this algorithm by a method different from earlier proofs. The proposed algorithm can be applied, e.g., to finding maximum likelihood estimates in an exponential family (see the last section of the paper).
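The iterative scheme of Darroch and Ratcliff [1] (generalized iterative scaling) can be sketched as follows. This is a minimal illustration, not the paper's notation: the function name `kl_projection`, the slack-feature construction, and the example constraints are all illustrative assumptions. The affine subspace is given as {p : F p = b} with a nonnegative constraint matrix F; including an all-ones row with target 1 makes the result a probability vector.

```python
import numpy as np

def kl_projection(q, F, b, iters=500):
    """Sketch of Darroch–Ratcliff generalized iterative scaling (GIS).

    Finds p > 0 satisfying F @ p = b that minimizes the Kullback-Leibler
    divergence D(p || q).  F must have nonnegative entries, and a feasible
    positive solution must exist.  Include an all-ones row in F with
    target 1 in b so that the limit is a probability distribution.
    """
    F = np.asarray(F, dtype=float)
    b = np.asarray(b, dtype=float)
    C = F.sum(axis=0).max()                  # largest column feature mass
    slack = C - F.sum(axis=0)                # slack feature: every column now sums to C
    Fs = np.vstack([F, slack])
    bs = np.concatenate([b, [C - b.sum()]])  # matching target for the slack feature
    p = np.asarray(q, dtype=float).copy()
    for _ in range(iters):
        expect = Fs @ p                      # current expectations of the features
        ratio = np.where(expect > 0, bs / expect, 1.0)
        # multiplicative GIS update: p_i <- p_i * prod_j (b_j / E[f_j])^{f_j(i)/C}
        p = p * np.prod(ratio[:, None] ** (Fs / C), axis=0)
    return p

# Toy example: project the uniform distribution on 3 points onto the
# affine subspace {p : p_1 + p_2 = 0.7, p_1 + p_2 + p_3 = 1}.
p = kl_projection(q=[1/3, 1/3, 1/3],
                  F=[[1, 1, 1],    # normalization constraint
                     [1, 1, 0]],   # mass on the first two points
                  b=[1.0, 0.7])
```

For this example the I-projection can be checked by hand: within each constraint cell the solution stays proportional to q, giving p = (0.35, 0.35, 0.30), and the iteration converges to it geometrically.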
References
Darroch, J. and Ratcliff, D., Generalized iterative scaling for log-linear models, Ann. Math. Stat., 1972, vol. 43, pp. 1470–1480.
Chentsov, N.N., Statisticheskie reshayushchie pravila i optimal’nye vyvody (Statistical Decision Rules and Optimal Conclusions), Moscow: Nauka, 1972.
Csiszár, I., I-divergence geometry of probability distributions and minimization problems, Ann. Probab., 1975, vol. 3, pp. 146–158.
Csiszár, I. and Shields, P., Information Theory and Statistics: A Tutorial, Boston: Now Publishers Inc., 2004.
Additional information
Original Russian Text © D.V. Vinogradov, 2016, published in Nauchno-Tekhnicheskaya Informatsiya, Seriya 2: Informatsionnye Protsessy i Sistemy, 2016, No. 6, pp. 23–29.
Cite this article
Vinogradov, D.V. An algorithm for information projection to an affine subspace. Autom. Doc. Math. Linguist. 50, 133–138 (2016). https://doi.org/10.3103/S0005105516030109