Informatics and Applications
2020, Volume 14, Issue 4, pp 47-54
DETERMINISTIC AND RANDOMIZED METHODS OF ENTROPY PROJECTION FOR DIMENSIONALITY REDUCTION PROBLEMS
- Y. S. Popkov
- A. Y. Popkov
- Y. A. Dubnov
Abstract
The work is devoted to the development of deterministic and randomized projection methods for dimensionality reduction problems. In the deterministic case, the authors develop a parallel reduction procedure that minimizes the Kullback-Leibler cross-entropy subject to a condition on information capacity, based on the gradient projection method. In the randomized case, the authors solve the problem of feature space reduction. The idea of applying projection procedures to the reduction of a data matrix is implemented in the proposed method of randomized entropy projection, where the authors use the principle of preserving the average distances between points in the high- and low-dimensional spaces. The problem reduces to finding a probability distribution that maximizes the Fermi entropy subject to a constraint on the average distance between points.
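A schematic illustration of the two optimization problems described above may help the reader; the notation below is introduced here for illustration only and is not taken from the paper. In the deterministic case, the reduction can be sketched as

  \min_{Q} \; D_{KL}(Q \,\|\, P) = \sum_{i} q_i \ln \frac{q_i}{p_i} \quad \text{subject to} \quad C(Q) \ge C_{*},

where P stands for a distribution associated with the original high-dimensional data, Q for the reduced one, and C(Q) >= C_* for the condition on information capacity; the gradient projection method alternates a gradient step on the cross-entropy with a projection onto this feasible set. In the randomized case, a sketch of the entropy-maximization problem is

  \max_{W} \; H_{F}(W) = -\sum_{k} \bigl[ w_k \ln w_k + (1 - w_k) \ln(1 - w_k) \bigr] \quad \text{subject to} \quad \sum_{k} w_k d_k = \bar{d},

where H_F is the Fermi entropy, d_k are distances between pairs of points in the low-dimensional space, and \bar{d} is the average distance between the corresponding high-dimensional points that is to be preserved.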
References (13)
- Bruckstein, A. M., D. L. Donoho, and M. Elad. 2009. From sparse solutions of systems of equations to sparse modeling of signals and images. SIAM Rev. 51(1):34–81.
- Kendall, M. G., and A. Stuart. 1961. The advanced theory of statistics. London: Charles Griffin. Vol. 2. 676 p.
- Jolliffe, I. 2002. Principal component analysis. New York, NY: Springer. 488 p. doi:10.1007/b98835.
- Polyak, B. T., and M. V. Khlebnikov. 2017. Principal component analysis: Robust versions. Automat. Rem. Contr. 78:490–506.
- Bingham, E., and H. Mannila. 2001. Random projection in dimensionality reduction: Applications to image and text data. 7th ACM SIGKDD Conference (International) on Knowledge Discovery and Data Mining Proceedings. ACM. 245–250. doi:10.1145/502512.502546.
- Vempala, S. S. 2004. The random projection method. DIMACS ser. in discrete mathematics and theoretical computer science. Providence, RI: American Mathematical Society. Vol. 65. 105 p.
- Popkov, Y. S., Y. A. Dubnov, and A. Y. Popkov. 2018. Entropy dimension reduction method for randomized machine learning problems. Automat. Rem. Contr. 79(11):2038–2051.
- Kullback, S., and R. A. Leibler. 1951. On information and sufficiency. Ann. Math. Stat. 22(1):79–86.
- Popkov, Y. S., and A. Y. Popkov. 2019. Cross-entropy optimal dimensionality reduction with a condition on information capacity. Dokl. Math. 100:420–422.
- Magnus, J. R., and H. Neudecker. 1988. Matrix differential calculus with applications in statistics and econometrics. Chichester - New York - Brisbane - Toronto - Singapore: John Wiley & Sons. 393 p.
- Popkov, Y. S. 2020. Asymptotic efficiency of maximum entropy estimates. Dokl. Math. 102:350–352. doi:10.1134/S106456242004016X.
- Ioffe, A. D., and V. M. Tikhomirov. 1984. Teoriya ekstremal'nykh zadach [Theory of extremal problems]. Moscow: Nauka. 481 p.
- Popkov, Y. S. 1995. Macrosystems theory and its applications. Lecture notes in control and information sciences ser. Berlin-Heidelberg: Springer-Verlag. Vol. 203. 327 p.
About this article
Cover Date
2020-12-30
DOI
10.14357/19922264200407
Print ISSN
1992-2264
Publisher
Institute of Informatics Problems, Russian Academy of Sciences
Key words
dimensionality reduction; Kullback-Leibler cross-entropy; entropy
Authors
Y. S. Popkov, A. Y. Popkov, and Y. A. Dubnov
Author Affiliations
Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences, 44-2 Vavilov Str., Moscow 119333, Russian Federation
V. A. Trapeznikov Institute of Control Sciences, Russian Academy of Sciences, 65 Profsoyuznaya Str., Moscow 117997, Russian Federation
ORT Braude College, Karmiel 2161002, Israel
National Research University Higher School of Economics, 20 Myasnitskaya Str., Moscow 101000, Russian Federation