Institute of Informatics Problems, Russian Academy of Sciences



«Systems and Means of Informatics»
Scientific journal
Volume 30, Issue 4, 2020


Abstracts and Keywords

SOFTWARE OF RESEARCH IN STATISTICAL DATA ANALYSIS
  • M. P. Krivenko

Abstract: When developing statistical methods of data analysis for observation classification problems in diagnostic medicine, sociology, text processing, and image recognition, it is necessary to study the viability of the data models used and to experiment with data and the algorithms for processing them. To do this, one has to turn to the capabilities of computer technology and create appropriate software. The article discusses the experience of developing a version of software aimed at solving data classification problems. The composition of the modules is given, which ensures the construction and use of probabilistic data models, the estimation of their parameters, and the study of the effectiveness of the proposed decision-making procedures. Examples of solving specific applied problems are provided, and the software elements used are briefly described. The conditions and possible ways of further developing the structure, composition, and content of the statistical analysis procedures are characterized.

Keywords: software; statistical data analysis; Delphi; applications in various areas
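
The kind of experiment such software supports (fitting a probabilistic model to labeled observations and estimating the error of the resulting decision rule) can be sketched in Python; the two-class Gaussian setup and all numbers below are illustrative stand-ins, not material from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic classes with different means (a stand-in for real observations).
X0 = rng.normal(0.0, 1.0, size=(200, 2))
X1 = rng.normal(2.0, 1.0, size=(200, 2))

def fit_gaussian(X):
    # Maximum-likelihood estimates of a multivariate normal model.
    return X.mean(axis=0), np.cov(X, rowvar=False)

def log_density(x, mean, cov):
    d = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d + logdet + len(x) * np.log(2 * np.pi))

m0, c0 = fit_gaussian(X0)
m1, c1 = fit_gaussian(X1)

def classify(x):
    # Bayes rule with equal priors: pick the class with the higher likelihood.
    return 0 if log_density(x, m0, c0) >= log_density(x, m1, c1) else 1

errors = sum(classify(x) != 0 for x in X0) + sum(classify(x) != 1 for x in X1)
error_rate = errors / 400
```

With the class means two units apart in each coordinate, the empirical error rate lands near the Bayes error of roughly 8%.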

AVERAGE PROBABILITY OF ERROR IN CALCULATING WAVELET-VAGUELETTE COEFFICIENTS WHILE INVERTING THE RADON TRANSFORM
  • A. A. Kudryavtsev
  • O. V. Shestakov

Abstract: Image reconstruction methods based on decomposing the image function in a special wavelet basis and subsequently thresholding the decomposition coefficients are used to solve computational tomography problems. Their attractiveness lies in adaptation to spatial inhomogeneities of images and the possibility of reconstructing local areas of the image from incomplete projection data, which is of key importance, for example, for medical applications where it is undesirable to expose a patient to an unnecessary dose of radiation. The analysis of the errors of these methods is an important practical task, since it allows one to assess the quality of both the methods themselves and the equipment used. The paper considers the wavelet-vaguelette decomposition method for reconstructing tomographic images in a model with additive Gaussian noise. The order of the loss function based on the average probability of error in calculating the wavelet coefficients is estimated.

Keywords: Radon transform; wavelet-vaguelette decomposition; thresholding; loss function
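
The thresholding step mentioned in the abstract can be illustrated with a minimal Python sketch; the synthetic coefficients, the noise level, and the universal threshold sigma * sqrt(2 ln n) are assumptions of this illustration, not the paper's loss-function analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 1024, 0.1

# Stand-in for noisy decomposition coefficients: a few large "signal"
# coefficients plus additive Gaussian noise, as in the model considered.
coeffs = np.zeros(n)
coeffs[:10] = 5.0
noisy = coeffs + rng.normal(0.0, sigma, n)

# Universal threshold t = sigma * sqrt(2 ln n) and soft thresholding.
t = sigma * np.sqrt(2 * np.log(n))
denoised = np.sign(noisy) * np.maximum(np.abs(noisy) - t, 0.0)

# Empirical probability of error in a coefficient: a zero coefficient kept,
# or a nonzero coefficient zeroed out by the threshold.
wrong = np.sum((coeffs == 0) & (denoised != 0)) + np.sum((coeffs != 0) & (denoised == 0))
avg_error = wrong / n
```

At this noise level almost all pure-noise coefficients fall below the threshold while the signal coefficients survive, so the empirical error probability is close to zero.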

APPLICATION OF CLUSTERING IN DEPLOYMENT OF MOBILE ACCESS POINTS IN AIR-GROUND WIRELESS NETWORKS
  • E. G. Medvedeva
  • E. M. Khayrov
  • N. A. Polyakov
  • Yu. V. Gaidamaka

Abstract: The paper provides an overview of tasks that arise in wireless networks with mobile base stations located on unmanned aerial vehicles (UAV).
The authors choose two methods of adaptive navigation based on user clustering for a comparative analysis of the effectiveness of network deployment: the k-means method and the particle swarm method. The solution to the optimization problem is the set of UAV positions that maximizes the coverage probability in the communication provision area under restrictions on interference from neighboring base stations. The application of the methods is illustrated for the scenario of a concert, defined as a case of providing communication between participants of a mass event in an open area.

Keywords: UAV; air-ground network; aerial-terrestrial communication; particle swarm; k-means; coverage probability
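
The k-means deployment step can be sketched as follows; the synthetic user crowds, the farthest-point seeding, and the 60-metre coverage footprint are assumptions of this illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in user positions: three crowds on an open-air event field (metres).
users = np.vstack([rng.normal(c, 15.0, size=(100, 2))
                   for c in ([50, 50], [200, 80], [120, 200])])

def farthest_point_init(points, k):
    # Deterministic seeding: start from the first user, then repeatedly
    # take the user farthest from all already chosen centers.
    centers = [points[0]]
    for _ in range(k - 1):
        d = np.min(((points[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(points[np.argmax(d)])
    return np.array(centers)

def kmeans(points, k, iters=50):
    # Plain Lloyd iterations: assign each user to the nearest UAV, then
    # move each UAV to the centroid of its cluster.
    centers = farthest_point_init(points, k)
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

uav_xy, labels = kmeans(users, k=3)

# Coverage proxy: share of users within a 60-metre footprint of their
# serving UAV (the radius is an assumption of this sketch).
dist = np.linalg.norm(users - uav_xy[labels], axis=1)
coverage = float(np.mean(dist <= 60.0))
```

With well-separated crowds the UAVs settle at the crowd centroids and nearly all users fall inside the footprint.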

FUZZY CONTROL OF HETEROGENEOUS THINKING OF THE HYBRID INTELLIGENT MULTIAGENT SYSTEM'S AGENTS
  • S. V. Listopad
  • S. B. Rumovskaya

Abstract: The paper is devoted to the development of the fuzzy inference subsystem for controlling the processes of collective heterogeneous thinking of agents of a hybrid intelligent multiagent system by an agent-facilitator. It allows the agent-facilitator to organize the work of the agents of the system in accordance with S. Kaner's diamond of participatory decision-making model.
The model divides the group dynamics of the system into three sequential phases: the divergent thinking stage, at which alternative solutions to the problem are developed; the groan stage, at which the agent-facilitator uses methods that enhance "mutual understanding" between the agents; and the convergent thinking stage, at which the proposed alternatives are classified, ranked, and refined to make an agreed decision. Thanks to this mechanism, the relevance of the system to the practice of collective problem solving by experts under the guidance of a decision maker, with the mediation of a facilitator, increases. In combination with the hybrid component of the intelligent system and the multiagent approach underlying its architecture, modeling of collective heterogeneous thinking provides it with the ability to solve practical problems without significant simplification in a dynamic environment that limits the time to develop and make a decision.

Keywords: heterogeneous thinking; hybrid intelligent multiagent system; fuzzy inference system; expert team

BIOSIGNAL STATISTICAL ANALYSIS IN THE STUDY OF HUMAN VISUAL WORKING MEMORY
  • A. V. Erofeeva
  • T. V. Zakharova

Abstract: The article is devoted to an EEG (electroencephalography) study of the connectivity of cortical areas involved in keeping visual information in working memory. VAR (vector autoregression) modeling was used to describe the signals obtained from the brain zones associated with working memory. Brain connections were estimated with Partial Directed Coherence (PDC), a frequency-domain measure of Granger causality, and then compared using the Wilcoxon signed-rank test. The intensity of connections was found to depend on the task being executed.

Keywords: EEG; functional connectivity; vision working memory; PDC; VAR-model; Wilcoxon signed-rank test; Granger causality
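
The paired comparison of connectivity estimates described above can be sketched with SciPy; the synthetic PDC values are stand-ins for estimates obtained from fitted VAR models, and the effect size is invented for illustration:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(3)

# Stand-in PDC estimates for one cortical connection in two conditions,
# measured over 30 trials (real values would come from fitted VAR models).
pdc_rest = rng.beta(2, 8, 30)                       # weaker coupling at rest
pdc_task = pdc_rest + rng.normal(0.15, 0.03, 30)    # stronger during the task

# Paired nonparametric comparison, as in the study.
stat, p_value = wilcoxon(pdc_rest, pdc_task)
significant = p_value < 0.05
```

Because every trial shows a positive shift under the task, the signed-rank test rejects the null hypothesis of equal connectivity.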

CONSTRUCTING PROCESS MODELS REPRESENTED BY SIMPLE PETRI NETS
  • I. Yu. Teryokhina
  • A. A. Grusho
  • E. E. Timonina
  • S. Ya. Shorgin

Abstract: The paper deals with the problem of "workflow mining." Workflow mining comprises numerous techniques for discovering a process model from the process's workflow log. The paper considers process models in terms of simple Petri nets. It is shown that constructing a correct model when a process contains identical tasks is not always an attainable goal. Moreover, it is revealed that when a model has transitions that do not correspond to any process task, the relation between the causal relations detected in the log and the presence of places connecting transitions in the Petri net is violated.

Keywords: Petri nets; workflow mining; process modeling
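
The causal relations that workflow mining detects in a log can be sketched in a few lines of Python (alpha-algorithm-style relations on a toy log; the log itself is invented for illustration):

```python
# A toy workflow log: each trace is the task sequence of one case.
log = [("a", "b", "c", "d"), ("a", "c", "b", "d"), ("a", "e", "d")]

# Directly-follows relation x > y: x is immediately followed by y in some trace.
follows = {pair for trace in log
           for pair in zip(trace, trace[1:])}

# Causal relation x -> y (alpha-algorithm style): x > y holds but y > x does not.
# These are the relations that suggest a place between transitions x and y.
causal = {(x, y) for (x, y) in follows if (y, x) not in follows}
```

Here "b" and "c" follow each other in different traces, so no causal relation (and hence no connecting place) is inferred between them, while "a" -> "b" and "e" -> "d" are detected as causal.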

END-TO-END INFORMATION SECURITY OF PRIVATE CLOUD COMPUTING
  • A. A. Grusho
  • A. V. Nikolaev
  • V. O. Piskovski
  • V. V. Senchilo
  • E. E. Timonina

Abstract: The paper discusses building a secure enterprise cloud out of insecure components. The most important scenario addressed is remote work by the organization's employees. The solution is based on provision of a simple subcloud that implements secure interactions of remote employees among themselves, with cloud services, and with resources in the global Internet.

Keywords: information security; cloud computing; security of remote work on unsecure terminals

ON DECODING ALGORITHMS FOR GENERALIZED REED-SOLOMON CODES
  • S. M. Ratseev
  • O. I. Cherevatenko

Abstract: The paper is devoted to decoding algorithms for generalized Reed-Solomon codes that are based on algorithms for Reed-Solomon codes. The Gao, Sugiyama, and Berlekamp-Massey (Peterson-Gorenstein-Zierler) algorithms are given. The first of these algorithms belongs to syndrome-free decoding algorithms; the others, to syndrome decoding algorithms. The relevance of these algorithms is that they are applicable for decoding Goppa codes, which are the basis of some promising postquantum cryptosystems. These algorithms are applicable to Goppa codes over an arbitrary field, as opposed to the well-known Patterson decoding algorithm for binary Goppa codes.

Keywords: error-correcting codes; Reed-Solomon codes; Goppa codes; code decoding
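
The syndrome computation that underlies the syndrome decoding algorithms can be sketched over a toy prime field; GF(7), the generator polynomial, and the injected error below are illustrative choices, not the paper's constructions:

```python
p, alpha = 7, 3   # tiny prime field GF(7); 3 is a primitive element mod 7

def poly_eval(coeffs, x):
    # Horner evaluation of a polynomial (lowest-order coefficient first) mod p.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

# Generator polynomial with roots alpha^1 and alpha^2:
# g(x) = (x - 3)(x - 2) = x^2 + 2x + 6 over GF(7).
g = [6, 2, 1]

# A codeword is a message polynomial times g(x); here m(x) = 1 + 2x.
codeword = poly_mul([1, 2], g)

def syndromes(r):
    # S_j = r(alpha^j); all syndromes are zero iff r is a codeword.
    return [poly_eval(r, pow(alpha, j, p)) for j in (1, 2)]

# Inject a single error e = 5 at position i = 3; then S_j = e * alpha^(i*j),
# so the error location follows from S_2 / S_1 = alpha^i.
received = list(codeword)
received[3] = (received[3] + 5) % p
S1, S2 = syndromes(received)
location = (S2 * pow(S1, p - 2, p)) % p   # S2/S1 via the Fermat inverse
```

The same syndrome relations, computed over the code's actual field, are the starting point of the Peterson-Gorenstein-Zierler and Berlekamp-Massey decoders.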

MULTICORE HYBRID RECURRENT ARCHITECTURE EXPANSION ON FPGA
  • Yu. A. Stepchenkov
  • N. V. Morozov
  • Yu. G. Diachenko
  • D. V. Khilko
  • D. Yu. Stepchenkov

Abstract: The paper presents the results of modifying the multicore hybrid architecture for recurrent signal processing (HARSP) and discusses its approbation as a prototype on the next-generation HAN Pilot Platform development board with the FPGA (field-programmable gate array) Intel Arria 10 SoC 10AS066K3F40E2SG, on the basis of a register-transfer-level VHDL (VHSIC hardware description language) model. The hybrid architecture for recurrent signal processing contains a control level, implemented as a von Neumann processor, and an operational level represented by a data-flow processor with eight computing cores. A capsule distributor combines all computing cores; it provides explication of an algorithmic capsule into a parallel-serial command flow and processes 32-bit data. Hardware implementation of the control level on the dual-core Cortex-A9 processor radically improved HARSP performance and increased data processing accuracy owing to the use of 32-bit fixed-point operands. Approbation of the modified HARSP VHDL model on a typical data processing application, namely, isolated word recognition, proved the high efficiency of HARSP in real-time operation.

Keywords: recurrent signal processor; multicore hybrid architecture; data-flow; VHDL-model; FPGA; development board; isolated word recognizer

QUADTREE BASED COLOR IMAGE SEGMENTATION METHOD
  • Yu. A. Maniakov
  • A. I. Sorokin

Abstract: The paper presents a color image segmentation method and an algorithm based on a quadtree. The proposed method consists of several steps. The first of them is border detection based on three-channel color and two masks. Then, the authors apply a thinning algorithm to decrease the area of the found boundary. The segmentation algorithm is divided into two parts. In the first part, the image is split into as many segments as possible. In the second part, the segment merging algorithm finds each segment's neighbor identifiers using an FSM (finite-state machine) table and links the identifiers to create a graph. The results of color image segmentation obtained on the basis of the described algorithm are presented.

Keywords: color image; segmentation; quadtree; edge; border; thinning; color reduction; split; merge; pixel
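
The splitting half of such a segmentation can be sketched as a recursive quadtree decomposition by block homogeneity (the merge step with the FSM table is paper-specific and not reproduced here; the toy image and variance threshold are assumptions of this sketch):

```python
import numpy as np

# A toy grayscale image: dark left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 1.0

def quadtree_split(x0, y0, size, threshold=0.01, min_size=1):
    """Recursively split a square block while it is inhomogeneous.

    Returns the homogeneous leaf blocks as (x, y, size) triples.
    """
    block = img[y0:y0 + size, x0:x0 + size]
    if size <= min_size or block.var() <= threshold:
        return [(x0, y0, size)]
    half = size // 2
    leaves = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        leaves += quadtree_split(x0 + dx, y0 + dy, half, threshold, min_size)
    return leaves

leaves = quadtree_split(0, 0, 8)
```

For this image one split suffices: the four 4 x 4 quadrants are each uniform, so the tree stops with four leaves.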

SOFTWARE PACKAGE FOR POSITIONING OF ACOUSTIC BOTTOM SYSTEMS
  • V. A. Smirnov
  • N. N. Skvortsova
  • E. M. Konchekov
  • V. A. Larichev
  • G. A. Maximov

Abstract: The problem of accurate positioning of the receiving elements of bottom seismic streamers is considered. A new software package for processing radio-physical signals, based on correlation and spectral analysis algorithms, is developed. The seismic streamers were tested in Gelendzhik Bay on the Black Sea. The positioning results obtained with the developed algorithms from the measured acoustic distances are presented and compared with the echo sounder measurement data. It is shown that measurement of acoustic distances using the developed software meets the requirements for positioning of two-dimensional marine seismic systems.

Keywords: hydroacoustics; monitoring; marine seismic; solid-state bottom digital seismic streamer; positioning; correlation analysis; spectral analysis
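
The correlation-analysis step of measuring an acoustic distance can be sketched as time-delay estimation by cross-correlation; the sample rate, sound speed, and synthetic signal below are assumptions of this illustration, not parameters of the developed package:

```python
import numpy as np

fs, c = 48_000.0, 1500.0   # sample rate (Hz) and sound speed in water (m/s)
rng = np.random.default_rng(5)

# Stand-in pinger signal, received delayed by 96 samples (2 ms -> 3 m) in noise.
sig = rng.normal(size=2048)
true_delay = 96
received = np.zeros(4096)
received[true_delay:true_delay + sig.size] = sig
received += 0.1 * rng.normal(size=received.size)

# Correlation-based time-delay estimate: the lag with the largest
# cross-correlation between the reference and the received signal.
corr = np.correlate(received, sig, mode="valid")
delay = int(np.argmax(corr))
distance = delay / fs * c   # acoustic distance in metres
```

The correlation peak at the true lag converts directly into the propagation distance, which is the quantity compared against the echo sounder data.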

MACHINE TRANSLATION: INDICATOR-BASED EVALUATION OF TRAINING PROGRESS IN NEURAL PROCESSING
  • A. Yu. Egorova
  • I. M. Zatsman
  • M. G. Kruzhkov
  • V. A. Nuriev

Abstract: The paper presents data collected while observing the training progress of a neural machine translation (NMT) engine. The observed training progress received qualitative evaluation based on a set of indicators. Two hundred and fifty text fragments in Russian were used as experimental material for the study. For the duration of one year, these fragments were translated into French every month using the publicly available Google NMT engine. The produced translations were recorded and annotated by language experts in a supracorpora database, which resulted in a series of 12 annotated translations for each of the 250 Russian fragments. The annotations include labels of translation errors, which enables researchers to determine NMT instability types according to changes in translation quality or the lack thereof. The goal of this paper is to describe the newly developed indicator-based approach and to provide an example of its application to the evaluation of neural network training progress.

Keywords: neural machine translation; instability of machine translation; indicator-based evaluation; linguistic annotation; instability types
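
An indicator of the general kind described, counting quality changes across the series of 12 monthly translations of one fragment, can be sketched as follows; the error counts are invented for illustration and are not the paper's indicators:

```python
# Stand-in monthly error-label counts for one Russian fragment across the
# 12 recorded translations (real values come from the annotated database).
errors_per_month = [3, 3, 1, 1, 4, 4, 4, 2, 2, 2, 2, 2]

# A simple instability indicator: months where translation quality changed,
# split into improvements (fewer errors) and regressions (more errors).
changes = [b - a for a, b in zip(errors_per_month, errors_per_month[1:])]
improvements = sum(1 for d in changes if d < 0)
regressions = sum(1 for d in changes if d > 0)
stable = improvements == 0 and regressions == 0
```

A fragment with any regressions alongside improvements would be flagged as unstable, i.e., its translation quality did not change monotonically over the year.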

MECHANISM OF TEMPORAL COMPARISON OF CONCRETE-HISTORICAL FACTS
  • I. M. Adamovich
  • O. I. Volkov

Abstract: The article continues a series of works devoted to a technology for supporting concrete historical research. The technology is based on the principles of co-creation and crowdsourcing and is designed for a wide range of users who are not professional historians or biographers. The article is devoted to further development of the technology by integrating a mechanism for temporal comparison of concrete-historical facts. The main directions of fact comparison in concrete historical research are given. The high significance of the subtask of comparing and linking facts along the time axis, in order to determine the sequence of events in the life of the research object and to clarify their dating, is established. An approach to automating this subtask within the technology, based on the mechanism for automated search for contradictions in concrete-historical information, is proposed. The possibility of reducing the problem of finding admissible time intervals to a linear programming problem is shown. The use of the simplex algorithm for implementing the temporal comparison of concrete-historical facts is justified.

Keywords: concrete historical investigation; distributed technology; facts comparison; historical-biographical fact; linear programming
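
The reduction of admissible-interval search to linear programming can be sketched with scipy.optimize.linprog; the events, minimum gaps, and date bounds below are an invented toy example, not facts from the technology:

```python
from scipy.optimize import linprog

# Toy biographical facts: graduation at least 17 years after birth, and
# first job no earlier than graduation. Documented bounds: birth in
# [1890, 1900], first job in [1915, 1920].
# Variables: t = [birth, graduation, first_job].
A_ub = [[1, -1, 0],   # birth - graduation <= -17
        [0, 1, -1]]   # graduation - first_job <= 0
b_ub = [-17, 0]
bounds = [(1890, 1900), (None, None), (1915, 1920)]

# Earliest admissible graduation year: minimise t[1] (a simplex-type solver
# handles this; SciPy uses the HiGHS LP solver by default).
res = linprog(c=[0, 1, 0], A_ub=A_ub, b_ub=b_ub, bounds=bounds)
earliest_graduation = res.x[1]
```

Solving the same program with the objective negated gives the latest admissible year, so the pair of solutions bounds the admissible interval for the event's dating; an infeasible program signals a contradiction in the facts.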

BLOCKCHAIN TECHNOLOGIES FOR NORMALIZED BUDGET SUPPORT OF NATIONAL PROJECTS
  • A. V. Ilyin
  • V. D. Ilyin

Abstract: Blockchain technologies for normalized budget support (NBS technologies) are considered as means necessary for resource support of the design and implementation of national projects for protection and development of the country's potential. The part of the methodological support for development of NBS technologies considered in the article includes methodologies of situational online budgeting and normalized commodity-money circulation. Online budget planning is considered as a problem of interval cost planning, taking into account the situationally dependent mandatory and orienting requirements for the desired solution. The problem is solved by means of the method of target displacement of the solution in the computational experiment mode. The methodology of normalized commodity-money circulation includes methods for formation and implementation of contractual relations in the digital environment, payment and commodity lending, and online banking based on banks-providers and corporate and personal electronic banks. Blockchain technologies for normalized budget support practically eliminate the unplanned use of budget funds.

Keywords: blockchain technologies for normalized budget support (NBS technologies); resource support of national projects for protection and development of the country's potential; situational online budgeting; normalized commodity-money circulation

PARABOLIC INTEGRODIFFERENTIAL SPLINES AS ACTIVATION FUNCTIONS TO INCREASE THE EFFICIENCY OF INFORMATION PROCESSING BY NEURAL NETWORKS
  • T. K. Biryukova

Abstract: The paper considers a method for increasing the efficiency of information processing by neural networks through the use of the parabolic integrodifferential splines (ID-splines) developed by the author as an activation function (AF) for neurons. If the coefficients of the parabolic ID-splines, along with the weights of the neurons, are trainable parameters of the neural network, then the AF in the form of a parabolic ID-spline changes during the learning process so as to minimize the error function. This increases the accuracy of the neural network's calculations and accelerates its training and operation. The prospects for modifying neural networks with known architectures (such as ResNet) by introducing the ID-spline as AF are analyzed. Apparently, such an approach can improve the quality of functioning of some popular neural networks. It is concluded that parabolic ID-splines as AF can increase the efficiency of artificial intelligence technologies in such tasks as decision making, computer game development, data approximation and prediction (in the financial and social spheres, in science, etc.), classification of information, processing of images and video, computer vision, and processing of texts, speech, and music.

Keywords: artificial intelligence; deep learning; neural network; activation function; spline interpolation; integrodifferential spline; parabolic spline
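
A trainable piecewise-parabolic activation of the general kind described can be sketched in Python; this is a simplified stand-in (a Lagrange quadratic through tabulated knot values, initialised to approximate ReLU), not the author's ID-spline formulation:

```python
import numpy as np

# Knot grid and the tabulated values of the activation; in a network the
# `values` array would be a trainable parameter updated together with the
# neuron weights (names and initialisation are illustrative).
knots = np.linspace(-3.0, 3.0, 13)
values = np.maximum(knots, 0.0)   # start close to ReLU

def spline_activation(x):
    """Piecewise-parabolic interpolation of the (knot, value) table."""
    x = np.clip(x, knots[0], knots[-2])
    i = np.clip(np.searchsorted(knots, x) - 1, 0, len(knots) - 3)
    t = (x - knots[i]) / (knots[1] - knots[0])
    # Quadratic through three consecutive knot values (Lagrange form):
    # exact at t = 0, 1, 2 and smooth in between.
    y0, y1, y2 = values[i], values[i + 1], values[i + 2]
    return y0 * (t - 1) * (t - 2) / 2 - y1 * t * (t - 2) + y2 * t * (t - 1) / 2
```

Because the output is linear in `values`, gradients with respect to the spline coefficients are straightforward, which is what lets the shape of the activation itself be learned along with the network weights.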