«Systems and Means of Informatics» Scientific journal Volume 29, Issue 2, 2019
Abstracts and Keywords
- A. A. Grusho
- A. A. Zatsarinny
- E. E. Timonina
Abstract: A model of an electronic ledger for recording transactions of participants in economic activity of the digital economy at the regional level is constructed and investigated. A network of situational centers is proposed as the basic infrastructure for implementing the model. The model is based on the tangle, a modernized variant of the blockchain. In the Russian Federation, it is convenient to use centralized consensus, which makes it possible to rely on the partially created infrastructure of situational centers and gives participants in the digital economy state guarantees of security. The tangle-based electronic ledger technology solves the main problems of electronic interaction for each participant in the digital economy: in particular, it provides legal significance of information on agreements and their performance, supports control of the consistency of economic activity, and informs the authorities about the results of this activity. The use of the situational-center infrastructure for electronic ledgers makes it possible to solve many problems of the digital economy connected with providing tools, resources, and mechanisms of information security.
Keywords: information security; blockchain; tangle; digital economy; situational center
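As a minimal illustration of the tangle structure mentioned above, the ledger can be sketched as a directed acyclic graph in which every new transaction approves earlier ones. The class and field names below, as well as the uniform tip selection, are illustrative assumptions rather than the scheme described in the paper.

```python
import hashlib
import json
import random
import time


class TangleLedger:
    """Minimal tangle-style ledger: each new transaction approves up to two
    earlier ones, so the ledger forms a directed acyclic graph (DAG)
    instead of a linear chain of blocks."""

    def __init__(self):
        self.transactions = {"genesis": {"id": "genesis", "payload": None, "approves": []}}

    def _tx_id(self, payload, approves):
        raw = json.dumps({"payload": payload, "approves": approves}, sort_keys=True)
        return hashlib.sha256(raw.encode()).hexdigest()[:16]

    def add_transaction(self, payload):
        # Real tangles select tips with a weighted random walk; a uniform
        # choice of two existing transactions is enough for this sketch.
        parents = random.sample(list(self.transactions), k=min(2, len(self.transactions)))
        tx_id = self._tx_id(payload, parents)
        self.transactions[tx_id] = {"id": tx_id, "payload": payload,
                                    "approves": parents, "ts": time.time()}
        return tx_id


ledger = TangleLedger()
ledger.add_transaction({"from": "firm A", "to": "firm B", "contract": "supply-42"})
ledger.add_transaction({"from": "firm B", "to": "firm C", "contract": "supply-43"})
```

In a centralized-consensus variant, a situational center would act as the node that validates such transactions and confirms the resulting DAG.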
- K. R. Usmanova
- V. V. Strijov
Abstract: The problem of forecasting multiple time series requires detecting the relationships between them. Including related time series in a forecast model improves the forecast quality. This paper introduces the convergent cross mapping (CCM) method used to detect a relationship between time series. The method estimates the accuracy with which one time series can be reconstructed from the other. CCM detects relationships between series not only in full trajectory spaces, but also in certain trajectory subspaces. The computational experiment was carried out on two sets of time series: electricity consumption and air temperature, and oil transportation volume and oil production volume.
Keywords: time series; forecasting; trajectory subspace; phase trajectory; convergent cross mapping
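The core of convergent cross mapping can be sketched as follows: embed one series into a trajectory (phase) space, find the nearest neighbours of each point, and use their time indices to reconstruct the other series. The sketch below follows the standard formulation of CCM; the parameter values and the weighting scheme are assumptions, not the paper's exact algorithm (which also operates in trajectory subspaces).

```python
import numpy as np


def delay_embed(x, dim, tau=1):
    """Time-delay embedding: each row is a point of the reconstructed trajectory."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])


def ccm_score(x, y, dim=3, tau=1):
    """Cross-map skill of y -> x: reconstruct x from nearest neighbours found
    in the trajectory space of y and correlate the estimate with the true x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    Y = delay_embed(y, dim, tau)
    x_target = x[(dim - 1) * tau:]
    estimates = np.empty(len(Y))
    for i, point in enumerate(Y):
        dist = np.linalg.norm(Y - point, axis=1)
        dist[i] = np.inf                      # exclude the point itself
        idx = np.argsort(dist)[: dim + 1]     # dim + 1 nearest neighbours
        w = np.exp(-dist[idx] / (dist[idx[0]] + 1e-12))
        estimates[i] = np.dot(w, x_target[idx]) / w.sum()
    return np.corrcoef(estimates, x_target)[0, 1]
```

A score close to 1 indicates that y carries enough information to reconstruct x, i.e., the series are related.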
Abstract: The popularity of signal processing algorithms using wavelet analysis methods has increased significantly over the past decades. This is explained by the fact that the wavelet decomposition is a convenient mathematical apparatus capable of solving problems in which the use of traditional Fourier analysis is ineffective. The main tasks for which the methods of wavelet analysis are used are signal compression and noise removal. In this case, the most commonly used method is threshold processing of wavelet expansion coefficients, which zeroes coefficients not exceeding a given threshold. The presence of noise and threshold processing procedures inevitably lead to errors in the estimated signal.
The properties of estimates of such errors (mean square risk) have been studied in many papers. In particular, it has been shown that under certain conditions, the risk estimate is strongly consistent and asymptotically normal. When using threshold processing methods, it is usually assumed that the number of wavelet coefficients is fixed. However, in some situations, the sample size is not known in advance and is modeled by a random variable. In this paper, a model with a random number of observations is considered and a class of distributions is described that can be limiting for the mean-square risk estimate.
Keywords: threshold processing; random sample size; mean square risk estimate
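A typical instance of the threshold processing discussed above is soft thresholding of the detail coefficients with the universal threshold. The sketch below uses the PyWavelets package and standard textbook choices (db4 wavelet, MAD noise estimate); it illustrates the procedure only and does not reproduce the paper's random-sample-size model.

```python
import numpy as np
import pywt  # PyWavelets


def wavelet_denoise(signal, wavelet="db4", level=4):
    """Shrink (zero out) wavelet coefficients below the universal threshold
    sigma * sqrt(2 * ln n); sigma is estimated from the finest-scale
    coefficients via the median absolute deviation."""
    signal = np.asarray(signal, dtype=float)
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)
```

The mean square risk of such an estimate is the quantity whose limiting distribution is studied in the paper when the number of observations is random.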
- A. A. Kudryavtsev
- S. I. Palionnaia
- V. S. Shorgin
Abstract: This article continues a series of the authors' works on applying the Bayesian approach to queueing, reliability, and balance models. Within this approach, when complex aggregates are considered, all parameters affecting the functioning of the system are divided into two classes: those contributing to and those preventing the correct functioning of the system. The probabilistic characteristics of the balance index, that is, the ratio of the factors that negatively affect the operation of the system to the positively influencing factors, are studied under the assumption that the factors are random variables with known a priori distributions. In this work, the probabilistic characteristics of the system's balance index are considered in the case when both factors have an a priori generalized Frechet distribution. The results are presented in terms of a special gamma exponential function.
Keywords: Bayesian approach; generalized Frechet distribution; gamma exponential function; balance models; mixed distributions
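The balance index mentioned above can be made explicit in standard notation (the notation is ours; independence of the two factors is assumed here only for illustration): if λ denotes the hindering factor and μ the aiding factor with prior distribution function F_λ and prior density f_μ, then

```latex
% Balance index: ratio of the negative factor to the positive one
\rho = \frac{\lambda}{\mu}, \qquad
P(\rho \le t) \;=\; \int_{0}^{\infty} F_{\lambda}(t\,\mu)\, f_{\mu}(\mu)\, d\mu,
\qquad t > 0 .
```

For a priori generalized Frechet distributions of both factors, the paper expresses such probabilities in closed form through the special gamma exponential function.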
- A. V. Kolesnikov
- S. V. Listopad
- F. G. Maitakov
Abstract: The problems of operational dispatch management of regional power systems are characterized by heterogeneity, partial observability of the control object, and its dynamic nature, which determine the interdependence of the actions performed and the difficulty of correcting erroneous decisions. The mapping methods currently applied do not correspond to the mental image that guides the operator in his or her activities, hinder working and performing actions in the mind, and contribute to errors in the interpretation of data. Therefore, to enhance human intelligence in operational work, it is relevant to create information and communication technologies for computer imitation of cognitive formations, supplementing the operator's natural ability to work with operational technological information with software and hardware mechanisms that extend human thought processes.
Keywords: cognitive hybrid intelligent systems; operational dispatch management; regional dynamic power system
Abstract: The paper analyzes the processes typical for the sphere of geodata processing. These processes are divided into three specific groups, within whose structure elementary information transformations are identified and their features considered. On this basis, a two-level typology of the main types of information transformations is proposed, aimed at generalizing the existing methods of concrete-abstract transformations and at studying the reversibility properties of these mappings in information systems.
Keywords: geodata; informational transformations; generalization of spatial data; digital cartography; geoinformatics
- M. V. Bobyr
- A. E. Arkhipov
- N. A. Milostnaya
Abstract: A fuzzy method of depth map calculation from stereo images based on a composition of the SAD (Sum of Absolute Differences) algorithm and fuzzy inference is considered. A feature of this method is the use of soft arithmetic operators together with fuzzy implication. The accuracy of the depth map construction method is estimated by the RMSE (root mean square error); the best soft operator is the one with the minimum RMSE. A seven-step method of depth map calculation is presented. The proposed method showed that the accuracy of the SAD algorithm increases by 20% when soft operators are used. This conclusion is confirmed by the simulation results presented in the article.
Keywords: stereo vision; depth map; soft computing; SAD; fuzzy logic
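The SAD component of the method can be illustrated by plain block matching on a rectified grayscale stereo pair. The window size, the disparity range, and the absence of any fuzzy post-processing below are simplifying assumptions; the paper's contribution is precisely the fuzzy inference with soft operators applied on top of such a baseline.

```python
import numpy as np


def sad_disparity(left, right, max_disp=64, window=5):
    """Baseline block matching: for each pixel of the left image, choose the
    horizontal shift whose window-wise sum of absolute differences (SAD)
    against the right image is minimal."""
    left = left.astype(np.float32)
    right = right.astype(np.float32)
    h, w = left.shape
    half = window // 2
    disparity = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch_l = left[y - half: y + half + 1, x - half: x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                patch_r = right[y - half: y + half + 1, x - d - half: x - d + half + 1]
                cost = np.abs(patch_l - patch_r).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity
```

Quality is then assessed exactly as in the abstract: the RMSE between the computed disparity map and a ground-truth map.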
Abstract: In the new millennium, the "software-defining" concept has entered information technologies, with increasing number of software-defined objects being involved as discussion subjects: networks, storages, servers, and finally entirely software-defined datacenters. However, despite the multiplying subjects and frequent reiteration of the set expression "software-defined, " its real meaning is not so easy to perceive. Moreover, a sufficiently widespread opinion considers the expression rather a marketing trick than a technical term. Actual encyclopedias do not suggest a clear commonly accepted definition of this phenomenon as well. In the article, an attempt is made to uncover the technical essence of "software-defined," and to formulate for it a plain practical definition. Basing on unbiased analysis of peculiarities of datacenter infrastructure components declared as software-defined, a conclusion was drawn that "software-defined" may be regarded as a technical term and provided with a simple definition in conventional generally accepted notions.
Keywords: control automation; infrastructure as a service (IaaS); policy-based; policy-driven; SDDC; SDN; SDS; software-defined
Abstract: A mathematical model of signal behavior between two successive samples is suggested. New general formulas for finding the sinusoidal criterion of the greatest deviation are derived. A theorem on the reduction of the Nyquist interval, which connects the greatest-deviation criterion with the size of the constant sampling interval, is proved. The results allow one to choose the size of this interval in a well-founded manner.
Keywords: real-time systems; sinusoidal criterion of the greatest deviation of a signal; step or linear approximation of a signal from its samples; admissible approximation error; constant sampling interval
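As a simple illustration of how an admissible error bounds the sampling interval (this is an elementary estimate, not necessarily the criterion derived in the paper): for step (zero-order hold) approximation of a sinusoid s(t) = A sin(2πft + φ) between two successive samples taken h apart,

```latex
% Deviation within one sampling interval, 0 <= tau <= h, with f h <= 1/2:
|s(t+\tau) - s(t)|
  = 2A\,\bigl|\cos\!\bigl(2\pi f t + \pi f \tau + \varphi\bigr)\bigr|\,
        \bigl|\sin(\pi f \tau)\bigr|
  \;\le\; 2A \sin(\pi f h),
% hence an admissible error eps <= 2A is guaranteed whenever
h \;\le\; \frac{1}{\pi f}\,\arcsin\frac{\varepsilon}{2A}.
```

Bounds of this type show how a greatest-deviation criterion translates into a constant sampling interval smaller than the Nyquist interval alone would suggest.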
- S. I. Spivak
- L. A. Lukyanov
- N. D. Morozkin
Abstract: Nowadays, combining enterprise business processes within a single information system is a widespread trend. Optimizing the work of an enterprise under the conditions of a flexible market and rapid technical progress is impossible without a single information system. There are many ERP (Enterprise Resource Planning) systems designed to solve the main problem of automating all enterprise business processes. However, none of them is able to take into account the enterprise's existing automation mechanisms. These mechanisms are often implemented not only as documents containing technical instructions for particular types of business processes but also as software developed with outdated technologies. In this paper, the authors consider their own ERP system, which embodies a completely new approach to the application, development, and organization of a single information space for all business units. The proposed approach is based on the sequential replacement of the existing implementation of business processes by platform subsystems. A metaobject approach to the implementation of business processes is considered, which can be applied by subject-area specialists working directly at the enterprise. The proposed ERP system does not change the logic of existing business processes but ensures their integration, transparency, and a common data model.
Keywords: MRP; MRPII; ERP; MES; CRUD; common information space; resource planning; information system modeling; platform; software architecture
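The metaobject approach can be pictured as describing a business entity declaratively and letting the platform generate storage and CRUD operations from that description. Everything below (class names, the field-to-SQL mapping) is an illustrative assumption, not the structure of the authors' system.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class MetaObject:
    """Declarative description of a business entity; the platform derives
    persistence and CRUD operations from it instead of hard-coded procedures."""
    name: str
    fields: Dict[str, type]
    relations: List[str] = field(default_factory=list)

    def create_table_sql(self) -> str:
        type_map = {int: "INTEGER", float: "REAL", str: "TEXT"}
        cols = ", ".join(f"{n} {type_map.get(t, 'TEXT')}" for n, t in self.fields.items())
        return f"CREATE TABLE IF NOT EXISTS {self.name} (id INTEGER PRIMARY KEY, {cols});"


purchase_order = MetaObject(
    name="purchase_order",
    fields={"number": str, "supplier": str, "total": float},
    relations=["invoice"],
)
print(purchase_order.create_table_sql())
```

A subject-area specialist would edit such descriptions rather than program the business process directly, which is what allows a platform to replace existing implementations step by step.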
- I. S. Ozhereliev
- O. V. Senko
- N. N. Kiseleva
Abstract: The paper describes a new method of outlier detection in pattern recognition tasks. The authors define an outlier as an object that deviates significantly from the other objects of the same class. The method is based on the simultaneous use of the evaluated object's class estimates and the integral distortion of the recognition algorithm caused by the evaluated object. The usefulness of the developed technique is shown for the task of predicting whether an inorganic compound of composition A³⁺B³⁺C²⁺O₄ is formed under ordinary conditions.
The method may be used to detect erroneous observations in order to improve training information in various recognition tasks.
Keywords: outliers; databases; recognition; instability of training; inorganic compounds
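One plausible reading of combining class estimates with the integral distortion caused by an object is sketched below: an object is suspicious if the model gives weak support to its own class and if removing it improves cross-validated quality. The classifier, the weighting, and the leave-one-out scheme are our assumptions, not the authors' exact construction.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def outlier_scores(X, y):
    """Higher score = more likely an erroneous (outlying) observation."""
    model = LogisticRegression(max_iter=1000).fit(X, y)
    # (a) support of the object's own class by the model trained on all data
    own = model.predict_proba(X)[np.arange(len(y)), np.searchsorted(model.classes_, y)]

    # (b) change of cross-validated quality when the object is removed
    full_quality = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    distortion = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        q = cross_val_score(LogisticRegression(max_iter=1000), X[mask], y[mask], cv=5).mean()
        distortion[i] = q - full_quality   # positive: the object was harming quality

    return (1.0 - own) + distortion
```

Objects with the largest scores are candidates for exclusion or re-checking before retraining, which is how such a technique helps to improve training information.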
- A. A. Zatsarinny
- A. I. Garanin
- V. A. Kondrashev
- K. I. Volovich
- S. I. Malkovsky
Abstract: The necessity of using hybrid solutions in the creation of high-performance computing systems is substantiated. A brief description of the hybrid high-performance computing complex (HHPCC) of FRC CSC RAS is given and its enlarged block diagram is presented. The paper offers a methodical approach to estimating the reliability of the HHPCC, on the basis of which reliability calculations for the allocated functional subsystems are carried out. Separately, the reliability of the "computing infrastructure" of the HHPCC (without peripheral elements) was evaluated. Recommendations for improving the reliability of the functional subsystems are given.
Keywords: hybrid high-performance computing system; reliability; functional subsystems; failure; equivalent circuit for reliability calculation
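Reliability of allocated functional subsystems is typically computed from an equivalent circuit of series and parallel (redundant) groups. The sketch below shows only these two standard formulas; the component values are invented for illustration and are not the FRC CSC RAS figures.

```python
def series_reliability(reliabilities):
    """A series group fails if any component fails: R = prod(R_i)."""
    r = 1.0
    for p in reliabilities:
        r *= p
    return r


def parallel_reliability(reliabilities):
    """A redundant (parallel) group works while at least one component works:
    R = 1 - prod(1 - R_i)."""
    q = 1.0
    for p in reliabilities:
        q *= 1.0 - p
    return 1.0 - q


# Example equivalent circuit: two redundant compute nodes in parallel,
# in series with a storage subsystem and a network switch.
compute = parallel_reliability([0.95, 0.95])
print(series_reliability([compute, 0.99, 0.995]))   # overall reliability
```

Such calculations make it easy to see which functional subsystem limits the overall reliability and where redundancy pays off.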
- A. A. Goncharov
- O. Yu. Inkova
- M. G. Kruzhkov
Abstract: The paper considers methodological principles of annotating linguistic units in parallel corpora using supracorpora databases. Supracorpora databases are a novel information resource in linguistics that allows researchers to save the results of linguistic analysis of corpus data in the form of annotations structured according to the research objectives. When dealing with parallel corpora, the annotation procedure consists of four basic stages: annotation objects lookup; definition of the linguistic unit and its context (both in original and translated texts); definition of the linguistic unit's attributes (both in original and translated texts); and combination of two linguistic units into a translation correspondence and definition of its attributes. The paper summarizes the previously described annotation techniques, examines functional potential of supracorpora databases, and concludes that it is possible to apply the developed methodology to a wide variety of research objects.
Keywords: supracorpora databases; faceted classifications; linguistic annotation; annotation methodology; contrastive linguistics
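The four annotation stages listed in the abstract map naturally onto a simple data model: a unit with its context and attributes on each side, plus a correspondence object that pairs them. The field names below are illustrative assumptions about what a supracorpora database record might hold, not the actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class AnnotatedUnit:
    """A linguistic unit with its context and attributes (original or translation)."""
    text: str
    context: str
    attributes: Dict[str, str] = field(default_factory=dict)


@dataclass
class TranslationCorrespondence:
    """Pair of units (original, translation) plus attributes of the pair itself."""
    original: AnnotatedUnit
    translation: AnnotatedUnit
    attributes: Dict[str, str] = field(default_factory=dict)


record = TranslationCorrespondence(
    original=AnnotatedUnit("unit in the original", "…sentence containing the unit…",
                           {"category": "connective"}),
    translation=AnnotatedUnit("unit in the translation", "…translated sentence…",
                              {"category": "connective"}),
    attributes={"relation": "full equivalent"},
)
```

Structured records of this kind are what makes the saved results of linguistic analysis queryable across different research objects.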
- I. M. Adamovich
- O. I. Volkov
Abstract: This article continues a series on a technology for supporting concrete historical investigation that is built on the principles of co-creation and crowdsourcing and based on T-parser, a system for automatic extraction of facts from historical and biographical texts, and on the n-Factograph program complex, which supports the research process and is designed for a broad range of users who are not professional historians or biographers. The article describes and justifies the physical data storage scheme that uses the principles of a distributed technology environment and prevents unlimited growth of the central storage volume without loss of important information. The efficiency of the proposed measures was checked by simulation modeling.
The adequacy of the model was justified by comparing and analyzing its runs with different values of such a document attribute as its public value. As a result of these tests, a well-founded conclusion was drawn about the efficiency of the proposed principles of physical data storage for the technology of concrete historical research support.
Keywords: concrete historical research; distributed technology; partially replicated data placement; model of distributed system; central storage
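The placement idea described above (keeping the central storage from growing without bound while not losing important information) can be reduced to a rule of the following kind; the attribute name and the threshold are assumptions introduced only to make the rule concrete.

```python
def storage_nodes(document, public_value_threshold=0.5):
    """Return the nodes where a document should be physically stored:
    documents of high public value are replicated to the central storage,
    the rest remain only on the node of the researcher who produced them."""
    if document["public_value"] >= public_value_threshold:
        return ["central_storage", document["origin_node"]]
    return [document["origin_node"]]


print(storage_nodes({"id": 17, "public_value": 0.8, "origin_node": "node-3"}))
```

It is rules of this kind whose effect on storage volume and data availability can be assessed in a simulation model.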
Abstract: A model of a cooperative problem solver based on digital twins is proposed. Participants of the cooperation (called "dt-infs") are considered learnable machines for solving problems. In the process of solving problems, such a machine, endowed with a finite set of states and a goal (understood as a set of solvable problems in the subject area), interacts with other members of the cooperation and with environmental elements through the exchange of unified messages. For each dt-inf, the subject area is represented by a task graph designed for searching for resolving structures and formalizing knowledge about the tasks. The set of graph vertices is composed of task constructive objects. Each vertex has a memory, represented by the memory of a task (simple or composite) or of a task area. An edge of the task graph is a pair of vertices with a nonempty intersection of their memories. The load of an edge is determined by the set of all pairs of memory elements included in the intersection. The proposed model is intended for use in the development of information technologies and online services.
Keywords: digital twins; cooperative problem solver; task constructive object; task graph; information technology; online service
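The task graph described in the abstract can be built directly from the memories of the task constructive objects: vertices are the objects, and an edge appears whenever two memories intersect. In the sketch below the edge load is represented simply by the shared memory elements, and the task names are illustrative.

```python
from itertools import combinations


def build_task_graph(tasks):
    """tasks: mapping from a task constructive object to its memory (a set).
    Returns the edges of the task graph: pairs of vertices with a nonempty
    intersection of memories, labelled with that intersection (the edge load)."""
    edges = {}
    for (name_a, mem_a), (name_b, mem_b) in combinations(tasks.items(), 2):
        common = mem_a & mem_b
        if common:
            edges[(name_a, name_b)] = common
    return edges


tasks = {
    "load_data":   {"raw_table", "schema"},
    "clean_data":  {"raw_table", "clean_table"},
    "build_model": {"clean_table", "model"},
}
print(build_task_graph(tasks))
```

Searching for resolving structures can then be organized as traversal of this graph along the shared-memory edges.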