«Systems and Means of Informatics» Scientific journal Volume 30, Issue 2, 2020
Abstracts and Keywords
- Yu. A. Stepchenkov
- Yu. G. Diachenko
- Yu. V. Rogdestvenski
- N. V. Morozov
- D. Yu. Stepchenkov
- D. Yu. Diachenko
Abstract: The paper considers the tolerance of self-timed (ST) complementary metal-oxide-semiconductor (CMOS) combinational circuits to short-term soft errors caused by external sources or internal noise that do not destroy the semiconductor structure. The paper discusses the physical causes leading to soft errors in chips manufactured with 65 nm and finer CMOS processes. It introduces a classification of soft errors in CMOS ST combinational circuits according to their time of appearance and failure type. Self-timed circuits are more resistant to short-term soft errors than their synchronous counterparts due to their two-phase operation discipline, request-acknowledge interaction, and dual-rail information signal coding. The paper proposes circuitry and layout methods that reduce the sensitivity of CMOS ST combinational circuits to soft errors by guaranteeing the absence of bipolar influence of a soft error source on the cells forming dual-rail signals and on their wires in the circuit layout.
Keywords: self-timed circuit; soft error; fault tolerance; CMOS; working phase; spacer; layout
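The dual-rail coding with a spacer mentioned in the abstract can be illustrated with a minimal sketch (an illustration of the general encoding, not code from the paper; all names are hypothetical):

```python
# Dual-rail encoding as used in self-timed (ST) circuits: each logical bit
# is carried on two wires.  (1, 0) = logical 1, (0, 1) = logical 0,
# (0, 0) = spacer (no data), and (1, 1) is forbidden.  A soft error on a
# single wire turns a valid code word into a spacer or a forbidden
# combination, both detectable; only a bipolar error flipping both wires
# produces the other valid code word, which motivates the layout methods
# that rule out bipolar influence on a dual-rail pair.

SPACER = (0, 0)

def encode(bit):
    """Encode a logical bit as a dual-rail pair (true_rail, false_rail)."""
    return (1, 0) if bit else (0, 1)

def decode(pair):
    """Return the logical bit, 'spacer', or 'error' for a dual-rail pair."""
    if pair == (1, 0):
        return 1
    if pair == (0, 1):
        return 0
    if pair == SPACER:
        return "spacer"
    return "error"  # (1, 1) is an invalid combination

def flip(pair, rail):
    """Model a single-event upset on one rail."""
    p = list(pair)
    p[rail] ^= 1
    return tuple(p)

# A single-rail upset on a valid code word never yields the other valid
# code word, so the failure is detectable:
assert decode(flip(encode(1), 0)) == "spacer"
assert decode(flip(encode(1), 1)) == "error"
```
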
- L. P. Plekhanov
- V. N. Zakharov
Abstract: One of the main tasks in creating self-timed circuits is analyzing their self-synchronicity. Known event-based methods cannot fully analyze self-timed circuits of realistic complexity because of the excessive amount of computation required. Within the framework of the functional approach, a universal method based on automatically partitioning the circuit into minimal self-timed cells is proposed. The method radically reduces the required computation and allows one to analyze self-timed circuits of any size.
Keywords: self-timed circuits; analysis of self-timed circuits
- A. A. Grusho
- N. A. Grusho
- V. V. Senchilo
- E. E. Timonina
Abstract: The aim of the paper is to develop inexpensive architectural measures to prevent the mass failure of mobile systems supporting the digital economy.
The possibility of cost-effectively preventing the mass failure of weakly protected mobile devices supporting the digital economy is considered. It is shown that ignoring the problem of mass disruption of mobile devices can cause significant economic damage. Methods for organizing adequate protection based on affordable, inexpensive tools are proposed. It is assumed that mass failure of mobile devices can be achieved by creating a botnet that delivers a malicious impact in C&C mode within a short period of time, while the preparation of the attack, i.e., the introduction of bots and the creation of the botnet, can be carried out over a long time in P2P mode.
Keywords: digital economy; heterogeneous systems; information security
- A. S. Kabanov
- A. A. Vodolazhenko
Abstract: Various aspects of introducing countermeasures against insider activity are considered. A pragmatic approach to assessing the necessity and cost of implementing countermeasures against insider activity in a given organization is shown. A condition for the economic feasibility of introducing such countermeasures is given, approaches to evaluating insider information are proposed, and the basic conclusions are formulated. The article is analytical and can be useful for heads of information security services, teachers, and students.
Keywords: insider; insider activity; value of insider information
- D. Y. Kovalev
- I. A. Shanin
- E. M. Tirikov
Abstract: Neuroinformatics lies at the intersection of computer science and neuroscience, making it possible to use methods and tools from one domain for accumulating, processing, analyzing, and managing data and modeling techniques from the other. Neuroinformatics is currently evolving very rapidly, which leads to a rapid expansion of the range of scientific problems that need to be solved.
This article deals with a number of pressing problems in modeling cognitive functions within the neurophysiology domain. The problems are analyzed from the point of view of neuroinformatics; common pitfalls, methods, processing tools, and implementation issues are examined. In total, four problem statements are discussed, with data sets drawn from resting-state and task-based functional magnetic resonance imaging as well as from electroencephalograms. The methods vary from simple linear models to highly sophisticated deep neural networks. For each problem, the case for distributed computing infrastructures is discussed: high-dimensional data require, on the one hand, a distributed implementation and, on the other hand, computationally intensive methods that call for low-level GPU-based parallelization.
Keywords: data intensive research; neuroinformatics; distributed computing infrastructures
- D. Y. Kovalev
- E. A. Tarasov
- V. N. Zakharov
- N. M. Filimonov
Abstract: The problem of carrying out hypothesis-driven virtual experiments is considered. Methods for managing virtual experiments in existing experiment-management systems are analyzed. Based on the results of this analysis, a virtual experiment life cycle is formulated. Basic operations for managing virtual experiments are presented for each stage of the experiment life cycle, as well as the main stages of an expert's work with the platform for executing a virtual experiment. A software architecture for the platform for managing virtual experiments and hypotheses is proposed, with a description of the platform's main components and the functions that implement the basic operations of the life cycle.
Keywords: virtual experiment life cycle; hypotheses; data intensive research
- S. K. Dulin
- D. A. Nikishin
Abstract: An approach is presented to coordinating heterogeneous applied conceptual schemas that describe individual subject areas with sufficient completeness, based on their unification and integration in the form of a universal, unified geoontology. Possible factors of heterogeneity of particular conceptual schemas are considered, and the requirements and main design solutions for building a unified geoontology intended for use in the context of a multiaspect geodatabase are presented. The data structure for building such a unified geoontology is shown.
Keywords: harmonization of the application conceptual schema; unified geoontology
Abstract: The effectiveness of hybrid and synergetic artificial intelligence systems is achieved mainly by the proper organization of the interaction of their elements, not by increasing the complexity or intelligence of the latter. In hybrid intelligent multiagent systems simulating teams of experts solving problems "at a round table," effective interaction between agents requires that several conditions be fulfilled: consistency of the agents' goals and domain models, a common problem-solving protocol, and compatible message transfer languages. These conditions are especially relevant when a hybrid intelligent multiagent system is built from agents developed by various independent teams. In this case, agents should be able to independently coordinate their goals and domain models and develop a protocol for solving the problem posed to them within the framework of a new class of intelligent systems, namely, cohesive hybrid intelligent multiagent systems. This article discusses the functional structure of such a system.
Keywords: cohesion; hybrid intelligent multiagent system; expert team
Abstract: The problem of setting availability targets for diagnostic systems (diagnostics) in the ITER project is studied. Such systems must provide the required level of availability with which numerous ITER parameters are measured. Several difficulties complicate both the problem statement and its solution: in particular, the uniqueness of the object, the diversity and multiplicity of parameters and diagnostics, the complex interrelations between them, and the absence of analogues in the reliability literature. The article describes a mathematical model of measurement reliability that takes into account both strict limitations and vaguely expressed wishes of the developers.
An effective algorithm for calculating diagnostic availability targets is developed.
An illustrative calculation example was prepared with the help of specially created software that combines the developed algorithm with the possibility of discretionary decision-making.
Keywords: ITER; mathematical models of reliability; large system reliability analysis; availability
Abstract: The next stage in the evolution of data centers is the software-defined data center (SDDC). Early publications on SDDCs show increasing interest in the topic while revealing considerable discrepancies in assessments of the subject and difficulties in understanding its essence. To turn a data center into an SDDC, the availability of both software-defined networking and software-defined storage is insufficient: conceptually, data center control must also be centralized and automated by special software (a so-called "orchestrator") on a policy-driven basis. The desire to reduce expenditures and operational troubles will impel data center owners to move toward the SDDC. At present, however, real SDDC examples are demonstrated only by a few big Internet companies possessing up-to-date infrastructure and the resources to create an adequate orchestrator, since no universal commercial orchestration products are available on the market. Nevertheless, other organizations may choose a compromise, palliative solution: partially software-defined or "double-speed" data centers.
Keywords: control automation; infrastructure as a service (IaaS); orchestrator; policy-driven basis; SDDC; SDN; SDS; software-defined
- V. V. Vakulenko
- A. A. Goncharov
- A. A. Durnovo
- I. M. Zatsman
Abstract: The design of a database (DB) for a bilingual phraseological dictionary that supports the generation of both paper and digital versions depends largely on the structure of the dictionary entry. This structure defines the DB architecture in the following aspects: (i) the connectedness of the components of a single dictionary entry (for a bilingual phraseological dictionary, the components are an idiom and its meanings represented by translation variants, examples, commentary, etc.); (ii) the ability to add hyperlinks between two or more entries and their components; and (iii) searching, visualizing, and editing entries. Additionally, one of the requirements in designing lexicographical DBs is a separate description of the logical data structure and of the forms in which these data are represented during visualization in accordance with the user's query.
The goal of this paper is to describe an approach to expanding the range of dictionary entry representation forms by clustering the entries according to a faceted classification and to present the stages of implementing this approach.
Keywords: dictionary entry structure; lexicographical database; data structure; dictionary entry visualization; database design
- A. Yu. Egorova
- I. M. Zatsman
- V. V. Kosarik
- V. A. Nuriev
Abstract: The paper describes an experiment focused on studying the instability of neural machine translation (NMT). Over the course of a year, an array of text fragments in Russian was repeatedly translated into French with a time step of one month, using Google's NMT system. The experiment reveals the instability of NMT: translations of a given text fragment tend to change over time, but not always with an improvement in quality. The generated translations were linguistically annotated, which uncovered several different types of NMT instability. The annotation employed a previously designed classification of machine translation errors, altered to meet the objectives of the experiment, whose ultimate goal was to obtain a frequency distribution of the different types of NMT instability.
The first step of the experiment, described in this paper, was limited to categorizing the NMT instability. As empirical data, the experiment uses Russian-French annotations generated in a supracorpora database; each annotation contains a fragment of the source Russian text, its translation into French, and a description of the translation errors occurring there.
Keywords: machine translation; instability; translation monitoring; linguistic annotation; instability types
- A. A. Zatsarinny
- Yu. S. Ionenkov
Abstract: The article is devoted to evaluating the contribution of information systems (IS) to the effectiveness of the corresponding organizational systems (ministries and departments). A general methodological approach to assessing this contribution is considered, taking into account the features, principles, and conditions for building the organizational systems. The authors present a list of performance indicators for each of the three groups of generalized IS effectiveness indicators (the index of rationality of organizational structures, the IS performance indicators, and the indicators of organizational and technical level). A method for evaluating the contribution of IS to the effectiveness of organizational systems based on the analytic hierarchy process is proposed.
Keywords: organizational system; information system; efficiency; indicator; criterion; technology
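The analytic hierarchy process mentioned in the abstract derives priority weights from pairwise comparison matrices. A minimal sketch (an illustration of the general technique, not the authors' implementation; the matrix and indicator names are hypothetical) using the row geometric-mean approximation:

```python
# Analytic hierarchy process (AHP): given a pairwise comparison matrix on
# Saaty's 1-9 scale, where matrix[i][j] states how much more important
# criterion i is than criterion j, derive a normalized priority vector.
# The row geometric-mean method is a standard approximation of the
# principal-eigenvector weights.
import math

def ahp_priorities(matrix):
    """Approximate the AHP priority vector via row geometric means."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparisons of three generalized IS effectiveness
# indicator groups:
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

weights = ahp_priorities(comparisons)
assert abs(sum(weights) - 1.0) < 1e-9  # priorities are normalized
```

In a full AHP evaluation, such weight vectors are computed at each level of the criteria hierarchy and combined multiplicatively to rank alternatives.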
- A. V. Bosov
- A. P. Suchkov
Abstract: The article proposes a variant of a prospective architecture for an information and analytical situation center (IASC) for managing strategic planning processes in the field of national security. The solution is based on previously published conceptual approaches and on an analysis of the technical, technological, organizational, and legal factors that have a fundamental impact on the IASC project. The proposed architectural solution accounts both for the selected factors of influence and for the previously developed functional scheme of the IASC, based on the principles of situational analysis in strategic planning processes in the field of national security. The technical basis of the solution is provided by a group of modern technologies used in building software-defined data processing centers. Such a data center must accommodate the IASC subject area, which involves special methods of situational analysis, wide coverage of heterogeneous information sources, and flexible interaction with departmental systems, including those processing restricted-access information.
Keywords: strategic planning; situation center; architectural solutions
Abstract: The inclusion of scientific research in the digital sphere is largely conditioned by the possibility of creating an effective system of widely available scientific services. Such a system should manage the processes of automated selection and provision of relevant services and support the researcher's various formal and informal communications with the state, the scientific community, and business. The effectiveness of such a system depends directly on the completeness and integrity of an information data model oriented toward the relevant methods of analysis, which, in turn, is ensured by the correct choice of a methodology for managing scientific services that takes into account the totality of supporting processes. This model is based on the methods of situational management and situational analysis, using digital platform tools that support real-time decision-making processes.
Keywords: scientific services; system of scientific services; information model; situational analysis
- I. M. Adamovich
- O. I. Volkov
Abstract: The article continues a series of works devoted to a technology for supporting concrete historical research. The technology is based on the principles of co-creation and crowdsourcing and is designed for a wide range of users who are not professional historians or biographers. This article describes and justifies an approach to modeling the mechanism, included in the technology, for automated estimation of the trustworthiness of processed information, with a view to the technology's further development. The proposed approach is to appropriately modify the semantic net model, which is based on the concepts of graphodynamics and the Barabasi-Albert model, and to support a mechanism for simulating user activity. Using this model, the effectiveness of iterative algorithms implementing a new mechanism for automated estimation of the trustworthiness of concrete historical information was experimentally tested, and the influence of the iteration depth constraint on the average trustworthiness of the calculated estimates was evaluated.
Keywords: modeling; distributed technology; reliability of information; historical-biographical fact; automated procedure
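The Barabasi-Albert model referenced in the abstract grows a scale-free network by preferential attachment: each new node links to existing nodes with probability proportional to their current degree. A minimal sketch (an illustration of the general model, not the authors' code; all names are hypothetical):

```python
# Barabasi-Albert preferential attachment: start from m seed nodes; each
# new node attaches m edges to distinct existing nodes chosen with
# probability proportional to their degree.  The degree-weighted pool
# lists each node once per incident edge, so uniform sampling from it
# realizes the degree-proportional choice.
import random

def barabasi_albert(n, m, seed=None):
    """Return the edge list of a BA graph: n nodes, m edges per new node."""
    rng = random.Random(seed)
    edges = []
    pool = list(range(m))  # seed nodes get one initial ticket each
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:          # m distinct targets
            chosen.add(rng.choice(pool))
        for target in chosen:
            edges.append((new, target))
            pool.append(target)         # target's degree grew by 1
        pool.extend([new] * m)          # new node starts with degree m
    return edges
```

Each new node contributes exactly m edges, so a graph with n nodes has (n - m) * m edges; early nodes accumulate high degree, producing the heavy-tailed degree distribution that graphodynamic semantic-net models exploit.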
Abstract: The review presents the basics of methodological support for technologies of commodity-money circulation in the digital environment (CMC-technologies) implemented by means of personal e-banks (PEBs) owned by individuals and corporate e-banks (CEBs) owned by legal entities. Personal and corporate e-banks are specialized artificial intelligence systems implemented on portable computer devices (smartphones, tablets) and stationary computers. The applied purpose of each CMC-technology is defined by a system of rules that direct and control the actions of the parties to a contract on commodity-money circulation in the digital environment (CMC-agreement).
The rules implemented in CMC-technologies are based on the assumption that owners of PEBs and CEBs are granted the right to extend credit by law, the exercise of which is controlled by the regulator bank. The CMC-technologies allow lenders to use only their own funds. The rules for executing each CMC-agreement are controlled by software interacting with digital twins that track the events defined in the agreement. The CMC-technologies are aimed at improving the economic security of deals and at reducing the influence of factors that decrease the commodity capacity of money.
Keywords: technology of commodity-money circulation in the digital environment; personal e-bank; corporate e-bank; contract on commodity-money circulation in the digital environment