Abstract: Maintaining and improving existing large-scale systems that are based on relational databases has proven to be a challenging task. Among many other aspects, it is crucial to develop actionable methods for estimating costs and durations when assessing new feature requirements. This is a very frequent activity during the evolution of large database systems and data warehouses. It requires analyzing program code, data structures and business-level objectives at the same time, which is a daunting task if done manually by experts. Our industrial partner started to develop a static database analysis software package that would automate and ease this process in order to make more accurate estimations. The goal of this work was to create a quality assessment model that can effectively help developers assess the data flow (lineage) quality and the database structure quality of data warehouse (DWH) and online transaction processing (OLTP) database systems. Based on the relevant literature, we created separate models for these two interconnected topics, which were then evaluated by independent developers. The evaluation showed that the models are suitable for implementation; they are now included in Clarity, a commercial product developed by our industrial partner.
Keywords: database systems; data warehouses; cost estimation; software quality models; data flow; database structure; data lineage
Abstract: This paper proposes a fuzzy brain emotional learning classifier and applies it to medical diagnosis. To improve generalization and learning ability, the classifier combines a fuzzy inference system with a brain emotional learning model. Meanwhile, unlike a brain emotional learning controller, a novel definition of the reward signal is developed that is more suitable for classification. In addition, stable convergence is guaranteed by means of the Lyapunov stability theorem. Finally, the proposed method is applied to leukemia classification and the diagnosis of heart disease. A comparison of the proposed method with other algorithms shows that this classifier can be viewed as an efficient way to support medical decision-making and diagnosis.
Keywords: brain emotional learning classifier; neural network; medical diagnosis
Abstract: This paper presents the development of a hybrid multi-criteria decision-making (MCDM) approach for Product Lifecycle Management (PLM) software selection, an essential part of implementing the PLM concept. The approach is based on a hybrid MCDM process that integrates the Fuzzy Analytic Hierarchy Process (FAHP) and the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE). The Fuzzy AHP has been applied in order to overcome the vagueness of decision-makers’ judgments when assessing the relative significance of the criteria, whereas the PROMETHEE method has been applied in order to evaluate the software alternatives. The paper’s findings should indicate the broad possibilities of the proposed model for an objective evaluation of PLM software, on the basis of total suitability with respect to the global goal under the established criteria, and its capability to efficiently overcome the data vagueness that decision-makers face during the process.
Keywords: PLM software; MCDM; software selection; Fuzzy AHP; PROMETHEE
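While the abstract does not detail the ranking step, PROMETHEE II follows a standard computation: pairwise preference degrees are aggregated with criterion weights, and alternatives are ordered by net outranking flow (leaving flow minus entering flow). The sketch below illustrates this with the simple "usual" preference function on hypothetical data; the scores, weights and preference-function choice are assumptions for illustration, not values from the paper.

```python
import numpy as np

def promethee_ii(scores, weights):
    """Rank alternatives by PROMETHEE II net outranking flows.

    scores: (n_alternatives, n_criteria) matrix, higher is better.
    weights: criterion weights summing to 1 (e.g. obtained from Fuzzy AHP).
    Uses the 'usual' preference function: any positive difference on a
    criterion counts as full preference on that criterion.
    """
    n = scores.shape[0]
    pi = np.zeros((n, n))  # aggregated preference degree pi(a, b)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            pi[a, b] = np.sum(weights * (d > 0))
    phi_plus = pi.sum(axis=1) / (n - 1)   # leaving (positive) flow
    phi_minus = pi.sum(axis=0) / (n - 1)  # entering (negative) flow
    return phi_plus - phi_minus           # net flow: higher = better

# hypothetical evaluation of three PLM packages on three criteria
scores = np.array([[7.0, 5.0, 8.0],
                   [6.0, 9.0, 5.0],
                   [8.0, 6.0, 6.0]])
weights = np.array([0.5, 0.3, 0.2])  # e.g. derived via Fuzzy AHP
print(promethee_ii(scores, weights))
```

Note that net flows always sum to zero, so the ranking is purely relative; in practice the preference function (linear, Gaussian, with thresholds) is chosen per criterion.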
Abstract: In this paper, we propose TopicAE, a simple autoencoder designed to perform topic modeling on input texts. Topic modeling has grown in popularity especially in recent years, with a large number of digital documents and social media contributions available. These texts usually contain useful information, and methods in the area of topic modeling offer novel approaches to their automatic summarization, browsing and searching. The main idea of topic modeling is to uncover hidden semantic structures in the input collection of texts. There are several topic models for extracting standard topics, their evolution through time, and the hierarchical structure of the topics. In this paper, we apply techniques known from the area of neural networks. Our TopicAE model can be applied to solve all the tasks mentioned above. The performance of the proposed model was also tested; the results showed that TopicAE can solve the topic modeling problem and outperforms standard methods (Latent Semantic Indexing and Latent Dirichlet Allocation) according to the evaluation metrics.
Keywords: autoencoder; deep learning; neural networks; topic modeling
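The abstract does not specify TopicAE's architecture. As a rough illustration of the underlying idea only — compressing bag-of-words vectors through a low-dimensional bottleneck whose units can be read as topic weights — here is a minimal NumPy autoencoder sketch. The layer sizes, activations, loss and training setup are assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_autoencoder(X, n_topics=2, lr=0.3, epochs=3000):
    """Train a one-hidden-layer autoencoder on bag-of-words rows of X.

    The hidden activations play the role of (unnormalised) per-document
    topic weights; the decoder rows play the role of topic-word profiles.
    """
    n_docs, vocab = X.shape
    W1 = rng.normal(0, 0.1, (vocab, n_topics))
    b1 = np.zeros(n_topics)
    W2 = rng.normal(0, 0.1, (n_topics, vocab))
    b2 = np.zeros(vocab)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)   # encoder: documents -> topic space
        X_hat = H @ W2 + b2        # decoder: topic space -> word counts
        err = X_hat - X
        # gradients of the mean squared reconstruction error
        g_out = 2.0 * err / X.size
        gW2, gb2 = H.T @ g_out, g_out.sum(axis=0)
        g_h = (g_out @ W2.T) * H * (1.0 - H)
        gW1, gb1 = X.T @ g_h, g_h.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

# toy corpus: two clearly separated 'topics' over a 4-word vocabulary
X = np.array([[3., 2., 0., 0.],
              [4., 1., 0., 0.],
              [0., 0., 2., 3.],
              [0., 0., 1., 4.]])
W1, b1, W2, b2 = train_autoencoder(X)
H = sigmoid(X @ W1 + b1)
loss = np.mean((H @ W2 + b2 - X) ** 2)
```

After training, the reconstruction error drops well below that of a mean-only predictor, and each document's hidden vector separates the two word groups — the autoencoder analogue of assigning topic proportions.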
Abstract: The Requirement Functional Logical Physical (RFLP) structure has emerged as one of the prominent approaches for modeling multidisciplinary products. Information Content (IC) provides effective interaction between the human and the multidisciplinary product model. Though IC controls the RFLP levels through the Multilevel Abstraction based Self-Adaptive Definition (MAAD) structure, it needs to be further enhanced in terms of Human-Computer Interaction (HCI), multidisciplinary product behavior representation and structured processing of interrelated engineering objects, in order to obtain coordinated decisions. Therefore, this paper introduces Object-Oriented Principle (OOP) concepts into the IC for behavior representation of the multidisciplinary product, where the Info-Chunk is treated as an object. Here, Behavior Info-Chunk (BiC) and Context Info-Chunk (CxiC) objects are proposed in the MAAD structure to model the behavior of the multidisciplinary product. Further, the Info-Chunk object concepts are extended to Intelligent Property (IP), which uses the Initiative Behavior Context and Action (IBCA) structure to handle the RFLP structure. Based on the communication between the MAAD and RFLP structures, an API (Application Programming Interface) called “InfoChunkLib” is proposed. It can generate graphs to represent the behaviors of a multidisciplinary product model. The API is invoked by the Information Content to represent the behavior information and store the results in a database.
Keywords: behavior representation; multidisciplinary product modeling; Info-Chunk based Information Content; RFLP structure; MAAD structure; IBCA structure
Abstract: The impact of the institutional environment on business activity has been the subject of several previous studies. However, the ways in which changes in institutions affect the business climate have not yet received proper attention from scholars. The purpose of this paper is to fill this gap in the literature by examining the relationship between selected formal institutions (business enabling policies and tax treatment), informal institutions (corruption and political connections) and the business climate in the context of a developing country. To test the proposed hypotheses, an ordinal regression with two link functions was applied to an original dataset of 404 firms operating in Albania. The results show that neither formal institutions nor informal ones act as a block in their impact on the business climate. Tax treatment and political connections affected the business climate negatively, whereas corruption seemed to have a positive impact. A positive but insignificant effect was found between business enabling policies and the business climate. Our research should interest policymakers who intend to design policies to improve the business environment.
Keywords: business climate; business enabling policies; tax treatment; corruption; political connection
Abstract: Technological advances have made fault location detection possible on the low-voltage distribution network using a fault location determination algorithm (FLDa). The results of this algorithm can be fed into a system that dispatches faults to the electrician teams in charge of troubleshooting. This solution, however, only addresses the processing and evaluation of remote-signaling data and does not provide the possibility of automatic interventions. The present paper investigates and describes the possibilities of automatic interventions on low-voltage distribution networks, and examines the Smart Switchboard concept developed by the Research Group of Applied Disciplines and Technologies in Energetics.
Keywords: theory; low-voltage distribution network; smart switchboard
Abstract: Many of the services incorporated in the e-Government of the Republic of Serbia need a quick-answer system to meet the continually increasing demands of citizens for easy, fast and effective access to the requested information. However, the public administration of the Republic of Serbia holds a significant amount of unstructured data stored in documents. It is therefore necessary to provide an automatic classification system based on the query-document principle. The question-answering (Q&A) system for the crime domain of the e-Government service of the Republic of Serbia provides quick replies to citizens’ questions. The Q&A system is based on data mining, text mining, natural language processing, question answering, Bag of Words and N-gram analysis. The similarity measure (distance) is a significant parameter of the Q&A system, due to its direct impact on search speed and on the distance from the desired documents. Here, the three most commonly used similarity measures are considered: cosine, Jaccard and Euclidean. The primary goal is to determine the similarity measure that provides the most precise results in the crime domain; that measure is then used as the reference one. The selection of the similarity measure is performed using the principles of redundancy and fault tolerance; specifically, the principle of triple modular redundancy (TMR) with one voter is used. The proposed system is verified by experiments with real citizen queries, and the results show that it achieves good performance.
Keywords: e-Government; text mining; redundancy; TMR; unstructured documents
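As a rough illustration of the abstract's voting scheme, the sketch below pairs the three named bag-of-words similarity measures (cosine, Jaccard, Euclidean distance) with a simple majority voter in the spirit of triple modular redundancy: each measure acts as one redundant module nominating the closest document. The tokenization, example documents and tie handling are assumptions for illustration, not the paper's implementation.

```python
import math
from collections import Counter

def cosine_sim(q, d):
    dot = sum(q[t] * d.get(t, 0) for t in q)
    nq = math.sqrt(sum(v * v for v in q.values()))
    nd = math.sqrt(sum(v * v for v in d.values()))
    return dot / (nq * nd) if nq and nd else 0.0

def jaccard_sim(q, d):
    a, b = set(q), set(d)
    return len(a & b) / len(a | b) if a | b else 0.0

def euclid_dist(q, d):
    terms = set(q) | set(d)
    return math.sqrt(sum((q.get(t, 0) - d.get(t, 0)) ** 2 for t in terms))

def tmr_best_match(query, docs):
    """Each measure is one TMR module nominating the closest document;
    the 'voter' returns the majority nomination."""
    q = Counter(query.lower().split())
    bags = [Counter(doc.lower().split()) for doc in docs]
    votes = [
        max(range(len(bags)), key=lambda i: cosine_sim(q, bags[i])),
        max(range(len(bags)), key=lambda i: jaccard_sim(q, bags[i])),
        min(range(len(bags)), key=lambda i: euclid_dist(q, bags[i])),
    ]
    return Counter(votes).most_common(1)[0][0]

docs = ["theft of a vehicle was reported",
        "annual tax declaration form",
        "vehicle theft penalties"]
print(tmr_best_match("vehicle theft", docs))  # -> 2 (the third document)
```

Here all three modules agree; with noisier data the measures can disagree, which is exactly the case the TMR voter is meant to resolve.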
Abstract: The effect of bank competition on the cost of credit is a much-debated topic in small and medium enterprise (SME) financing. In this paper, we examine the relationship between the cost of credit and interbank competition in the context of the Visegrad countries: the Czech Republic, Poland, Hungary, and the Slovak Republic. The dataset comes from two sources: firm-level data from the latest version of the Business Environment and Enterprise Performance Survey, conducted by the European Bank for Reconstruction and Development and the World Bank between 2012 and 2014, and country-level bank competition measures collected from the Global Financial Database, updated in 2017. We examine bank competition with four measures: a structural bank concentration measure and three non-structural measures (Lerner Index, H-Statistics, and Boone Index). We find evidence that bank competition has a positive effect on the cost of credit; hence, our results are in line with the prior literature on information-based theories of bank competition. We have also assessed firms in terms of their information opacity (micro, small, and medium) and find that the cost of credit is higher for information-opaque firms. Thus, firm size has important implications for bank competition and the cost of credit.
Keywords: cost of credit; bank competition; SME; Visegrad countries
Abstract: Two opposing tendencies are observable in the field of industrial production in economically developed countries. From the perspective of the management staff, for whom economic effects are paramount, the most important goals are increased work efficiency and reduced production costs. Such actions are intended to secure an appropriate competitive position for the company. From the standpoint of the employees, who are aware of their growing position on the labour market and of their value, the most important factors are suitable working conditions and adequate compensation. These opposing tendencies can be reconciled by the widespread automation and robotisation of production processes. This requires substantial investment but, considering the growing costs of labour, can provide an increase in efficiency and a reduction in production costs. Growth in robotisation and automation is also necessary due to shrinking labour resources. This paper analyses the level of robotisation in selected countries and investigates the relations between labour costs and the degree of robot utilisation. The final part of the paper characterises the condition of Polish industry from the perspective of robotisation, based on a more in-depth analysis of selected factors. On this basis, directions are outlined for the changes necessary to achieve further growth and approach the levels observed in European Union countries.
Keywords: robotisation; industrial production; efficiency; labour costs; labour market
Abstract: The Disturbance storm time (Dst) index is an important indicator of the occurrence of geomagnetic storms, which can damage communication and power systems, as well as affect astronauts’ performance. The potential consequences of such a hazardous event have challenged researchers to develop Dst predictors, with some success. This paper presents the design of a computationally fast neuro-fuzzy network to forecast Dst activity. The proposed network combines a class of emotional neural networks with neo-fuzzy neurons and is named the Neo-fuzzy integrated Competitive Brain Emotional Learning (NFCBEL) network. Equipped with five competing units, the hybrid model takes only the past two samples of the Dst time series to predict future values. The model has been tested in the MATLAB programming environment and found to offer superior performance compared to other state-of-the-art Dst predictors.
Keywords: geomagnetic storms; Dst time series; emotional neural networks; neo-fuzzy neurons; MATLAB
Abstract: In this paper, we solve some special types of linear and non-linear Boolean programming problems. We present a method for transforming the linear 0-1 inequalities into a weighted directed graph. When equalities are allowed, the conditions become non-linear, but the transformation to weighted directed graphs works in these cases as well; in the graph representation, “critical edges” are used to represent the non-linear conditions. Basic, modified and extended Boolean programming problems are investigated, with linear objective functions used in the optimization. The presented algorithm, similarly to algorithms for knapsack problems, gives a relatively good solution; moreover, when extended with the backtracking graph-search strategy, it guarantees optimal solutions for the considered problems.
Keywords: Boolean programming; 0-1 inequalities; optimization; knapsack problem; greedy algorithm; backtracking; graphs
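The paper's graph-based transformation is not reproduced in the abstract, but the contrast it draws — a fast greedy pass that is only approximately optimal versus an exhaustive backtracking search that guarantees optimality — can be shown on the classical 0-1 knapsack problem. The instance below is a textbook example, not data from the paper.

```python
def greedy_knapsack(values, weights, capacity):
    """Greedy heuristic: take items in decreasing value/weight ratio.
    Fast, but only approximately optimal for the 0-1 problem."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    used = total = 0
    for i in order:
        if used + weights[i] <= capacity:
            used += weights[i]
            total += values[i]
    return total

def backtrack_knapsack(values, weights, capacity):
    """Depth-first backtracking over take/skip decisions.
    Explores the whole search tree, so the result is optimal."""
    best = 0
    def dfs(i, room, value):
        nonlocal best
        best = max(best, value)
        if i == len(values):
            return
        if weights[i] <= room:                 # branch: take item i
            dfs(i + 1, room - weights[i], value + values[i])
        dfs(i + 1, room, value)                # branch: skip item i
    dfs(0, capacity, 0)
    return best

values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
print(greedy_knapsack(values, weights, capacity))     # -> 160
print(backtrack_knapsack(values, weights, capacity))  # -> 220
```

On this instance the greedy ratio ordering picks the first two items (value 160), while the exhaustive search finds the better combination of the last two items (value 220) — the gap the abstract's backtracking extension is designed to close.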