Abstract: As an extension of robust H∞ theory, time-domain design based on linear matrix inequalities (LMIs) is a conceptually simple and efficient framework for obtaining qLPV controllers. However, the constructed scheduling variables are not always suitable for an efficient implementation. This paper investigates the possibility of constructing the scheduling block of a qLPV controller explicitly, i.e., in the form of a linear fractional transformation (LFT). It is shown here that if both the primal and dual multiplier LMI equations lead to maximal indefinite subspaces and a coupling condition holds, the problem can be solved, and a constructive algorithm results for building the desired scheduling variables.
Keywords: Linear Fractional Transform; robust control; qLPV design
Abstract: This paper reviews selected concepts and principles of structural causal analysis and adapts them for exploratory analysis of a hybrid dynamical system whose continuous dynamics are described by ordinary nonlinear differential equations. The proposed method employs partial derivatives in order to calculate “causal partitions” of the system’s state variables, which make it possible to quantify the extent to which various causes can be considered “responsible” for the emergent behavior of the simulated system. Causal partitions can be processed by machine learning techniques (e.g. clustering and classification), and so facilitate meaningful interpretations of the observed emergent behaviors. The method is applied to the simulated emotions of fear and anger in humans, in a hybrid agent-based model of human behavior in the context of EDA project EUSAS.
Keywords: Hybrid system; Causal analysis; Emergent behavior; Agent-based simulation; Human behavior modeling
Abstract: The irregularity of a graph can be defined by different so-called graph topological indices. In this paper, we consider the irregularities of graphs with respect to the Collatz-Sinogowitz index [8], the variance of the vertex degrees [6], the irregularity of a graph [4], and the total irregularity of a graph [1]. It is known that these irregularity measures are not always compatible. Here, we investigate the problem of determining pairs or classes of graphs for which two or more of the above-mentioned irregularity measures are equal. While in [17] this problem was tackled in the case of bidegreed graphs, here we go a step further by considering tridegreed graphs and graphs with arbitrarily large degree sets. In addition, we present the smallest graphs for which all of the above irregularity indices are equal.
Keywords: irregularity measures of graphs; topological graph indices
Abstract: Nowadays, classifier combination methods receive great attention from machine learning researchers. They are a powerful tool for improving the accuracy of classifiers. This approach has become increasingly interesting, especially for real-world problems, which are often characterized by their imbalanced nature. The imbalanced distribution of data leads to poor performance in most conventional machine learning techniques. In this paper, we propose a novel weighted rough set as a meta-classifier framework over 14 classifiers to find the smallest and optimal ensemble that maximizes the overall ensemble accuracy. We propose a new entropy-based method to compute the weight of each classifier: each classifier is assigned a weight based on its contribution to classification accuracy. Thanks to the powerful reduct technique in rough set theory, high diversity of the produced reduct ensembles is guaranteed. The higher diversity between the core classifiers has a positive impact on the performance of the minority class, as well as on the overall system performance. Experimental results with the ozone dataset demonstrate the advantages of the weighted rough set meta-classifier framework over well-known meta-classifiers such as bagging, boosting and random forest, as well as over any individual classifier.
Keywords: Weighted Rough Set; real world web service; class imbalance learning; entropy
Abstract: This paper proposes a new formalization of the classical probability-possibility relation, which is further confirmed as a much more complex, but natural, provability - reachability - possibility - probability - fuzzy membership - integrability interconnection. Searching for the right context in which this relation can be consistently expressed for the particular case of experimentally obtained iris recognition results brought us to a natural (canonic) and universal fuzzification procedure available for an entire class of continuous distributions, at a confluence point of statistics, classical logic, modal logic, fuzzy logic, system theory, measure theory and topology. The applications, initially intended for iris recognition scenarios, can easily be extrapolated anywhere else there is a need to express the possibility - probability - fuzzy membership relation without weakening the σ-additivity condition within the definition of probability, a condition that is considered here as the actual principle of possibility-probability consistency.
Keywords: additivity; principle of possibility-probability consistency; provability; reachability; possibility; probability; fuzzy membership; Riemann integrability; negation; consistent experimental frameworks; Turing test; iris recognition; biometrics
Abstract: A system of homogeneous nodes, cooperating with each other and pursuing strictly defined goals for the exchange of information: this is a Wireless Sensor Network (WSN). WSNs are networks of radio equipment (transmitters, sensors and microcontrollers) that are geographically distributed over a defined area. These autonomous nodes communicate with each other and jointly submit their data through the network to the Base Station (BS). The basic objective of such activity is to monitor environmental conditions, such as light, pressure, temperature, motion and/or chemical pollution. This paper describes a relational shape of adaptive collective behavior on the basis of formal set theory, showing from a global perspective how adaptive WSN actions are managed and how local activity adaptation is guaranteed.
Keywords: adaptive systems; collective behavior; Wireless Sensor Network; WSN
Abstract: A complex mathematical model characterizing the centerline segregation level in the midregion of continuously cast slabs was developed. The basic heat transfer and solidification model connected to the semi-empirical liquid feeding model (LMI - Liquid Motion Intensity model) gives the possibility to estimate the centerline segregation parameters of slab cast under industrial circumstances. Solid shell deformation changes the volume of the space available for the liquid inside the slab and hereby also changes the conditions of liquid supply. In modelling slab casting in practical industrial cases the deformation of the solid shell cannot be ignored, especially from the point of view of centerline segregation formation. From this aspect, the most important effects resulting in deformation of the solid shell are as follows: shrinkage of the solid shell due to solidification and cooling; setting of the supporting rolls along the length of the casting machine i.e. decreasing the roll gaps as a function of cast length; bulging of the solid shell between successive supporting rolls; positioning errors and wear of rolls; eccentricity of individual rolls; etc. The critical parameter to describe the inhomogeneity in the center area of slabs is the porosity level in the mushy region. As a result of calculations performed by the model, ISD Dunaferr Co. Ltd. has changed the strategy of supporting roll settings in their continuous casting. After the modification had been implemented on casting machines, the quality problems due to centerline segregation of slabs decreased to a great extent.
Keywords: slab casting; centerline segregation; porosity; deformation of solid shell; mushy; permeability; pressure drop
Abstract: The paper presents the results of implementing a differential evolution algorithm on an FPGA, using double-precision floating-point representation, which is useful in real numeric problems. The Verilog Hardware Description Language (HDL) was used for the Altera hardware design. Schematics of the modules of the differential evolution algorithm are presented. The performance of the design is evaluated on six different benchmark function problems implemented in hardware.
Keywords: FPGA; Differential Evolution Algorithm; floating point
Abstract: In the European Union and Hungary more than one-third of the total energy consumption comes from households. Therefore, during both the planning of energy efficient investments and the design of energy production and consumption, one of the most important factors is estimating the rate of consumption. The determination of the exact consumption of households has not been achieved because the rates of consumption are calculated solely on the basis of technical parameters, and the results of these calculations are not sufficiently accurate or reliable, especially considering the uncertainties arising from consumer habits. To resolve the issue, we have created a database structure and a calculation model that helps to estimate a household's annual energy consumption based on factors that we have defined. Obviously, this does not mean that calculation based on technical parameters is now unnecessary, but applying the two methods together can significantly increase the accuracy of the estimation of household energy consumption at the individual, regional or national level.
Keywords: households; energy consumption; neural networks
Abstract: In this paper we study the robust PNS problem, which is an extension of the structural PNS problem used to model process network synthesis. This problem is NP-hard; thus, heuristic algorithms can be very useful for large instances, where there is not enough time to run an exponential-time algorithm that guarantees an optimal solution, or they can be used to speed up branch-and-bound based algorithms. We present new heuristic algorithms for the solution of the problem, which are extensions of the heuristic algorithms used to solve the classical structural PNS problem. The algorithms are analyzed empirically, comparing their efficiency on randomly generated inputs.
Keywords: PNS problem; robust problems; heuristic algorithm; P-graphs; Business Process Modeling
Abstract: A precondition for the realization of the adaptive teaching process is a knowledge of the individual characteristics of learning, an understanding of the individual methods of learning and, through these, the selection and formation of a suitable teaching environment. Therefore, the differences between students must be taken into consideration by the teacher. They are to be interpreted not only at the level of intellectual capacities but also with respect to the most diverse individual characteristics of sensation, perception, thought and learning. In the present empirical research, the 12-item variant of Kolb's LSI questionnaire is applied for this purpose. First, Kolb's learning model is briefly surveyed; then the objective of the research is stated, the results of the empirical research are shown and, finally, the most important statements of the research are presented.
Keywords: adaptive teaching process; learning strategies; Kolb’s learning styles
Abstract: This paper describes, from a system engineering perspective, the effects of applying at a large scale a new agricultural system based on passive greenhouses. Passive greenhouses use only renewable energy sources: geothermal, wind and sun, by means of cool-water heat pumps, wind turbines and photovoltaic panels. Thereby they are fully free of any energy infrastructure and can be installed in remote areas, even in deserts. They offer a fundamental sustainable agricultural resource and a global ecological reconstruction opportunity. The surface needed by a greenhouse is much smaller than that of an equivalent conventional agricultural exploitation. The huge surfaces freed up by the use of passive greenhouses may be reconverted into forests, pastures, orchards or ponds, thereby decreasing the carbon footprint. The main obstacle is the high investment cost, which can be minimized by optimizing the size of each energy source according to the user's specifications and the local climate, and by intelligent control algorithms. Equipment prices are constantly decreasing, and the newly created market will generate jobs and give a boost to related industries. A holistic view of the passive greenhouse farming system reveals a set of synergies that increase the chances of future implementation.
Keywords: sustainable energy; passive greenhouse; heat pump; wind generator; photovoltaic panel; Watergy; carbon footprint; environment reconstruction; intelligent control
Abstract: Nobody questions the reason for the existence of fossil power plants, while the acceptance of renewable tools is often problematic. Even with rationales based on the experience of decades of operation, the picture is not clear. The world is coming to the realization that well-regulated renewables contribute to sustainability. We summarize dozens of pros and cons, and we suggest ways to fit renewable resources into the power system.
Keywords: power system; renewable sources; integration into the power mix
Abstract: This paper gives a global review of artificial immune systems in computer science and their implementation. The performance of the immunological algorithm in solving optimization problems is analysed using the Optimization Algorithm Toolkit, with emphasis on determining the impact of parameter values. It is shown that these types of algorithms are particularly sensitive to the choice of parameters that affect the functioning of the algorithm.
Keywords: immunological algorithm; implementation; Optimization Algorithm Toolkit; travelling salesman problem; function optimization; parameters
Abstract: In this paper, we emphasize the unprecedented problems that the age of shallow knowledge presents. We investigate how navigating the world of information and communication technology (ICT) innovations and influencing the spreading of memes affect those who make the decisions about how we should behave in the “ICT labyrinth”. Drawing on the role of key influencers and communities of practice, we offer an alternative framework of emerging learning tools in the post-experiential informal learning ecosystem and investigate ways of implementing some of them in corporate practice.
Keywords: informal learning; learning ecosystems; learning tools