Abstract: The aim of this study is to validate the proposed tenant-based resource allocation model in practice. To this end, two SaaS systems were developed. The first system uses traditional resource scaling based on the number of users and serves as a reference point for the second. The conducted tests focused on measuring over- and underutilization in order to compare the cost-effectiveness of the solutions. The tenant-based resource allocation model proved to decrease the system's running costs and also to reduce resource underutilization. Similar research has been done before, but the model was previously tested only in a private cloud; in this work, the systems are deployed to a commercial, public cloud.
Keywords: Cloud computing; multi-tenancy; SaaS; TBRAM
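The over- and underutilization measurement described in the abstract can be sketched as follows. This is an illustrative sketch only, not the study's implementation; the function name, per-user quota and demand figures are all invented.

```python
# Illustrative sketch: comparing over- and underutilization of two
# allocation strategies over a series of time slots (all numbers invented).

def utilization_stats(allocated, demanded):
    """Return (overutilization, underutilization) summed over time slots.

    Overutilization: demand exceeding the allocated capacity (unserved load).
    Underutilization: allocated capacity left idle (wasted cost).
    """
    over = sum(max(d - a, 0) for a, d in zip(allocated, demanded))
    under = sum(max(a - d, 0) for a, d in zip(allocated, demanded))
    return over, under

# Per-user scaling allocates a fixed quota per registered user, while
# tenant-based allocation follows the tenants' actual demand profile.
demand = [3, 5, 9, 4, 2]
per_user = [8, 8, 8, 8, 8]        # static: users * quota
tenant_based = [4, 6, 9, 5, 3]    # tracks demand with small headroom

print(utilization_stats(per_user, demand))      # → (1, 18)
print(utilization_stats(tenant_based, demand))  # → (0, 4)
```

The lower underutilization total of the demand-tracking allocation is exactly the effect the compared systems were instrumented to measure.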
Abstract: Regardless of the level of technological development in a community, queuing is an unavoidable phenomenon. This situation will continue as long as customers need direct contact with service suppliers, as is the case in the Post of Serbia. The aim of the paper is to estimate the time a customer will spend queuing before reaching the counter for financial services in a postal network unit. The observed system comprises a single queue, three handling channels and service according to the FIFO principle. The paper presents a model developed in the following phases: recording data, preparing data for training, training the neuro-fuzzy system, forming a test data set in which the expected mean service speed is obtained using the moving-average method, and testing the neuro-fuzzy model. Observation of mass service systems has so far been directed towards evaluating their past behaviour, which provides a basis for judging whether the system delivers satisfactory performance. This paper moves a step towards evaluating the future behaviour of a mass service system, in order to determine whether the service quality level to be provided to a customer can be predicted. In this case, the customer is not limited in the number of services demanded.
Keywords: Waiting time; Postal network unit; Financial services; ANFIS
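The moving-average step used to form the test data set can be sketched as below; the service-speed figures are hypothetical, not the recorded Post of Serbia data.

```python
def moving_average(values, window):
    """Simple moving average: each output averages the last `window`
    observations, which here serves as the forecast of the next value."""
    if len(values) < window:
        raise ValueError("not enough observations for the chosen window")
    out = []
    for i in range(window - 1, len(values)):
        out.append(sum(values[i - window + 1 : i + 1]) / window)
    return out

# Hypothetical mean service speeds (customers/minute) observed at the counters:
speeds = [1.8, 2.0, 2.2, 1.9, 2.1, 2.3]
print(moving_average(speeds, 3))  # each entry averages the last 3 observations
```

The last value of the output is the expected mean service speed for the next interval, which then feeds the neuro-fuzzy model's test set.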
Abstract: Blind Speaker Clustering is a task within speech technology in which, given a collection of speech recordings (utterances), the goal is to identify which utterances belong to the same speaker. To aid the clustering process, we performed preprocessing steps such as feature selection and Principal Component Analysis (PCA); still, the choice of clustering method is not a trivial one. To find the best-performing algorithm, we tested standard methods such as k-means (or hard c-means, HCM) and fuzzy c-means (FCM), as well as several improved versions of FCM. In the end, we achieved the best performance using probabilistic-possibilistic mixture partitions. The obtained purity score of 83.9% is significantly higher than the baseline score of 46.9%.
Keywords: clustering; fuzzy c-means algorithm; possibilistic c-means algorithm; speech technology
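The purity score reported above is a standard external clustering metric: each cluster is credited with its majority speaker. A minimal sketch (the cluster assignments and speaker labels are invented for illustration):

```python
from collections import Counter

def purity(clusters, labels):
    """Purity: the fraction of utterances assigned to the majority
    speaker of their cluster, summed over all clusters."""
    total = len(labels)
    correct = 0
    for c in set(clusters):
        members = [labels[i] for i in range(total) if clusters[i] == c]
        correct += Counter(members).most_common(1)[0][1]
    return correct / total

# 6 utterances in 2 clusters; true speakers are A, A, B, B, B, A:
clusters = [0, 0, 1, 1, 1, 1]
labels   = ["A", "A", "B", "B", "B", "A"]
print(purity(clusters, labels))  # → 0.8333... (5 of 6 in majority groups)
```

A perfect clustering yields purity 1.0; the paper's 83.9% vs. 46.9% comparison is computed with the same notion of majority agreement.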
Abstract: Software is not only difficult to create, but also difficult to understand. Even the authors themselves, after a relatively short time, become unable to readily interpret their own code and to explain the intent they followed in it. Software is created to satisfy the needs of a customer or directly of the end users. Out of these needs comes the intent, which is relatively well understandable to all stakeholders. Yet whether expressed using specialized modeling techniques (typically the UML language) or in the code itself, use cases and other high-level specification and analytical artifacts almost completely dissolve in common software development. Along with dedicated initiatives to improve the preservation of intent comprehensibility in software, such as literate programming, intentional programming, aspect-oriented programming, or the DCI (Data, Context and Interaction) approach, this issue is a subject of contemporary research in the revived area of engaging end users in software development, which has its roots in Alan Kay's vision of a personal computer programmable by end users. From the perspective of the reality of complex software system development, the existing approaches solve the problem of losing intent comprehensibility only partially, through a simplified and limited perception of the intent, and only at the code level. This paper explores the challenges in preserving intent comprehensibility in software. A thorough treatment of this problem requires a number of techniques and approaches to be engaged, including preserving use cases in the code, dynamic code structuring, executable intent representation using domain-specific languages, advanced UML modularization, 3D rendering of UML, and the representation and animation of organizational patterns.
Keywords: intent; use cases; domain specific languages; 3D rendering of UML; organizational patterns
Abstract: The paper analyses selected optimization algorithms in computer science and their implementation in solving the problem of binary character recognition. The performance of these algorithms is analyzed using the Optimization Algorithm Toolkit, with emphasis on determining the impact of parameter values. The algorithms proved significantly sensitive to the choice of parameters, which affects how they function. In this sense, the existence of optimal parameter values was established, as well as of extremely unacceptable ones.
Keywords: binary character recognition; Optimization Algorithm Toolkit; hill climbing; random search; clonal selection algorithm; parameter tuning
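Hill climbing, one of the algorithms named in the keywords, can be sketched for binary character recognition as below. This is a generic random-mutation hill climber, not the Toolkit's implementation; the 5x5 character bitmap and the iteration budget are invented parameters of exactly the kind the paper reports the algorithms are sensitive to.

```python
import random

def hill_climb(target, iters=2000, seed=0):
    """Random-mutation hill climbing on a bit string: flip one random bit
    and keep the change if the fitness (bits matching the target character
    bitmap) does not decrease."""
    rng = random.Random(seed)
    n = len(target)
    current = [rng.randint(0, 1) for _ in range(n)]
    fitness = lambda s: sum(a == b for a, b in zip(s, target))
    for _ in range(iters):
        i = rng.randrange(n)
        candidate = current[:]
        candidate[i] ^= 1                       # flip one bit
        if fitness(candidate) >= fitness(current):
            current = candidate                 # accept non-worsening moves
    return current, fitness(current)

# Hypothetical 5x5 bitmap of a character, flattened to 25 bits:
target = [1,1,1,1,1, 1,0,0,0,0, 1,1,1,1,0, 1,0,0,0,0, 1,1,1,1,1]
solution, score = hill_climb(target)
print(score)  # with a sufficient iteration budget, all 25 bits match
```

With too few iterations the climber stalls below the optimum, which is a small-scale illustration of why parameter choice (here `iters`) matters.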
Abstract: Current energy consumption trends lead to rapidly growing use of local renewable energy sources. Such installations bring new requirements on energy consumption profiles. Because their effects multiply massively across the grid, households form one of the most interesting elements of the power grid in this respect. Smart profiling of household energy consumption may be crucial for the adaptability of the global grid. In this article, we present the design and usage of a demand-side consumption profiling system named the Priority-driven Appliance Control System (PAX). We describe the main features of the PAX system and show its application using real-world data. The main benefits are presented as direct economic assets in connection with various household energy sources (the energy grid and photovoltaic panels) and efficient usage with regard to government energy grants.
Keywords: energy consumption; renewable energy sources; demand-side management; Priority-driven Appliance Control System; PAX
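The abstract does not detail PAX internals, so the following is only a toy sketch of the general priority-driven idea: appliances are switched on in decreasing priority until the currently available power budget is spent. The appliance names, priorities and wattages are invented.

```python
def schedule(appliances, available_power):
    """Greedy priority-driven control sketch: enable appliances in
    decreasing priority while they fit into the power budget."""
    on = []
    remaining = available_power
    for name, priority, power in sorted(appliances, key=lambda a: -a[1]):
        if power <= remaining:
            on.append(name)
            remaining -= power
    return on

appliances = [
    ("boiler",     3, 2000),   # (name, priority, watts) – hypothetical
    ("heat pump",  5, 1500),
    ("dishwasher", 2, 1200),
    ("EV charger", 4, 3000),
]
# E.g. a photovoltaic surplus of 5000 W currently available:
print(schedule(appliances, 5000))  # → ['heat pump', 'EV charger']
```

Under this scheme, low-priority loads run only when surplus (e.g. photovoltaic) power is available, which is where the economic benefit described in the abstract comes from.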
Abstract: Forces during turning depend not only on material properties and cutting parameters but, to a great extent, on the edge geometry of the tool as well, which determines the chip shape (thickness and width). In fine turning it is almost exclusively the nose radius of the tool that does the cutting. The study reviews the main directions and results of research in recent years concerning cutting force and also presents the technology of fine cutting. Due to geometric considerations, chip characteristics are used that allow an exact description of cutting on the nose radius as a function of the cutting parameters. Dynamic tests are performed on two aluminium casting alloys, and a mathematical model is constructed specifically for fine turning, with which the expected forces can be estimated quite precisely during technological process planning.
Keywords: fine turning; force measuring; force model; regression; Kienzle model
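For orientation, the Kienzle model named in the keywords classically expresses the main cutting force from the undeformed chip cross-section (symbols follow the usual convention; the paper's fine-turning model adapts this to cutting on the nose radius, where the chip thickness varies along the edge):

```latex
F_c = k_{c1.1} \cdot b \cdot h^{1 - m_c}
```

where $b$ is the chip width, $h$ the chip thickness, $k_{c1.1}$ the specific cutting force for a chip cross-section of $b = h = 1\,\mathrm{mm}$, and $m_c$ the material-dependent exponent.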
Abstract: This paper proposes an extension to CERIF-compatible CRIS systems, enabling automated evaluation of research achievements by applying diverse (country-, region- or institution-specific), or even multiple, evaluation rulebooks and guidelines. It was implemented as an extension to the CERIF-compliant CRIS system of the University of Novi Sad (CRIS UNS), which already contains information for the assessment of results from scientific journals, so the focus of this research is an extension of the CERIF model aimed at the evaluation of results published through conferences. Based on a survey and an analysis of selected evaluation rulebooks and guidelines, the paper proposes an extended CERIF model that comprises conference-evaluation metadata and a machine-readable representation of a rulebook that enables automated evaluation. A rule-based expert system is proposed for the representation of evaluation rules and the evaluation of research results. To evaluate the proposed model, the Serbian rulebook is represented and implemented using the expert system Jess. The model's reliance on the CERIF standard allows its easy application in any CERIF-compatible CRIS system.
Keywords: CERIF; model extension; conferences; automated evaluation; Jess
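The idea of a machine-readable rulebook can be sketched as an ordered list of predicates over a result's metadata, each mapping to a category. The actual system encodes such rules in the Jess expert system; the Python below, the metadata fields, and the category labels are only an illustration of the principle.

```python
# Sketch of a machine-readable rulebook: ordered (predicate, category) rules
# evaluated against a research result's metadata (fields are illustrative).

RULES = [
    (lambda r: r["type"] == "conference" and r["rank"] == "international"
               and r["full_paper"], "M33"),
    (lambda r: r["type"] == "conference" and r["rank"] == "international",
     "M34"),
    (lambda r: r["type"] == "conference" and r["rank"] == "national",
     "M63"),
]

def evaluate(result):
    """Return the category assigned by the first matching rule, or None."""
    for predicate, category in RULES:
        if predicate(result):
            return category
    return None

paper = {"type": "conference", "rank": "international", "full_paper": True}
print(evaluate(paper))  # → M33
```

Swapping in a different rulebook then means replacing only the rule list, not the evaluation machinery, which is the portability the CERIF-based design aims for.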
Abstract: Privacy and overall information security are significantly affected by Internet users' awareness, knowledge and behavior. Therefore, an assessment of users' awareness is needed before developing security solutions that include the user of the information system. The present paper proposes a validated measurement instrument developed as a web-based software tool for security and privacy self-assessment (ISPSA). The solution is based on a scientifically validated questionnaire, an OWL ontology concept, the evidential reasoning approach and an intelligent-agent algorithm. The main goal of this paper is to propose a solution that will raise awareness among Internet users of privacy and information security issues.
Keywords: awareness; Evidential Reasoning; Information Security; Intelligent Agent; OWL ontology; UISAQ; users
Abstract: Modular fixtures are usually built from standard elements found in the modular element set of a certain manufacturer. In some cases, however, besides using these elements, the user modifies semi-finished elements that can also be found in the modular element set, or even produces brand-new elements. The motivation behind this can be to simplify the fixture, or to make the locating or clamping of a workpiece possible or more precise. This paper presents methods for the automatic generation and, if required, automatic modification of such new or semi-finished elements.
Keywords: Modular fixture; automatic element modification; CAFD (Computer Aided Fixture Design)
Abstract: The aim of our research was to show that the digital camera used does not influence the quality of spectral reflectance estimation, and that satisfactory results can be achieved with a low-cost camera instead of more expensive and complex multispectral devices. For that purpose, two digital cameras – the Nikon D300 and the Nikon D700 – were compared by obtaining spectral data from the RGB values of each camera. Two different methods were used for the calculation of spectral data – the SpecSens method and the ImaiBerns method – together with two different color charts, ColorChecker DC and ColorChecker SG. The performance of each camera and reflectance estimation approach was evaluated based on RMSE and ΔE*ab. The results showed that, in the case of the ImaiBerns method, the obtained reflectance spectra are independent of the camera used, although slightly better results were obtained with the Nikon D700. In the case of the SpecSens method, which is based on determining the spectral sensitivity of the camera, the choice of camera had a considerable impact on the results. These results are rather unreliable due to the large color differences ΔE*ab, as the calculation takes into account the standard illuminant (first to calculate the XYZ values and then the LAB values) and the standard colorimetric observer.
Keywords: digital camera Nikon D300; digital camera Nikon D700; spectral reflectance estimation
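The RMSE criterion used above compares an estimated reflectance spectrum with the measured one, wavelength by wavelength. A minimal sketch (the reflectance values and sampling are invented, not the paper's measurements):

```python
def rmse(estimated, measured):
    """Root-mean-square error between an estimated and a measured
    reflectance spectrum sampled at the same wavelengths."""
    if len(estimated) != len(measured):
        raise ValueError("spectra must be sampled at the same wavelengths")
    n = len(measured)
    return (sum((e - m) ** 2 for e, m in zip(estimated, measured)) / n) ** 0.5

# Hypothetical reflectances at 5 coarse wavelength samples (400–700 nm):
measured  = [0.10, 0.25, 0.40, 0.55, 0.60]
estimated = [0.12, 0.22, 0.41, 0.50, 0.63]
print(round(rmse(estimated, measured), 4))  # → 0.031
```

A lower RMSE means a spectrally closer estimate; ΔE*ab complements it by measuring the perceptual color difference under a standard illuminant and observer.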
Abstract: In this study, the Analytic Network Process (ANP) was applied as a model for prioritizing generated strategies based on the factors and sub-factors of a SWOT analysis, in the case of the Technical Faculty in Bor (TFB), University of Belgrade (UB), Serbia. The ANP methodology implies establishing a hierarchical model on four levels – goal (selection of the best strategy), SWOT factors, SWOT sub-factors and alternative strategies – with interactions between clusters at different hierarchical levels of the model as well as between the elements within each cluster. The paper demonstrates a process for quantitative SWOT analysis that can be performed even when there is dependence among strategic factors. The proposed algorithm uses the ANP, which allows measurement of the dependencies among the strategic factors, as well as the AHP, which assumes independence between the factors. Dependencies among the SWOT factors and sub-factors are observed, and their relative importance weights are determined, as well as their impact on the prioritization of the development strategy. The resulting benchmarking and prioritization of the alternative strategies in the sequence WO1 - SO1 - ST1 - WT1, for the development period of the TFB until 2025, indicates the order in which the strategies should be applied: after the first strategy reaches its limits, the next strategy in the defined sequence is implemented, in accordance with the mission of the TFB, the adopted strategic goals (SC) and the adopted vision for the next ten-year period.
Keywords: ANP; SWOT; factors; sub-factors; strategy prioritization
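At the core of both AHP and ANP, the relative importance weights are derived from pairwise-comparison matrices as the principal eigenvector. A sketch via power iteration follows; the 3x3 Saaty-scale comparison matrix is invented, not taken from the TFB case study.

```python
def priority_vector(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; the normalized result gives the relative
    importance weights of the compared factors."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]          # renormalize each iteration
    return w

# Hypothetical Saaty-scale comparisons of three SWOT factors
# (entry [i][j] says how much more important factor i is than factor j):
m = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = priority_vector(m)
print([round(w, 3) for w in weights])  # first factor dominates
```

ANP extends this by assembling such local vectors into a supermatrix that also captures dependencies between clusters, which plain AHP ignores.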
Abstract: Over the last decade, graphics processing units (GPUs) have dominated the world of interactive computer graphics and visualization technology. Over the years, sophisticated tools and programming interfaces (like the OpenGL and DirectX APIs) have greatly facilitated the work of developers, because these frameworks provide all the fundamental visualization algorithms. The research target is to develop a traditional CPU-based, pure-software rasterizer. Although it currently has many drawbacks compared with GPUs, modern multi-core systems and the diversity of platforms may require such implementations. In this paper, we deal with triangle rasterization, the cornerstone of the rendering process. A new model optimization, as well as improvements and combinations of existing techniques, is presented, dramatically improving the performance of the well-known half-space rasterization algorithm. The presented techniques are applicable in areas such as graphics editors and even computer games.
Keywords: half-space rasterization; bisector algorithm; software rendering
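The baseline half-space algorithm that the paper optimizes can be sketched as follows: a pixel center is inside the triangle iff the three edge (half-space) functions agree in sign. This is the naive per-pixel form, without the incremental and tiling optimizations such papers build on top; the triangle and grid size are arbitrary.

```python
def edge(ax, ay, bx, by, px, py):
    """Signed edge function: positive when P lies to the left of edge A→B."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(v0, v1, v2, width, height):
    """Half-space triangle rasterization over a pixel grid: a pixel is
    covered iff all three edge functions share a sign at its center."""
    pixels = []
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5          # sample at pixel centers
            w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                pixels.append((x, y))
    return pixels

covered = rasterize((0, 0), (8, 0), (0, 8), 8, 8)
print(len(covered))  # → 36: the lower-left half of an 8x8 grid, diagonal included
```

The classical optimizations (and the paper's improvements) exploit the fact that the edge functions are linear in x and y, so they can be updated incrementally per pixel and evaluated per block instead of being recomputed from scratch.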
Abstract: Smart grids are modern electric power infrastructures that incorporate elements of traditional power systems and information and communication technology (ICT), with the aim of improving the reliability, efficiency and safety of critical infrastructure systems. Due to its reliance on ICT, the smart grid exposes electrical power systems to new vulnerabilities and security issues. Security is therefore becoming an ever-increasing concern, in both the physical and the ICT domain. Access controls are one of the most important aspects of information security and a vital element of a layered security strategy. The role-based access control (RBAC) model is widely used in complex enterprise systems characterized by many participants accessing the system with different levels of access rights, depending on their specific duties and responsibilities. The existing security models, which are primarily role-based, are usually not tailored to critical infrastructure systems with specialized features, such as large numbers of equipment and devices dispersed over vast geographical regions. In order to meet the security requirements of smart grids, it is important to manage their assets at a fine level of granularity. This paper proposes an access control management system for smart grids that considers the regional division of critical assets and the concept of areas of responsibility (AOR). To this end, the standardized RBAC model was extended with the aim of improving the existing access control policy with a greater level of granularity from the aspect of managing electrical utilities. We propose the RBACAOR model, which was developed and tested on the Windows operating system platform using .NET Framework role-based security, with different data stores for the RBACAOR configuration, namely Active Directory (AD), AD Lightweight Directory Services (AD LDS) and Microsoft SQL Server.
Keywords: smart grid; role-based access control; regional division; area of responsibility
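The core RBACAOR decision can be sketched as a standard RBAC check with one extra condition: the target asset must lie inside an area of responsibility attached to the user's role assignment. The data model, names and regions below are invented for illustration, not the paper's .NET implementation.

```python
# Minimal sketch of the RBAC + AOR idea (all names and regions invented):
# a user may exercise a permission only on assets inside one of the
# areas of responsibility (AOR) attached to their role assignment.

ROLE_PERMISSIONS = {
    "operator": {"read_measurements"},
    "engineer": {"read_measurements", "switch_breaker"},
}

# (user, role, set of AORs in which that role assignment is valid)
ASSIGNMENTS = [
    ("alice", "engineer", {"region-north"}),
    ("bob",   "operator", {"region-north", "region-south"}),
]

def check_access(user, permission, asset_aor):
    """Grant iff some role assignment of the user carries the permission
    AND the asset's region falls within that assignment's AOR set."""
    for u, role, aors in ASSIGNMENTS:
        if u == user and permission in ROLE_PERMISSIONS[role] \
                and asset_aor in aors:
            return True
    return False

print(check_access("alice", "switch_breaker", "region-north"))  # → True
print(check_access("alice", "switch_breaker", "region-south"))  # → False
```

Plain RBAC would grant the second request as well; scoping each assignment to a region is what yields the finer granularity the paper argues smart grids need.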