Polish Information Processing Society

Federated Conference on Computer Science and Information Systems

September 8–11, 2013. Kraków, Poland

Proceedings of the 2013 Federated Conference on Computer Science and Information Systems

ISBN 978-1-4673-4471-5 (Web),
978-83-60810-53-8 (USB)

IEEE Catalog Number: CFP1385N-ART (Web),
CFP1385N-USB (USB)

Complete FedCSIS Proceedings (PDF, 104.391 MB)

FedCSIS Position papers

Preface

Conference Keynote Papers

  • A General Divide and Conquer Approach for Process Mining
    Wil M. P. van der Aalst, pages 1 – 10. Abstract. Operational processes leave trails in the information systems supporting them. Such event data are the starting point for process mining – an emerging scientific discipline relating modeled and observed behavior. The relevance of process mining is increasing as more and more event data become available. The increasing volume of such data ("Big Data") provides both opportunities and challenges for process mining. In this paper we focus on two particular types of process mining: process discovery (learning a process model from example behavior recorded in an event log) and conformance checking (diagnosing and quantifying discrepancies between observed behavior and modeled behavior). These tasks become challenging when there are hundreds or even thousands of different activities and millions of cases. Typically, process mining algorithms are linear in the number of cases and exponential in the number of different activities. This paper proposes a very general divide-and-conquer approach that decomposes the event log based on a partitioning of activities. Unlike existing approaches, this paper does not assume a particular process representation (e.g., Petri nets or BPMN) and allows for various decomposition strategies (e.g., SESE- or passage-based decomposition). Moreover, the generic divide-and-conquer approach reveals the core requirements for decomposing process discovery and conformance checking problems.
  • Nonnegative Matrix Factorization and Its Application to Pattern Analysis and Text Mining
    Nonnegative Matrix Factorization; Correntropy; Principal Component Analysis; Face recognition. Jacek M. Zurada, Tolga Ensari, Ehsan Hosseini Asl, Jan Chorowski, pages 11 – 16. Abstract. Nonnegative Matrix Factorization (NMF) is one of the most promising techniques for reducing the dimensionality of data. This presentation compares the method with other popular matrix decomposition approaches for various pattern analysis tasks. Among others, NMF has also been widely applied for clustering and latent feature extraction. Several types of objective functions have been used for NMF in the literature. Instead of minimizing the common Euclidean Distance (EucD) error, we review an alternative method that maximizes the correntropy similarity measure to produce the factorization. Correntropy is an entropy-based criterion defined as a nonlinear similarity measure. Following the discussion of the maximization of the correntropy function, we use it to cluster a document data set and compare the clustering performance with EucD-based NMF. Our approach was applied and illustrated for the clustering of documents in the 20-Newsgroups data set. The results show that our approach produces on average better clustering than other methods that use EucD as an objective function.
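The EucD-based NMF baseline that the second keynote compares against can be sketched with the classical Lee-Seung multiplicative updates. This is a generic illustration on a toy matrix, not the authors' correntropy-based method; the matrix sizes and rank are arbitrary:

```python
import numpy as np

def nmf_eucd(V, r, n_iter=200, seed=0):
    """Factor a nonnegative matrix V (m x n) into nonnegative W (m x r) and
    H (r x n) by minimizing the Euclidean error ||V - WH||_F^2 using
    Lee-Seung multiplicative updates. Each update keeps the factors
    nonnegative and does not increase the objective."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-10)  # update H, stays nonnegative
        W *= (V @ H.T) / (W @ H @ H.T + 1e-10)  # update W, stays nonnegative
    return W, H

# Toy "term-document" matrix; the columns of H can then serve as
# low-dimensional document representations for clustering.
V = np.abs(np.random.default_rng(1).standard_normal((6, 8)))
W, H = nmf_eucd(V, r=2)
residual = np.linalg.norm(V - W @ H)
```

Correntropy-based NMF replaces the Euclidean objective above with a kernel-based similarity to be maximized, but the alternating nonnegative-update structure is the same.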

8th International Symposium Advances in Artificial Intelligence and Applications

  • Underdetermined Blind Separation of an Unknown Number of Sources Based on Fourier Transform and Matrix Factorization
    Hybrid Intelligent Systems, Machine Learning, Real-world Applications of Intelligent Systems. Ossama S. Alshabrawy, Mohamed E. Ghoneim, A. A. Salama, Aboul Ella Hassanien, pages 19 – 25. Abstract. This paper presents an approach for underdetermined blind source separation that can be applied even if the number of sources is unknown. Moreover, the proposed approach is applicable in the case of separating I+3 sources from I mixtures without additive noise. This situation is more challenging and better suited to practical real-world problems. In addition, unlike some conventional approaches, no sparsity conditions are imposed. First, the number of source signals is estimated, followed by the estimation of the mixing matrix based on the use of the short-time Fourier transform and rough-fuzzy clustering. Then, the source signals are normalized and recovered using a modified version of Lin's projected gradient algorithm with a modified Armijo rule. The simulation results show that the proposed approach can separate I+3 source signals from I mixed signals, and that it has superior evaluation performance compared to conventional approaches.
  • The Multiple Pheromone Ant Clustering Algorithm and its application to real world domains
    Applications in Bioinformatics, Data Mining, Evolutionary Computation, Knowledge Management, Machine Learning, Nature Inspired Methods, Real-world Applications of Intelligent Systems. Jan Chircop, Christopher D. Buckingham, pages 27 – 34. Abstract. The Multiple Pheromone Ant Clustering Algorithm (MPACA) models the collective behaviour of ants to find clusters in data and to assign objects to the most appropriate class. It is an ant colony optimisation approach that uses pheromones to mark paths linking objects that are similar and potentially members of the same cluster or class. Its novelty is in the way it uses separate pheromones for each descriptive attribute of the object rather than a single pheromone representing the whole object. Ants that encounter other ants frequently enough can combine the attribute values they are detecting, which enables the MPACA to learn influential variable interactions. This paper applies the model to real-world data from two domains. One is logistics, focusing on resource allocation rather than the more traditional vehicle-routing problem. The other is mental-health risk assessment. The task for the MPACA in each domain was to predict class membership where the classes for the logistics domain were the levels of demand on haulage company resources and the mental-health classes were levels of suicide risk. Results on these noisy real-world data were promising, demonstrating the ability of the MPACA to find patterns in the data with accuracy comparable to more traditional linear regression models.
  • Fuzziness in Partial Approximation Framework
    Approximate Reasoning. Zoltán Ernő Csajbók, Tamás Mihálydeák, pages 35 – 41. Abstract. In partial approximation spaces with Pawlakian approximation pairs, three partial membership functions are generated. These fuzzy functions rely on the lower and upper approximations of a set. They provide a special type of fuzziness on the universe: all of them are partial functions, derived from the observed data relative to the available knowledge about the objects of the universe. With the help of these functions, three new approximation pairs are generated, and so new approximation spaces effectively appear. Using non-Pawlakian approximation pairs gives a special insight into the nature of general set approximations, and so new models of necessity and possibility can be given.
  • Comparison of Selected Textural Features as Global Content-Based Descriptors of VHR Satellite Image – the EROS-A Study
    Wojciech Drzewiecki, Anna Wawrzaszek, Michał Krupiński, Sebastian Aleksandrowicz, Katarzyna Bernat, pages 43 – 49. Abstract. Texture is considered one of the most crucial image features used commonly in computer vision. It is an important source of information about image content, especially for single-band images. In this paper we present the results of research carried out to assess the usefulness of selected textural features of different groups in panchromatic very high resolution (VHR) satellite image classification. The study is based on images obtained from the EROS A satellite. The aim of our tests was to estimate and compare the accuracy of classification of the main land cover types, with a particular focus on determining the usefulness of textural features based on the multifractal formalism. The presented research confirmed that it is possible to use textural features as efficient global descriptors of VHR satellite image content. It was also proved that multifractal parameters should be considered valuable textural features in the context of land cover classification.
  • On the computer certification of fuzzy numbers
    Fuzzy Modeling and Control, Knowledge Management. Adam Grabowski, pages 51 – 54. Abstract. The formalization of fuzzy sets in terms of corresponding membership functions is already available in a machine-verified mathematical knowledge base. We show how it can be extended to provide the development of fuzzy numbers, fully benefiting from the existing framework. The flexibility offered by automated proof assistants allowed us to overcome some initial difficulties. Although fuzziness stems from the same background as rough set theory, i.e. incomplete or imprecise information, the two formal approaches are substantially different.
  • Cardiac disorders detection approach based on local transfer function classifier
    Data Mining, Decision Support Systems, Neural Networks, Real-world Applications of Intelligent Systems. Ahmed Hamdy, Nashwa El-Bendary, Ashraf Khodeir, Mohamed Mostafa M. Fouad, Aboul Ella Hassanien, Hesham Hefny, pages 55 – 61. Abstract. The heart is second only to the brain among the vital organs of the human body. Its performance as a pump is orchestrated by a group of valves and highly sophisticated neural control. Since the motion of the heart is accompanied by sound production, the sound waves produced by the heart are a reliable diagnostic tool for checking heart activity. Over the years, several data sets have been compiled to monitor heart performance and prompt medical intervention whenever necessary. The heart sounds data set utilized in this paper provides researchers with an abundance of sound signals that have been classified using different classification algorithms; decision tree, rotation forest, and random forest are a few to mention. This paper proposes an approach based on a local transfer function classifier, a model of neural networks, for the detection of heart valve diseases. To achieve this objective and to increase the efficiency of the prediction model, a Boolean reasoning discretization algorithm is introduced to discretize the heart signal data set; the rough set reduction technique is then applied to find all reducts of the data, i.e. the minimal subsets of attributes associated with a class label for classification. The rough set dependency rules are generated directly from all generated reducts. A rough confusion matrix is used to evaluate the performance of the predicted reducts and classes. Finally, a local transfer function classifier is employed to evaluate the ability of the selected descriptors to discriminate between healthy and unhealthy cases. The experimental results show that the overall accuracy offered by the employed local transfer function classifier is high compared with other techniques, including decision table, rotation forest, random forest, and NBTree.
  • A Human Inspired Collision Avoidance Strategy for Moving Agents
    Pejman Kamkarian, Henry Hexmoor, pages 63 – 67. Abstract. This paper presents an approach for controlling collision avoidance among a group of moving agents that are not able to communicate with each other and hence cannot share information. The basics and key features of our collision control algorithm are discussed, including practical examinations. Our approach is based on multi-agent systems and helps moving agents pursue their goals using collision-free routes. To validate our solution, we apply it to a configuration of agents located in our experimental space. We also explain the solution algorithm that we have developed, along with the examination that we subjected it to, and sketch some of the most important challenges that remain to be addressed in our future research.
  • Application of Ant-Colony Optimisation to Compute Diversified Entity Summarisation on Semantic Knowledge Graphs
    Witold Kosiński, Marcin Sydow, Tomasz Kuśmierczyk, Paweł Rembelski, pages 69 – 76. Abstract. We present an ant colony optimisation approach, enriched with a novel self-adaptation mechanism, applied to solve the DIVERSUM problem, which consists of generating a small diversified entity summarisation in a knowledge graph. The recently proposed DIVERSUM problem is viewed in this paper in a novel way, as an NP-hard combinatorial optimisation problem. The presented preliminary experimental results indicate the superiority of this approach over previously proposed solutions to the DIVERSUM problem.
  • Semantic Tagging of Heterogeneous Data: Labeling Fire & Rescue Incidents with Threats
    Data Mining, Decision Support Systems, Knowledge Fusion and Integration, Knowledge Management, Machine Learning, Natural Language Processing, Real-world Applications of Intelligent Systems. Adam Krasuski, Andrzej Janusz, pages 77 – 82. Abstract. In this article we present a comparison of classification algorithms focused on labeling Fire & Rescue incidents with threats appearing at the emergency scene. Each incident is reported in a database and characterized by a set of quantitative attributes and by natural language descriptions of the cause, the scene, and the course of actions undergone by firefighters. The training set for our experiments was manually labeled by Fire Service commanders after deeper analysis of the emergency descriptions. We also introduce a modified version of the Explicit Semantic Analysis method and demonstrate how it can be employed for automatic labeling of the incident reports. The task we are trying to accomplish belongs to the multi-label classification problems. Its practical purpose is to support the commanders at an emergency scene and to improve the analytics on the data collected by the Polish State Fire Service.
  • Combining One-Class Support Vector Machines for Microarray Classification
    Applications in Bioinformatics, Machine Learning. Bartosz Krawczyk, pages 83 – 89. Abstract. The advance of high-throughput techniques, such as gene microarrays and protein chips, has a major impact on contemporary biology and medicine. Due to the high dimensionality and complexity of the data, it is impossible to analyze it manually. Therefore machine learning techniques play an important role in dealing with such data. In this paper we propose to use a one-class approach to classifying microarrays. Unlike canonical classifiers, these models rely only on objects coming from a single class distribution. They distinguish observations coming from the given class from any other possible states of the object that were unseen during the classification step. While having less information to dichotomize between classes, one-class models can easily learn the specific properties of a given dataset and are robust to difficulties embedded in the nature of the data. We show that using one-class support vector machines can give results as good as canonical multi-class classifiers, while allowing us to deal with imbalanced distributions and unexpected noise in the data. To cope with the high dimensionality of the feature space, we propose to form an ensemble based on the Random Subspace method and prune it with the usage of a diversity measure. Experimental investigations, carried out on public datasets, prove the usefulness of the proposed approach.
  • Flow-level Spam Modelling using separate data sources
    Data Mining, Decision Support Systems, Knowledge Fusion and Integration, Real-world Applications of Intelligent Systems. Marcin Luckner, Robert Filasiak, pages 91 – 98. Abstract. Spam detection based on flow-level statistics is a new approach in anti-spam techniques. The approach reduces the amount of collected data but can still obtain relatively good results in a spam detection task. The main problems in the approach are the selection of flow-level features that describe spam and the detection of discrimination rules. In this work, a flow-level model of spam is presented. The model describes spam subclasses and brings information about the major features of a spam detection task. The model is the basis for decision trees that detect spam. The analysis of detectors learned from data collected from different mail servers results in a universal spam description consisting of the most significant features. Flows described by the selected features and collected on a Broadband Remote Access Server were analysed by an ensemble of the created classifiers. The ensemble detected major sources of spam among sender IP addresses.
  • RBF ensemble based on reduction of DAG structure
    Machine Learning, Neural Networks. Marcin Luckner, Karol Szyszko, pages 99 – 105. Abstract. Binary classifiers are grouped into an ensemble to solve multi-class problems. One proposed ensemble structure is a directed acyclic graph (DAG). In this structure, a classifier is created for each pair of classes. The number of classifiers can be reduced if groups of classes are separated instead of individual classes. The proposed method is based on the similarity of classes, defined as a distance between classes. For nearby classes the structure of the DAG stays unchanged; for distant classes, more than one class is separated with a single classifier. In this paper, the proposed method is tested in variants based on various metrics. For the tests, several datasets from the UCI repository were used and the results were compared with published works. The tests proved that grouping radial basis functions into such an ensemble reduces the classification cost, while the recognition accuracy is not reduced significantly.
  • Recommender system for ground-level Ozone predictions in Kuwait
    Approximate Reasoning, Data Mining, Decision Support Systems, Granular Computing, Real-world Applications of Intelligent Systems. Mahmood A. Mahmood, Eiman Tamah Al-Shammari, Nashwa El-Bendary, Aboul Ella Hassanien, Hesham A. Hefny, pages 107 – 110. Abstract. This article presents a recommender system based on rough mereology for predicting ozone concentration in Kuwait by testing the data gathered from the Al-Jahra station. The proposed recommender system consists of three phases, namely the pre-processing, classification, and recommendation phases. To evaluate the performance of the presented recommender system, fifteen parameters were used. Those parameters were developed and validated between Jan. 2006 and Sept. 2010. The initial three years of data are used to develop the prediction models and the remaining data are used for testing and verifying these models. The obtained results demonstrate the effectiveness and reliability of the proposed recommender system. Based on the resulting data, the average ozone level prediction at a given time is characterized by a mean absolute error of 0.08947. In addition, both the experimental results and the actual dataset values lie in the healthy region of the O3 value, which is less than 0.165 ppm according to the reference air quality index.
  • Prediction of School Dropout Risk Group Using Neural Network Fuzzy ARTMAP
    Neural Networks. Valquiria R. C. Martinho, Clodoaldo Nunes, Carlos Roberto Minussi, pages 111 – 114. Abstract. Dropping out of school is one of the most complex and crucial problems in education, causing social, economic, political, academic and financial losses. In order to contribute to solving this problem, this paper presents the potential of an intelligent, robust and innovative system developed for the prediction of risk groups of student dropout, using a Fuzzy ARTMAP neural network, one of the techniques of artificial intelligence, with the possibility of continued learning. This study was conducted at the Federal Institute of Education, Science and Technology of Mato Grosso, with students of the Colleges of Technology in Automation and Industrial Control, Control Works, Internet Systems, Computer Networks and Executive Secretary. The results showed that the proposed system is satisfactory, with global accuracy above 76% and a significant degree of reliability, making possible the early identification, even in the first term of the course, of the group of students likely to drop out.
  • Semantic Explorative Evaluation of Document Clustering Algorithms
    Applications in Bioinformatics, Data Mining, Hybrid Intelligent Systems, Knowledge Fusion and Integration, Knowledge Management, Natural Language Processing, Real-world Applications of Intelligent Systems. Hung Son Nguyen, Sinh Hoa Nguyen, Wojciech Świeboda, pages 115 – 122. Abstract. In this paper, we investigate the problem of quality analysis of clustering results using semantic annotations given by experts. We propose a novel approach to the construction of an evaluation measure, based on the Minimal Description Length (MDL) principle. The proposed measure, called SEE (Semantic Evaluation by Exploration), is an improvement on existing evaluation methods such as the Rand Index or Normalized Mutual Information, fixing some of the weaknesses of the original methods. We illustrate the proposed evaluation method on freely accessible biomedical research articles from PubMed Central (PMC). Many articles from PubMed Central are annotated by experts using the Medical Subject Headings (MeSH) thesaurus. This paper is part of the research on designing and developing a dialog-based semantic search engine for the SONCA system, which is part of the SYNAT project. We compare different semantic techniques for search result clustering using the proposed measure.
  • Vickrey-Clarke-Groves for privacy-preserving collaborative classification
    Data Mining. Anastasia Panoui, Sangarapillai Lambotharan, Raphael C.-W. Phan, pages 123 – 128. Abstract. The combination of game theory and data mining opens new directions and opportunities for developing novel methods for the extraction of knowledge among multiple collaborative agents. This paper extends this combination and, motivated by the work of Nix and Kantarcioglu, employs the Vickrey-Clarke-Groves (VCG) mechanism to achieve privacy-preserving collaborative classification. Specifically, in addition to encouraging multiple agents to share data truthfully, we facilitate the preservation of privacy. In our model, privacy is accomplished by allowing the parties to supply a controlled amount of perturbed data, instead of randomised data, so long as this perturbation does not harm the overall result of classification. The critical point that determines when this perturbation is harmful is given by the VCG mechanism. Our experiment on real data confirms the potential of the theoretical model, in the sense that the VCG mechanism can balance the tradeoff between privacy preservation and good data mining results.
  • dotRL: A platform for rapid Reinforcement Learning methods development and validation
    Bartosz Papis, Paweł Wawrzyński, pages 129 – 136. Abstract. This paper introduces dotRL, a platform that enables fast implementation and testing of Reinforcement Learning algorithms against diverse environments. dotRL has been written for the .NET framework and its main characteristics include: (i) adding a new learning algorithm or environment to the platform only requires implementing a simple interface, from then on it is ready to be coupled with other environments and algorithms; (ii) a set of tools is included that aids running and reporting experiments; (iii) a set of benchmark environments is included, with environments as demanding as Octopus-Arm and Half-Cheetah; (iv) the platform is available for instantaneous download, compilation, and execution, without requiring libraries from different sources.
  • An Emotional Learning-inspired Ensemble Classifier (ELiEC)
    Mahboobeh Parsapoor, Urban Bilstrup, pages 137 – 141. Abstract. In this paper, we suggest an architecture inspired by brain emotional processing for classification applications. The architecture is a type of ensemble classifier and is referred to as the ‘emotional learning-inspired ensemble classifier’ (ELiEC). We suggest the weighted k-nearest neighbor classifier as the basic classifier of the ELiEC, and we evaluate the ELiEC’s performance by classifying some benchmark datasets.
  • Autonomous Input Management for Human Interaction-Oriented Systems Design
    Applications in Bioinformatics, Architectures of Intelligent Systems, Decision Support Systems, Knowledge Fusion and Integration, Machine Learning, Real-world Applications of Intelligent Systems, Robotics. Michal Podpora, Aleksandra Kawala-Janik, Mary Kiernan, pages 143 – 144. Abstract. In this paper an evaluation of a policy-based algorithm for switching video inputs is presented. The term 'data quality' is not trivial for Human-Machine Interaction systems, yet a simple and efficient algorithm is needed for choosing the most valuable video source. This becomes particularly important for systems that support functional decomposition of an image processing algorithm and are designed for a non-optimal working environment. In this paper an autonomous input management system is proposed, which consists of a data quality evaluation algorithm and a simple decision algorithm.
  • Knowledge-based Named Entity Recognition in Polish
    Knowledge Fusion and Integration, Natural Language Processing. Aleksander Pohl, pages 145 – 151. Abstract. This document describes an algorithm aimed at recognizing Named Entities in Polish text, which is powered by two knowledge sources: the Polish Wikipedia and the Cyc ontology. Besides providing the rough types for the recognized entities, the algorithm links them to their Wikipedia pages and assigns precise semantic types taken from Cyc. The algorithm is verified against manually identified Named Entities in the one-million sub-corpus of the National Corpus of Polish.
  • Tabu Search approach for Multi-Skill Resource-Constrained Project Scheduling Problem
    Real-world Applications of Intelligent Systems. Marek E. Skowroński, Paweł B. Myszkowski, Marcin Adamski, Paweł Kwiatek, pages 153 – 158. Abstract. In this article two Tabu Search approaches to the Multi-Skill Resource-Constrained Project Scheduling Problem (MS-RCPSP) are proposed, based on different neighbourhood generation methods. The first approach assumes swapping the resources assigned to a pair of tasks, while the second one proposes assigning any resource that could perform a given task. Both approaches need to respect the skill constraints. The objective of this paper is to research the usability and robustness of the proposed approaches in solving the MS-RCPSP. Experiments have been performed using artificially created dataset instances, based on real-world instances obtained from Volvo IT and verified by an experienced project manager. The presented results show that Tabu Search (TS) based methods are efficient approaches that could be developed further in future work.
  • Novel heuristic solutions for Multi-Skill Resource-Constrained Project Scheduling Problem
    Real-world Applications of Intelligent Systems. Marek E. Skowroński, Paweł B. Myszkowski, Łukasz Podlodowski, pages 159 – 166. Abstract. In this article some novel scheduling heuristics for the Multi-Skill Resource-Constrained Project Scheduling Problem are proposed and compared to state-of-the-art priority rules based on task duration, resource salaries and precedence relations. The new heuristics are an aggregation of known methods, enhanced by the skills domain. The goal of the paper is to investigate whether the evaluated methods can be used as robustness enhancement tools in metaheuristics, mostly evolutionary algorithms. Experiments have been performed using artificially created dataset instances, based on real-world instances. The obtained results show that such methods are an interesting component that can be included in more complex methods to increase their robustness.
  • Object Tracking and Video Event Recognition with Fuzzy Semantic Petri Nets
    Piotr Szwed, Mateusz Komorkiewicz, pages 167 – 174. Abstract. Automated recognition of video events is an important research area in computer vision, with many potential applications, e.g. intelligent video surveillance systems or video indexing engines. In this paper we describe the components of an event recognition system building up a full processing chain from low-level feature extraction to high-level semantic information on detected events. It comprises three components: object detection and tracking algorithms, a fuzzy ontology, and Fuzzy Semantic Petri Nets (FSPN), a formalism that can be used to specify events and to reason about their occurrence. FSPN are Petri nets coupled with an underlying fuzzy ontology. The ontology stores assertions (facts) concerning object classification and detected relations, an abstraction of the information originating from the object tracking algorithms. Fuzzy predicates querying the ontology are used in Petri net transition guards. Places in an FSPN represent scenario steps. Tokens carry information on the objects participating in an event and have weights expressing the likelihood of an event step's occurrence. The introduced fuzziness allows the system to cope with imprecise information delivered by image analysis algorithms. We describe the architecture of the video event recognition system and show examples of successfully recognized events.
  • Collective Belief Revision in Linear Algebra
    Architectures of Intelligent Systems, Knowledge Fusion and Integration, Knowledge Management. Satoshi Tojo, pages 175 – 178. Abstract. Although the logic of belief update has thus far mainly concerned the belief state of one agent, real-world settings require us to implement simultaneous belief changes. Here, however, we need to manage many indices: agent names, time stamps, and the difference of information. In this paper, we introduce the notation of vectors and matrices for the simultaneous informing action. With this, we show that a matrix can represent a public announcement and/or consecutive message passing, properly reflecting the time of the change of belief states. A collective belief state multiplied by a communication matrix, including matrices of accessibility in Kripke semantics, becomes a hypercuboid.
  • Medical Decision Support System Architecture for Diagnosis of Down's Syndrome
    Architectures of Intelligent Systems, Decision Support Systems, Image Processing and Interpreting. Hubert Wojtowicz, Jolanta Wojtowicz, Wojciech Kozioł, Wiesław Wajs, pages 179 – 182. Abstract. The paper presents the development of a new system used to solve the problem of recognizing dermatoglyphic patterns and understanding the classification process of the symptoms of Down's syndrome. The method used in the system for diagnosing Down's syndrome in infants is based on combining textual knowledge found in the scientific literature describing Down's syndrome with the knowledge obtained from the analysis of dermatoglyphic indices characteristic of Down's syndrome, with the use of digital pattern recognition techniques. The scientific goal is to design a classifier system that realizes automatic medical diagnosis through the application of an expert system designed on the basis of knowledge included in scientific text descriptions of Down's syndrome. Another aim is the application of pattern recognition algorithms to the analysis of indices present in the images of dermatoglyphic patterns. This approach, similar to the approach used by anthropologists, is realized by the system through the juxtaposition of the knowledge described in the form of expert system rules and the information provided by the appropriate digital equipment; on the basis of this juxtaposition an arbitrary classification of the investigated patterns is performed.
  • An Investment Strategy for the Stock Exchange Using Neural Networks
    277 Antoni Wysocki, Maciej Ławryńczuk, pages 183 – 190. Show abstract Abstract. This paper describes a neural investment strategy system. Based on well-known financial indicators considered by investors, the system helps make the current investment decision. The basic problem is to select the indicators that yield the best predictor. Two methods are used and compared: the combination method and the correlation method.

3rd International Workshop on Artificial Intelligence in Medical Applications

  • Automatic computer aided segmentation for liver and hepatic lesions using hybrid segmentations techniques
    143 Ahmed M. Anter, Ahmad Taher Azar, Aboul Ella Hassanien, Mohamed Abu ElSoud, Nashwa El Bendary, pages 193 – 198. Show abstract Abstract. Liver cancer is one of the major death factors in the world. Transplantation and tumor resection are the two main therapies in common clinical practice. Both tasks need image-assisted planning and quantitative evaluations. An efficient and effective automatic liver segmentation is required for the corresponding quantitative evaluations. Computed Tomography (CT) is highly accurate for liver cancer diagnosis. Manual identification of hepatic lesions by trained physicians is a time-consuming task and can be subjective, depending on the skill, expertise and experience of the physician. Computer-aided segmentation of CT images would thus be a great step forward for medical purposes. A sophisticated hybrid system is proposed in this paper, capable of segmenting the liver from abdominal CT and detecting hepatic lesions automatically. The proposed system was evaluated on two different datasets, and experimental results show that it robustly, quickly and effectively detects the presence of lesions in the liver, counts the distinctly identifiable lesions, computes the area of the liver affected by tumor lesions, and provides good-quality results, segmenting the liver and extracting lesions from abdominal CT in less than 0.15 s/slice.
  • An Improved Ant Colony System for Retinal Blood Vessel Segmentation
    96 Biomedical Applications Ahmed Hamza Asad, Ahmad Taher Azar, Mohamed Mostafa M. Fouad, Aboul Ella Hassanien, pages 199 – 205. Show abstract Abstract. Diabetic retinopathy is a complication of diabetes that damages the retinal blood vessels; the resulting loss of blood supply can cause blindness in a short time, so early detection of diabetes prevents blindness in more than 50% of cases. Early detection can be achieved by automatic segmentation of retinal blood vessels in retinal images, which is a two-class classification problem. This paper proposes two improvements to a previous approach that uses an ant colony system for automatic segmentation of retinal blood vessels. The first improvement adds a new discriminant feature to the feature pool used in classification. The second improvement applies a new heuristic function based on probability theory in the ant colony system, instead of the old one based on Euclidean distance. The results of the improvements are promising when applying the improved approach to the STARE database of retinal images.
  • Comparison of methods for hand gesture recognition based on Dynamic Time Warping algorithm
    448 Artificial Intelligence Techniques in Health Sciences, Biomedical Applications Katarzyna Barczewska, Aleksandra Drozd, pages 207 – 210. Show abstract Abstract. Gesture recognition may find applications in rehabilitation systems, sign language translation or smart environments. The aim of current research is to improve the efficiency of recognition systems while also allowing the user to perform gestures in a natural way. The article presents different methods (DTW, DDTW, PDTW) based on the Dynamic Time Warping algorithm, which is commonly used for hand gesture recognition with a small wearable three-axial inertial sensor. Additionally, different approaches to signal definition and preprocessing are discussed and tested. To verify which of the presented methods is more accurate for gesture recognition, a database of 2160 simple gestures was collected and a recognition procedure was implemented. The main goal was to compare the efficiency of each method assuming that each person performs the movement naturally. The obtained results suggest that the most efficient method for the presented problem was DDTW. The worst recognition performance was achieved with the PDTW method.
  • Designing multiple user perspectives and functionality for clinical decision support systems
    281 Applications of AI in Health Care and Surgery Systems, Artificial Intelligence Techniques in Health Sciences, Diagnoses and Therapy Support Systems, Knowledge Management of Medical Data, Medical Data- and Knowledge Bases, Medical Expert Systems, Ontology and Medical Information Christopher D. Buckingham, Abu Ahmed, Ann Adams, pages 211 – 218. Show abstract Abstract. Clinical Decision Support Systems (CDSSs) need to disseminate expertise in formats that suit different end users and with functionality tuned to the context of assessment. This paper reports research into a method for designing and implementing knowledge structures that facilitate the required flexibility. A psychological model of expertise is represented using a series of formally specified and linked XML trees that capture increasing elements of the model, starting with hierarchical structuring, incorporating reasoning with uncertainty, and ending with delivering the final CDSS. The method was applied to the Galatean Risk and Safety Tool, GRiST, which is a web-based clinical decision support system (www.egrist.org) for assessing mental-health risks. Results of its clinical implementation demonstrate that the method can produce a system that is able to deliver expertise targeted and formatted for specific patient groups, different clinical disciplines, and alternative assessment settings. The approach may be useful for developing other real-world systems using human expertise and is currently being applied to a logistics domain.
  • Towards Determining Syntactic Complexity of Visual Stimuli Used in Art Therapy
    404 Diagnoses and Therapy Support Systems Bolesław Jaskuła, Jarosław Szkoła, Krzysztof Pancerz, pages 219 – 223. Show abstract Abstract. In the paper, we deal with the problem of automatically determining the syntactic complexity of visual stimuli. This problem is important when paintings are used in eye-tracking-based diagnosis and therapy of some kinds of neuropsychological and emotional disorders. Our approach to solving the considered problem is based on a clustering procedure using Self-Organizing Feature Maps. The clustering results are compared with the heat maps obtained in the eye-tracking process.
  • Simulating of Schistosomatidae (Trematoda: Digenea) Behavior by Physarum Spatial Logic
    209 Andrew Schumann, Ludmila Akimova, pages 225 – 230. Show abstract Abstract. In this paper we consider the possibilities of simulating the behavior of a group of trematode larvae (miracidia and cercariae) by an abstract slime-mould-based computer that is programmed by attractants and repellents. To describe this simulation, we appeal to a language called Physarum spatial logic, which is a kind of π-calculus. This language contains labels for attractants and repellents. Taking into account the fact that the behavior of miracidia and cercariae can be programmed only by attractants (repellents for miracidia and cercariae are still not known), we can claim that the behavior of miracidia and cercariae is a restricted (poorer) form of Physarum spatial logic.
  • A Fuzzy Logic Approach to The Evaluation of Health Risks Associated with Obesity
    66 Artificial Intelligence Techniques in Health Sciences, Data Mining and Knowledge Discovery in Medicine, Diagnoses and Therapy Support Systems Tadeusz Nawarycz, Krzysztof Pytel, Maciej Gazicki-Lipman, Wojciech Drygas, Lidia Ostrowska-Nawarycz, pages 231 – 234. Show abstract Abstract. Excessive body weight, especially in the form of so-called abdominal obesity (AO), is an important factor in cardio-metabolic risk (CMR). The paper presents a fuzzy model of AO and CMR assessment based on such key anthropometric indicators as the body mass index (BMI, as a measure of global adiposity) as well as waist circumference (WC) and waist-to-height ratio (WHtR) as AO indicators. For the construction of a membership function (MF), Zadeh's Extension Principle (EP) and the mapping of BMI fuzzy sets into adequate AO fuzzy sets using different transformation functions have been applied. Taking advantage of the results of a screening study, the AO membership functions for the adult population of Lodz (WHO-CINDI project) are presented. MF design based on the EP theory is a useful methodology for assessing AO and, consequently, for a better assessment of CMR.
  • Failure Analysis and Estimation of the Healthcare System
    169 Clinical Information Systems, Health Care Information Systems Elena Zaitseva, Jozef Kostolny, Miroslav Kvassay, Vitaly Levashenko, Krzysztof Pancerz, pages 235 – 240. Show abstract Abstract. The principal goal of applying information technologies in medicine is the improvement and facilitation of medical care. Modern healthcare systems have to perfect patient care. Therefore, healthcare has to be characterized, first of all, by high reliability, and the reliability analysis of such a system is an important problem. A new method for estimating system reliability is proposed in this paper. This method permits investigation of the influence of any system component failure on system functioning.
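As an aside on the DTW-based gesture comparison above (Barczewska and Drozd), the dynamic-programming recurrence shared by DTW and its derivative variant DDTW can be sketched as follows. This is a generic textbook illustration under the usual absolute-cost assumption, not the authors' implementation:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    D[i][j] = cost of matching a[i-1] with b[j-1], plus the cheapest of the
    three allowed predecessor cells (insertion, deletion, diagonal match).
    """
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]


def ddtw_distance(a, b):
    """Derivative DTW (sketch): run DTW on first differences, so the
    alignment follows signal shape rather than absolute level."""
    da = [a[i + 1] - a[i] for i in range(len(a) - 1)]
    db = [b[i + 1] - b[i] for i in range(len(b) - 1)]
    return dtw_distance(da, db)
```

Because warping lets one sample match several samples of the other sequence, `dtw_distance([0, 0, 1, 1], [0, 1])` is 0 even though the sequences have different lengths.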
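For the fuzzy obesity model above (Nawarycz et al.), a minimal sketch of how a fuzzy membership function turns a crisp anthropometric value such as BMI into a degree of membership; the cut-off values are illustrative assumptions, not taken from the paper:

```python
def shoulder_mf(x, a, b):
    """Right-shoulder membership function: 0 below a, linear ramp on [a, b],
    1 above b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)


def obese(bmi):
    """Hypothetical fuzzy set "obese" on the BMI axis: the crisp cut-off
    BMI >= 30 is softened into a gradual transition starting at 27
    (illustrative values only)."""
    return shoulder_mf(bmi, 27.0, 30.0)
```

The paper's extension-principle construction then maps such BMI fuzzy sets into abdominal-obesity fuzzy sets via transformation functions; the ramp above only illustrates the membership-degree idea itself.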

3rd International Workshop on Advances in Semantic Information Retrieval

  • Information Retrieval Using an Ontological Web-Trading Model
    184 José-Andrés Asensio, Nicolás Padilla, Luis Iribarne, pages 243 – 249. Show abstract Abstract. One of the biggest problems facing Web-based Information Systems (WIS) is the complexity of the information searching/retrieval process, especially the information overload and the need to distinguish between relevant and irrelevant content. In an attempt to solve this problem, a wide range of techniques based on different areas has been developed and applied to WIS. One of these techniques is information retrieval. In this paper we describe an information retrieval mechanism (for structured data only) with a client/server implementation based on the Query-Searching/Recovering-Response (QS/RR) model by means of a trading model, guided and managed by ontologies. This mechanism is part of the SOLERES system, an Environmental Management Information System (EMIS).
  • Rhetorical Browzing in Journalistic Texts: Preliminary Investigations
    447 Domain-specific semantic applications., Natural language semantic processing., Searching and ranking. Patrice Enjalbert, Alexandre Labadié, Stéphane Ferrari, pages 251 – 256. Show abstract Abstract. The work presented in this paper concerns discourse structure analysis and its applications to intra- and inter-document search. In a typical application, which could be called "rhetorical browsing", the system provides assistance to a journal reader in order to focus on texts and passages presenting certain kinds of information and comments, according to his/her current interest: it may be raw information, possibly with a chronological dimension, or on the contrary analyses, recommendations, debates, etc. The discourse model can be related to Swales's "discourse moves" and the derived "argumentative zoning" procedures for scientific documents. However, due to the nature of the considered texts, zones are defined in more "generalist" terms, following the classic Narration-Description-Argumentation-Prescription typology and especially C. Smith's notion of "discourse modes". The paper presents some preliminary steps performed in order to test the feasibility of the project. First of all, in order to ground our research on firm observations, we decided to build a corpus of journalistic texts, annotated according to the discourse model in view. Quantified results concerning the organization of discourse modes within texts could be obtained thanks to these annotations. In a second step, an experimental procedure for automatic tagging of text passages according to discourse modes has been designed, implemented and tested on the corpus.
  • Similarities in Spaces of Features and Concepts: Towards Semantic Evaluations
    324 Domain-specific semantic applications., Evaluation methodologies for semantic search and retrieval., Ontology for semantic information retrieval. Wladyslaw Homenda, Agnieszka Jastrzebska, pages 257 – 260. Show abstract Abstract. The article discusses abstract spaces of concepts and features. Concepts correspond to real-world objects and are described by their features. The study is devoted to relations in the space of concepts and in the space of features. Of greatest interest is the similarity of structures in the concept and feature spaces. There is a direct link between features and concepts; therefore, similarity may be analyzed through the structures of both. The authors propose a generalized similarity relation applicable to the developed framework. In addition, the similarity of nested sets in the space of features and concepts is discussed. The authors introduce an algorithm that calculates the similarity of two nested structures. The developed semantics leads to a set-theoretic model, which allows abstract information to be described flexibly.
  • Antisocial Behavior Corpus for Harmful Language Detection
    154 Domain-specific semantic applications., Ontology for semantic information retrieval. Myriam Munezero, Maxim Mozgovoy, Tuomo Kakkonen, Vitaly Klyuev, Erkki Sutinen, pages 261 – 265. Show abstract Abstract. We report on experiments that demonstrate the relevance of our AntiSocial Behavior (ASB) corpus as a machine learning resource to detect antisocial behavior from text. We first describe the corpus and then, by using the corpus for training machine learning algorithms, we build a set of binary classifiers. Experimental evaluations revealed that classifiers built based on the ASB corpus produce reliable classification results with up to 98% accuracy. We believe that the dataset will be valuable to researchers and practitioners working in preventing, controlling and diagnosing antisocial behavior and related problems.
  • An Approach for Developing a Mobile Accessed Music Search Integration Platform
    56 Marina Purgina, Andrey Kuznetsov, Evgeny Pyshkin, pages 267 – 273. Show abstract Abstract. We introduce the architecture and the data model of software for integrated access to music-searching web services. We illustrate our approach by developing a mobile application which allows users of Android-based touch-screen devices to access several music search engines, including Musipedia, Music Ngram Viewer, and FolkTuneFinder. The application supports various styles of music input query. We pay special attention to query style transformation aimed at fitting the requirements of the supported search services. Through examples of using the developed tools, we show how they help discover citations and similarity in music compositions.
  • Evaluation of beef production and consumption ontology and presentation of its actual and potential applications
    393 Rafał Trójczak, Robert Trypuz, Przemysław Grądzki, Jerzy Wierzbicki, Alicja Woźniak, pages 275 – 278. Show abstract Abstract. The paper concerns the beef production and consumption ontology (OntoBeef) and its applications. We present the three-stage OntoBeef evaluation process, with a special focus on the interaction between ontologists and domain experts. We also describe the Linked Open Data (LOD) philosophy and show how links between OntoBeef and four other ontologies were established. We further present the components of the OntoBeef-driven information system, the technology used to create it, and its functionalities. In particular, we describe the thesaurus component of the information system, which incorporates the LOD connections.
  • Query Construction for Related Document Search Based on User Annotations
    186 Query interfaces., Searching and ranking. Jakub Ševcech, Mária Bieliková, pages 279 – 286. Show abstract Abstract. We often use various services for creating bookmarks, tags, highlights and other types of annotations while surfing the Web or just reading electronic documents. These annotations represent additional information about a particular information source. We propose a method for constructing queries to search for documents related to the currently studied document. We use the document content, concentrating on user-created annotations as indicators of the user's interest in particular parts of the document. Our method for query construction is based on spreading activation in a graph created from the document content. We evaluated the proposed method within a service called Annota, which allows users to insert various types of annotations into web pages and PDF documents displayed in the web browser. We analyzed the properties of various types of annotations inserted by users of Annota into documents. Based on these properties, we also performed a simulation to determine optimal parameters and to compare the proposed method against a commonly used tf-idf-based method.
  • Ontology of architectural decisions supporting ATAM based assessment of SOA architectures
    70 Piotr Szwed, Paweł Skrzynski, Grzegorz Rogus, Jan Werewka, pages 287 – 290. Show abstract Abstract. Nowadays, Service Oriented Architecture (SOA) might be treated as a state-of-the-art approach to the design and implementation of enterprise software. Contemporary software developed according to the SOA paradigm is a complex structure, often integrating various platforms, technologies, products and design patterns. Hence, the problem arises of early evaluation of a software architecture to detect design flaws that might compromise expected system qualities. Such an assessment requires an extensive knowledge base gathering information on various types of architectural decisions, their relations and their influence on quality attributes. In this paper we describe SOAROAD (SOA Related Ontology of Architectural Decisions), which was developed to support the evaluation of architectures of information systems using SOA technologies. The main goal of the ontology is to provide constructs for documenting SOA. However, it is designed to support future reasoning about architecture quality and the building of a common knowledge base. When building the ontology we focused on the requirements of the Architecture Tradeoff Analysis Method (ATAM), which was chosen as the reference methodology for architecture evaluation.
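On the spreading-activation query construction above (Ševcech and Bieliková): the general mechanism is to seed a term graph with activation from annotated passages, propagate it to neighbours with a decay factor, and take the top-activated terms as the query. The sketch below is a generic illustration of that mechanism, not the Annota implementation; the graph, decay factor and iteration count are assumptions:

```python
def spread_activation(graph, seeds, decay=0.5, iters=3):
    """Rank graph nodes by spreading activation.

    graph: dict mapping each term to a list of neighbouring terms.
    seeds: dict mapping annotated terms to their initial activation.
    Each iteration, every node passes a decayed share of its activation
    to its neighbours; the final ranking orders terms for the query.
    """
    act = dict(seeds)
    for _ in range(iters):
        nxt = dict(act)
        for node, a in act.items():
            share = decay * a / max(len(graph.get(node, [])), 1)
            for nb in graph.get(node, []):
                nxt[nb] = nxt.get(nb, 0.0) + share
        act = nxt
    return sorted(act, key=act.get, reverse=True)
```

Terms directly annotated by the user dominate the ranking, while terms merely connected to them receive smaller, decayed activations.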

6th Workshop on Computational Optimization

  • A quasi self-stabilizing algorithm for detecting fundamental cycles in a graph with DFS spanning tree given
    69 Halina Bielak, Michał Pańczyk, pages 293 – 297. Show abstract Abstract. This paper presents a linear-time quasi self-stabilizing algorithm for detecting the set of fundamental cycles in an undirected connected graph modelling an asynchronous distributed system. The previously known algorithm has O(n²) time complexity, whereas we prove that ours stabilizes after O(n) moves. A distributed adversarial scheduler is considered. Both algorithms assume that a depth-first-search spanning tree of the graph is given. The output is given in a distributed manner as the state of variables in the nodes.
  • Anticipation in the Dial-a-Ride Problem: an introduction to the robustness
    270 combinatorial optimization, computational optimization methods Samuel Deleplanque, Jean-Pierre Derutin, Alain Quilliot, pages 299 – 305. Show abstract Abstract. The Dial-a-Ride Problem (DARP) models an operations research problem related to on-demand transport. This paper introduces one of the fundamental features of this type of transport: robustness. It solves the Dial-a-Ride Problem by integrating an insertability measurement. The technique used is a greedy insertion algorithm based on time-constraint propagation (time windows, maximum ride time and maximum route time). In the present work, we integrate a new way to measure the impact of each insertion on the other, not-yet-inserted demands. We propose its calculation, study its behavior, discuss the transition to a dynamic context and present a way to make the system more robust.
  • Multiple shooting SQP-line search algorithm for optimal control of pressure-constrained batch reactor
    119 computational optimization methods, large scale optimization Paweł Drąg, Krystyn Styczeń, pages 307 – 313. Show abstract Abstract. In this article a new approach for control of a pressure-constrained batch reactor and a new multi-step optimization algorithm are presented. The considered batch reactor is described by both differential and algebraic equations. State constraints always introduce difficulties into the mathematical model of the reactor, so a new algorithm based on a multiple shooting SQP-line search method was proposed and tested. The multiple shooting method was used not only to ensure stability of the solution, but also to divide the system into smaller subsystems, so that the large-scale problem can be considered. The considerations were made for the simultaneous approach, which allows this algorithm to be applied to a wide class of differential-algebraic systems. The simulations were executed in the Matlab environment using the resources of the Wroclaw Centre for Networking and Supercomputing.
  • Bicriteria Fuzzy Optimization Location-Allocation Approach
    223 Santiago García-Carbajal, Belarmino Adenso-Díaz, Sebastián Lozano, pages 315 – 319. Show abstract Abstract. Distribution network design deals with defining which elements will be part of the supply chain and how they will be interrelated. Many authors have studied this problem from a cost minimization point of view. Nowadays the sustainability factor is increasing its importance in the logistics operations and must be considered in the design process. We deal here with the problem of determining the location of the links in a supply chain and the assignment of the final customers considering at the same time cost and environmental objectives. We use a fuzzy bicriteria model for solving the problem, embedded in a genetic algorithm that looks for the best trade-off solution. A set of experiments have been carried out to check the performance of the procedure, using some instances for which we know a priori a good reference solution.
  • Branch and Price for Preemptive Resource Constrained Project Scheduling Problem Based on Interval Orders in Precedence Graphs
    220 Aziz Moukrim, Alain Quilliot, Hélène Toussaint, pages 321 – 328. Show abstract Abstract. This paper describes an efficient exact algorithm to solve the Preemptive RCPSP. We propose a very original and efficient branch-and-bound procedure based upon minimal interval order enumeration, which involves column generation as well as constraint propagation and which is implemented with the help of the generic SCIP software. We perform tests on the famous PSPLIB instances, which yield very satisfactory results. To the best of our knowledge it is the first algorithm able to solve to optimality the whole set of j30 instances of PSPLIB in a preemptive way. Moreover, this algorithm allows us to update several best known lower bounds for the j60, j90 and j120 instances of PSPLIB.
  • A Beam Search Based Algorithm for the Capacitated Vehicle Routing Problem with Time Windows
    181 combinatorial optimization, computational optimization methods Hakim Akeb, Adel Bouchakhchoukha, Mhand Hifi, pages 329 – 336. Show abstract Abstract. In this paper the capacitated vehicle routing problem with time windows is tackled with a beam-search-based approximate algorithm. An instance of this problem is defined by a set of customers and a fleet of identical vehicles. A time window is associated with each customer and a maximum capacity characterizes each vehicle. The aim is then to serve all the customers, minimizing the number of vehicles used as well as the total distance while respecting the time windows.
  • Real life cable constraints in designing Passive Optical Network architecture
    23 large scale optimization, unconstrained and constrained optimization Stanislas Francfort, Cédric Hervet, Matthieu Chardy, Frédéric Moulis, pages 337 – 339. Show abstract Abstract. Fiber To The Home (FTTH) deployment is crucial for telecommunication operators for both economical and quality of service reasons. This paper deals with a real-life Passive Optical Network (PON) design problem focusing on optical cabling constraints. This decision problem is formulated as an integer linear program (ILP) and several solving approaches are designed. Tests performed on real instances assess the efficiency of the proposed solution algorithms.
  • Energy-based Pruning Devices for the BP Algorithm applied to Distance Geometry
    196 Douglas Gonçalves, Antonio Mucherino, Carlile Lavor, pages 341 – 346. Show abstract Abstract. The Molecular Distance Geometry Problem (MDGP) is that of finding an embedding of a molecular graph in three-dimensional space, where graph vertices represent atoms and edges represent known distances between some pairs of atoms. The MDGP is a constraint satisfaction problem and is generally cast as a continuous global optimization problem. Moreover, under some assumptions, this optimization problem can be discretized so that it becomes combinatorial, and it can then be solved by a Branch & Prune (BP) algorithm. The solution set found by BP, however, can be very large for some instances, while only the most energetically stable conformations are of interest. In this work, we propose two new energy-based pruning devices and integrate them into the BP algorithm. Computational experiments show that the newly added pruning devices are able to improve the performance of the BP algorithm, as well as the quality (in terms of energy) of the conformations in the solution set.
  • A Maximum Matching Based Heuristic Algorithm for Partial Latin Square Extension Problem
    102 combinatorial optimization, computational optimization methods, large scale optimization Kazuya Haraguchi, Masaki Ishigaki, Akira Maruoka, pages 347 – 354. Show abstract Abstract. A partial Latin square (PLS) is an assignment of n symbols to an n×n grid such that, in each row and in each column, each symbol appears at most once. The partial Latin square extension (PLSE) problem asks for a PLS that is a maximum extension of a given PLS. The PLSE problem is NP-hard, and in this paper we propose a heuristic algorithm for it. To design the heuristic, we extend the previous 1/2-approximation algorithm that utilizes the notion of maximum matching. We show the empirical effectiveness of the proposed algorithm through computational experiments. Specifically, the proposed algorithm delivers a better solution than the original one and than local search. Besides, when computation time is limited for application reasons, it delivers a better solution than IBM ILOG CPLEX, a state-of-the-art optimization solver, especially for large-scale "hard" instances.
  • Fair optimization with advanced aggregation operators in a multicriteria facility layout problem
    321 combinatorial optimization, multiobjective optimization, random search algorithms Jarosław Hurkała, Adam Hurkała, pages 355 – 362. Show abstract Abstract. In this paper we address a mining operation problem that is a special case of Quadratic Assignment Problem and which belongs to the class of facility layout problems. The considered problem is static and discrete, but the set of possible locations is larger than the set of facilities. We distinguish multiple types of equal-area facilities (mines, processing and auxiliary facilities). Mines can be placed only on selected locations (deposits of various resources), and the production volume of each type of facility depends on the adjacency of other facilities. We examine two situations: when the number of each type of facility is given, and when only the total number of facilities is specified. The goal is to maximize the production. This problem is multi-objective and we use advanced aggregation operators (OWA/WOWA) to achieve fair solutions. A comparison of results obtained with list-based threshold accepting meta-heuristic and simulated annealing algorithm is presented.
  • Time dependent global optimization via Bayesian inference and Sequential Monte Carlo sampling
    338 computational optimization methods, global optimization, random search algorithms Piotr Kopka, Anna Wawrzynczak, Mieczyslaw Borysiewicz, pages 363 – 370. Show abstract Abstract. In many areas of application it is important to estimate unknown model parameters in order to model precisely the underlying dynamics of a physical system. In recent years, Sequential Monte Carlo (SMC) methods have become a very popular tool for Bayesian parameter estimation. In this case, the problem of finding the best parameter configuration comes down to the optimization issue of determining the best fit. In this paper, the application of this approach to the classical global optimization problem is described. We consider the situation when the optimized functions are dynamic, i.e., the global extremum changes in time. For this purpose, we adapt the two-dimensional Ackley and four-dimensional Wood functions. Our aim is to find the most probable localization of the extremum at each time with the use of the Bayesian approach joined with Markov Chain Monte Carlo (MCMC) and SMC algorithms. We propose a mechanism for dynamic tuning of the proposal distribution in SMC. The approach is based on the Metropolis-Hastings algorithm, combined with a resampling mechanism to achieve better results. We have examined different versions of the proposed SMC and MCMC algorithms in terms of their effectiveness in estimating the probabilistic distributions. The effect is demonstrated using two benchmark optimization problems. Computed results show that the proposed mechanisms can significantly improve optimization results compared to standard MCMC.
  • Influence of the Population Size on the Genetic Algorithm Performance in Case of Cultivation Process Modelling
    167 combinatorial optimization, computational optimization methods, global optimization, nature inspired optimization methods, random search algorithms Olympia Roeva, Stefka Fidanova, Marcin Paprzycki, pages 371 – 376. Show abstract Abstract. In this paper, an investigation of the influence of the population size on genetic algorithm (GA) performance for a model parameter identification problem is considered. The mathematical model of an E. coli fed-batch cultivation process is studied. The three model parameters -- maximum specific growth rate (μmax), saturation constant (kS) and yield coefficient (YS/X) -- are estimated using different population sizes. Population sizes between 5 and 200 chromosomes are tested with a constant number of generations. In order to obtain meaningful information about the influence of the population size, a considerable number of independent runs of the GA are performed. The observed results show that the optimal population size is 100 chromosomes for 200 generations. In this case accurate model parameter values are obtained in reasonable computational time. Further increase of the population size, above 100 chromosomes, does not improve the solution accuracy; moreover, the computational time increases significantly.
  • Quadratic TSP: A lower bounding procedure and a column generation approach
    202 Borzou Rostami, Federico Malucelli, Pietro Belotti, Stefano Gualandi, pages 377 – 384. Show abstract Abstract. In this paper we present a Column Generation approach to the Quadratic Traveling Salesman Problem. Given a graph and a function that maps every pair of edges to a cost, the problem consists in finding a cycle that visits every vertex exactly once and such that the sum of the costs over all pairs of consecutive edges of the cycle is minimum. We propose a Linear Programming formulation that has a variable for each cycle in the graph. Since the number of cycles is exponential in the graph size, we solve our formulation via column generation. Computational results on a set of instances used in the literature show that our approach is promising, as it obtains a lower bound close to the optimal solution for all instances.
  • A hybrid method for modeling and solving constrained search problems
    89 combinatorial optimization, hybrid optimization algorithms, unconstrained and constrained optimization Paweł Sitek, Jarosław Wikarek, pages 385 – 392. Show abstract Abstract. The paper presents the concept and describes the implementation of a hybrid approach to modeling and solving constrained problems. In this approach, the integration of two environments, mathematical programming (MP) and logic programming (LP), is proposed. In particular, a connection is made between integer programming (IP) and constraint logic programming (CLP). The idea of the proposed approach is to use the strengths of each environment for modeling and optimization of constrained search problems, motivated by the different ways in which each of them treats and solves optimization constraints. This is particularly important for decision models in which there are many constraints summing discrete decision variables, and the objective function is of a similar nature. This structure of decision-making models is very common for optimization problems in transportation, distribution, supply chains and manufacturing. For verification and clarification of the proposed approach, two illustrative examples are presented and solved. The first example is the authors' original model for the optimization of integrated supply chain costs. The second one is the known 2E-CVRP benchmark (Two-Echelon Capacitated Vehicle Routing Problem).
  • Biased Random Key Genetic Algorithm with Hybrid Decoding for Multi-objective Optimization
    256 Panwadee Tangpattanakul, Nicolas Jozefowiez, Pierre Lopez, pages 393 – 400. Show abstract Abstract. A biased random key genetic algorithm (BRKGA) is an efficient method for solving combinatorial optimization problems. It can be applied to solve both single-objective and multi-objective optimization problems. The BRKGA operates on a chromosome encoded as a key vector of real values in the interval [0,1]. Generally, the chromosome has to be decoded by using a single decoding method in order to obtain a feasible solution. This paper presents a hybrid decoding, which combines the operation of two single decoding methods. This hybrid decoding yields two feasible solutions from the decoding of one chromosome. Experiments are conducted on realistic instances, which concern acquisition scheduling for agile Earth observing satellites.
  • Efficient and Scalable Computation of the Energy and Makespan Pareto Front for Heterogeneous Computing Systems
    47 Kyle M. Tarplee, Ryan Friese, Anthony A. Maciejewski, Howard Jay Siegel, pages 401 – 408. Show abstract Abstract. The rising costs and demand of electricity for high-performance computing systems pose difficult challenges to system administrators that are trying to simultaneously reduce operating costs and offer state-of-the-art performance. However, system performance and energy consumption are often conflicting objectives. Algorithms are necessary to help system administrators gain insight into this energy/performance trade-off. Through the use of intelligent resource allocation techniques, system administrators can examine this tradeoff space to quantify how much a given performance level will cost in electricity, or see what kind of performance can be expected when given an energy budget. A novel algorithm is presented that efficiently computes tight lower bounds and high quality solutions for energy and makespan. These solutions are used to bound the Pareto front to easily trade-off energy and performance. These new algorithms are shown to be highly scalable in terms of solution quality and computation time compared to existing algorithms.
  • Efficient Models for Special Types of Non-Linear Maximum Flow Problems
    340 combinatorial optimization Marina Tvorogova, pages 409 – 416. Show abstract Abstract. In this paper, we consider the maximum flow problem on networks with non-linear transfer functions. We consider special types of transfer functions, which are particularly relevant for applications. For concave transfer functions, we reduce the NL-flow problem to the generalized flow problem and solve it using a polynomial-time approximation scheme. For convex, s-shaped and monotonically growing piecewise linear (PWL) transfer functions (the latter can always be divided into s-shaped fragments), we present an equivalent network representation that allows us to build a MILP model with better performance than standard MILP formulations of PWL functions. The latter require additional variables and constraints to force the correct (flow-dependent) linear segment of the PWL functions to be taken. In our model, the correct segment of an s-shaped fragment is chosen automatically due to the network's structure. For the case when transfer functions are non-linear, we provide an error estimation for the approximated solution.
  • A Hybrid Algorithm based on Differential Evolution, Particle Swarm Optimization and Harmony Search Algorithms
    15 hybrid optimization algorithms, nature inspired optimization methods Ezgi Deniz Ulker, Ali Haydar, pages 417 – 420. Show abstract Abstract. In recent years, evolutionary optimization algorithms and their hybrid forms have become popular for solving multimodal complex problems which are very difficult to solve by traditional methods. In the literature, many hybrid algorithms have been proposed to achieve better performance than the well-known evolutionary optimization methods used alone, by combining their features to balance the exploration and exploitation goals of the optimization process. This paper proposes a novel hybrid algorithm composed of the Differential Evolution, Particle Swarm Optimization and Harmony Search algorithms, called HDPH. The proposed algorithm is compared with these three algorithms in terms of solution quality and robustness. Numerical results based on several well-studied benchmark functions show that HDPH achieves good solution quality with high robustness. Also, in HDPH all parameters are randomized, which removes the need to try all possible combinations of parameter values in the selected ranges or to find the best value set by parameter tuning.

Computer Aspects of Numerical Algorithms

  • Mixed precision iterative refinement techniques for the WZ factorization
    316 Analysis of rounding errors of numerical algorithms, Contemporary computer architectures, Numerical algorithms testing and benchmarking, Paradigms of programming numerical algorithms Beata Bylina, Jarosław Bylina, pages 425 – 431. Show abstract Abstract. The aim of the paper is to analyze the potential of the mixed precision iterative refinement technique for the WZ factorization. We design and implement a mixed precision iterative refinement algorithm for the WZ factorization with the use of the single (a.k.a. float), double and long double precision. For random dense square matrices with the dominant diagonal we report the performance and the speedup of the solvers using different machines and we investigate the accuracy of such solutions. Additionally, the results (performance, speedup and accuracy) for our mixed precision implementation based on the WZ factorization were compared to the similar ones based on the LU factorization.
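The refinement loop the paper studies can be sketched independently of the WZ factorization itself. The toy below (a hypothetical 2x2 example, not the authors' solver) performs the solve with a matrix rounded to single precision while computing residuals in double, which is the essence of mixed precision iterative refinement:

```python
import struct

def to_single(x):
    # Round a Python float (double) to the nearest IEEE-754 single value.
    return struct.unpack('f', struct.pack('f', x))[0]

def solve2(A, b):
    # Direct 2x2 solve by Cramer's rule.
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def refine(A, b, iters=5):
    # Solve A x = b: solves use the single-precision copy of A,
    # residuals are accumulated in full double precision.
    As = [[to_single(v) for v in row] for row in A]
    x = solve2(As, [to_single(v) for v in b])
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
        d = solve2(As, r)                  # correction via low-precision matrix
        x = [x[i] + d[i] for i in range(2)]
    return x
```

Each sweep multiplies the error by roughly the single-precision unit roundoff, so a handful of iterations recovers double-precision accuracy as long as the matrix is not too ill-conditioned.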
  • Surface Reconstruction from Scattered Point via RBF Interpolation on GPU
    279 Salvatore Cuomo, Ardelio Galletti, Giulio Giunta, Alfredo Starace, pages 433 – 440. Show abstract Abstract. In this paper we describe a parallel implicit method based on radial basis functions (RBF) for surface reconstruction. The applicability of RBF methods is hindered by their computational demand, which requires the solution of linear systems of size equal to the number of data points. Our reconstruction implementation relies on parallel scientific libraries and targets massively multi-core architectures, namely Graphics Processing Units (GPUs). The performance of the proposed method, in terms of reconstruction accuracy and computing time, shows that the RBF interpolant can be very effective for such a problem.
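The dense linear system that dominates the cost can be illustrated with a minimal 1D sketch (the paper works on 3D surface data with GPU libraries; the Gaussian kernel, the shape parameter `eps` and the dense solver below are illustrative assumptions):

```python
import math

def gauss_solve(A, b):
    # Gaussian elimination with partial pivoting on an augmented copy.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def rbf_interpolant(xs, fs, eps=1.0):
    # Solve the dense N x N system A w = f, A_ij = phi(|x_i - x_j|),
    # then return the interpolant s(x) = sum_i w_i phi(|x - x_i|).
    phi = lambda r: math.exp(-(eps * r) ** 2)   # Gaussian RBF
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    w = gauss_solve(A, fs)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))
```

The N x N system is exactly what makes large point clouds expensive and motivates the GPU implementation described in the abstract.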
  • Towards an Efficient Multi-Stage Riemann Solver for Nuclear Physics Simulations
    298 Applications of numerical algorithms in science and technology, Numerical algorithms on GPUs, Numerical algorithms testing and benchmarking, Parallel numerical algorithms Sebastian Cygert, Joanna Porter-Sobieraj, Daniel Kikoła, Jan Sikorski, Marcin Słodkowski, pages 441 – 446. Show abstract Abstract. Relativistic numerical hydrodynamics is an important tool in high energy nuclear science. However, such simulations are extremely demanding in terms of computing power. This paper focuses on improving the speed of solving the Riemann problem with the MUSTA-FORCE algorithm by employing the CUDA parallel programming model. We also propose a new approach to 3D finite difference algorithms, which employ a GPU that uses surface memory. Numerical experiments show an unprecedented increase in the computing power compared to a CPU.
  • Application of AVX (Advanced Vector Extensions) for Improved Performance of the PARFES - Finite Element Parallel Direct Solver
    98 Applications of numerical algorithms in science and technology Sergiy Fialko, pages 447 – 454. Show abstract Abstract. The paper considers the application of the AVX (Advanced Vector Extensions) technique to improve the performance of the PARFES parallel finite element solver, intended for finite element analysis of large-scale problems of structural and solid mechanics on multi-core computers. The starting point for this paper was the fact that the dgemm matrix multiplication procedure implemented in the Intel MKL (Math Kernel Library) and ACML (AMD Core Math Library) libraries, which lays the foundation for achieving high performance of direct methods for sparse matrices, does not provide satisfactory performance on the AMD Opteron 6276 processor (Bulldozer architecture) when used with the algorithm required by PARFES. The procedure presented herein significantly improves the performance of PARFES on computers with processors of the above architecture, while maintaining the competitiveness of PARFES with the Intel MKL dgemm procedure on computers with Intel processors.
  • Library for Matrix Multiplication-based Data Manipulation on a “Mesh-of-Tori” Architecture
    222 Contemporary computer architectures, Libraries for numerical computations, Novel data formats for dense and sparse matrices, Paradigms of programming numerical algorithms, Parallel numerical algorithms Maria Ganzha, Marcin Paprzycki, Stanislav Sedukhin, pages 455 – 462. Show abstract Abstract. Recent developments in computational sciences, involving both hardware and software, allow reflection on the way that computers of the future will be assembled and software for them written. In this contribution we combine recent results concerning possible designs of future processors, ways they will be combined to build scalable (super)computers, and generalized matrix multiplication. As a result we propose a novel library of routines, based on generalized matrix multiplication that allows for data (matrix / image) manipulations.
  • Automatic Connections in IEC 61131-3 Function Block Diagrams
    18 Applications of numerical algorithms in science and technology Marcin Jamro, Dariusz Rzonca, pages 463 – 469. Show abstract Abstract. The IEC 61131-3 standard defines five languages for programming industrial controllers, supporting both textual and graphical development approaches. In the case of the Function Block Diagram graphical language, diagrams consist of a set of elements connected with lines of various lengths and shapes. Development of an editor supporting diagram design involves the implementation of an algorithm able to automatically find a suitable connection between blocks. In the paper an appropriate application of the A* algorithm is proposed. The authors have ensured that the proposed solution is efficient and works smoothly. Relations between implementation details and performance are discussed. The achieved results led to the mechanism being introduced into the graphics editors available in the CPDev engineering environment for programming controllers.
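A minimal grid-based A* router of the kind the abstract describes might look as follows (the grid encoding, unit step cost and Manhattan heuristic are invented for the example; the CPDev implementation itself is not shown in the abstract):

```python
import heapq

def astar(grid, start, goal):
    # grid: list of strings, '#' marks a blocked cell.
    # Returns the length of a shortest 4-connected path, or None.
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start)]
    best = {start: 0}
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur == goal:
            return g
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):    # axis-aligned moves
            nr, nc = cur[0] + dr, cur[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                ng = g + 1
                if ng < best.get((nr, nc), float('inf')):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

In a diagram editor, blocked cells would correspond to function blocks and already-routed lines; a real router would also penalize bends to keep connections readable.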
  • N-body simulation based on the Particle Mesh method using Multigrid schemes
    201 Applications of numerical algorithms in science and technology, Parallel numerical algorithms P.E. Kyziropoulos, C.K. Filelis-Papadopoulos, G.A. Gravvanis, pages 471 – 478. Show abstract Abstract. Over the last decades, multigrid methods have been used extensively in the solution of large sparse linear systems derived from the discretization of Partial Differential Equations in two or three space variables, subject to a variety of boundary conditions. Due to their efficiency and convergence behavior, multigrid methods are used in many scientific fields as solvers or preconditioners. Herewith, we propose a new algorithm for N-body simulation, based on the V-Cycle multigrid method in conjunction with Generic Approximate SParse Inverses (GenAspI). The N-body problem chosen is in toroidal 3D space and the bodies are subject only to gravitational forces. In each time step, a large sparse linear system is solved to compute the gravity potential at each nodal point, the solution is interpolated to each body, and the velocity Verlet method computes the new position, velocity and acceleration of each respective body. Moreover, a parallel version of the multigrid algorithm with a truncated approach in the parallel levels is utilized for the fast solution of the linear system. Furthermore, parallel results are provided which depict the efficiency and performance of the proposed multigrid N-body scheme.
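The time-stepping component mentioned above, the velocity Verlet method, is standard and can be sketched in a few lines (a one-dimensional toy with a supplied acceleration function, not the paper's 3D toroidal solver):

```python
def velocity_verlet(x, v, accel, dt, steps):
    # Velocity Verlet: second-order accurate, symplectic time integration.
    a = accel(x)
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt     # position update
        a_new = accel(x)                    # force at the new position
        v += 0.5 * (a + a_new) * dt         # velocity from averaged accelerations
        a = a_new
    return x, v
```

Its symplectic character is why long N-body runs keep their energy bounded instead of drifting, which explicit Euler would not guarantee.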
  • Storing Sparse Matrices to Files in the Adaptive-Blocking Hierarchical Storage Format
    21 Novel data formats for dense and sparse matrices, Parallel numerical algorithms Daniel Langr, Ivan Šimeček, Pavel Tvrdík, pages 479 – 486. Show abstract Abstract. When there is a need to store a sparse matrix in a file system, is it worth converting it first into some space-efficient storage format? This paper tries to answer this question for the adaptive-blocking hierarchical storage format (ABHSF), provided that the matrix is present in memory either in the coordinate (COO) or in the compressed sparse row (CSR) storage format. The conversion algorithms from COO and CSR to ABHSF are introduced and the results of the performed experiments are then presented and discussed.
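ABHSF itself is specific to the paper, but the kind of format conversion it discusses can be illustrated by the classical COO-to-CSR conversion, a single counting-sort pass over the nonzeros:

```python
def coo_to_csr(rows, cols, vals, n_rows):
    # Counting sort by row index: O(nnz + n_rows); entries assumed unique.
    row_ptr = [0] * (n_rows + 1)
    for r in rows:                       # count nonzeros per row
        row_ptr[r + 1] += 1
    for i in range(n_rows):              # prefix sums -> row start offsets
        row_ptr[i + 1] += row_ptr[i]
    col_idx = [0] * len(vals)
    data = [0.0] * len(vals)
    nxt = row_ptr[:-1]                   # next free slot for each row (a copy)
    for r, c, v in zip(rows, cols, vals):
        k = nxt[r]
        col_idx[k], data[k] = c, v
        nxt[r] += 1
    return row_ptr, col_idx, data
```

A conversion to a blocked hierarchical format follows the same pattern, except that nonzeros are first bucketed by block coordinates rather than by row.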
  • Schur Complement Domain Decomposition in conjunction with Algebraic Multigrid methods based on Generic Approximate Inverses
    200 Applications of numerical algorithms in science and technology, Numerical algorithms testing and benchmarking P.I. Matskanidis, G.A. Gravvanis, pages 487 – 493. Show abstract Abstract. For decades, Domain Decomposition (DD) techniques have been used for the numerical solution of boundary value problems. In recent years, the Algebraic Multigrid (AMG) method has also seen significant rise in popularity as well as rapid evolution. In this article, a Domain Decomposition method is presented, based on the Schur complement system and an AMG solver, using generic approximate banded inverses based on incomplete LU factorization. Finally, the applicability and effectiveness of the proposed method on characteristic two dimensional boundary value problems is demonstrated and numerical results on the convergence behavior are given.
  • 3D Non-Local Means denoising via multi-GPU
    87 Applications of numerical algorithms in science and technology, Languages, tools and environments for programming numerical algorithms, Numerical algorithms on GPUs, Parallel numerical algorithms Giuseppe Palma, Francesco Piccialli, Pasquale De Michele, Salvatore Cuomo, Marco Comerci, Pasquale Borrelli, Bruno Alfano, pages 495 – 498. Show abstract Abstract. The Non-Local Means (NLM) algorithm is widely considered a state-of-the-art denoising filter in many research fields. Its high computational complexity has led to implementations on Graphics Processing Unit (GPU) architectures, which achieve reasonable running times by filtering 3D datasets slice-by-slice with a 2D NLM approach. Here we present a fully 3D NLM implementation on a multi-GPU architecture and suggest its high scalability. The performance results we discuss encourage the coding of further filter improvements and the investigation of a large spectrum of applicative scenarios.
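For readers unfamiliar with the filter, a minimal 1D NLM sketch conveys the idea: each sample is replaced by a weighted average of samples whose surrounding patches look similar. The paper's contribution is a fully 3D multi-GPU version; the patch size, search window and smoothing parameter `h` below are illustrative choices:

```python
import math

def nlm_1d(signal, patch=1, search=3, h=0.5):
    # 1D Non-Local Means; indices are clamped at the borders.
    n = len(signal)
    out = []
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            # Squared distance between the patches around i and j.
            d = sum((signal[min(max(i + k, 0), n - 1)] -
                     signal[min(max(j + k, 0), n - 1)]) ** 2
                    for k in range(-patch, patch + 1))
            w = math.exp(-d / (h * h))   # similar patches get large weights
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out
```

The 3D case replaces the scalar windows with cubes of voxels, which multiplies the work per voxel and explains the need for GPUs.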
  • Examples of Ramanujan and expander graphs for practical applications
    331 Applications of numerical algorithms in science and technology Monika Polak, Vasyl Ustimenko, pages 499 – 505. Show abstract Abstract. Expander graphs are highly connected sparse finite graphs. The property of being an expander is significant in many mathematical, computational and physical contexts. Moreover, expanders are surprisingly applicable in other computational aspects: in the theory of error-correcting codes and the theory of pseudorandomness, which are used in probabilistic algorithms. In this article we present a method to obtain new examples of families of expander graphs, as well as some examples of Ramanujan graphs, which are the best expanders. We describe the properties of the obtained graphs in comparison to previously known results. The numerical computations of eigenvalues presented in this paper have been performed with MATLAB.
  • Performance Impact of Reconfigurable L1 Cache on GPU Devices
    389 Contemporary computer architectures, Numerical algorithms on GPUs, Numerical algorithms testing and benchmarking Sasko Ristov, Marjan Gusev, Leonid Djinevski, Sime Arsenovski, pages 507 – 510. Show abstract Abstract. The newest GPU Kepler architecture offers a reconfigurable L1 cache per Streaming Multiprocessor with different cache sizes and cache associativities. Both these cache parameters affect the overall performance of cache-intensive algorithms, i.e. algorithms which intensively reuse data. In this paper, we analyze the impact of different configurations of the L1 cache on the execution of the matrix multiplication algorithm for different problem sizes. The basis of our research is the existing theoretical analysis of performance drawbacks which appear for matrix multiplication executed on a multicore CPU. We perform a series of experiments to analyze the matrix multiplication execution behavior on the GPU and its set-associative L1 and L2 cache memory with three different configurations: cache sizes of 16KB, 32KB and 48KB with set associativities of 4, 4 and 6, respectively. The results show that mainly the L2 cache impacts the algorithm's overall performance, particularly the L2 capacity and set associativity. However, the configuration of the L1 cache with 48KB and 6-way set associativity slightly reduces these performance drawbacks, compared to the other configurations of L1 with 32KB and 16KB using 4-way set associativity, due to the greater set associativity.
  • Analyzing of Some Performance Measures for Parallel Matrix Multiplication
    240 Halil Snopce, Azir Aliu, pages 511 – 514. Show abstract Abstract. In order to make a proper selection for a given matrix-matrix multiplication operation and to decide which algorithm is best suited to generate high throughput in minimum time, a comparison analysis and a performance evaluation of several algorithms are carried out using identical performance parameters.
  • Template Library for Multi-GPU Pseudorandom Number Generation
    171 Libraries for numerical computations, Numerical algorithms on GPUs, Parallel numerical algorithms Dominik Szałkowski, Przemysław Stpiczyński, pages 515 – 519. Show abstract Abstract. The aim of the paper is to show how to design and implement fast parallel algorithms for Linear Congruential, Lagged Fibonacci and Wichmann-Hill pseudorandom number generators. The new algorithms employ the {\em divide-and-conquer} approach for solving linear recurrence systems. They are implemented on multi-GPU-accelerated systems using CUDA. Numerical experiments performed on a computer system with two Fermi GPU cards show that our software achieves good performance in comparison to the widely used NVIDIA CURAND library.
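For the Linear Congruential generator, the divide-and-conquer view of the recurrence amounts to jumping ahead k steps in O(log k) work, using the closed form x_{n+k} = a^k x_n + c (a^{k-1} + ... + a + 1) mod m; this is what lets each GPU thread generate an independent slice of the stream. A scalar sketch (not the paper's CUDA code):

```python
def lcg_jump(x, k, a, c, m):
    # Advance the LCG x -> (a*x + c) mod m by k steps in O(log k).
    def geom(a, k):
        # 1 + a + ... + a^(k-1)  (mod m), by halving the exponent.
        if k == 0:
            return 0
        if k % 2 == 0:
            return geom(a, k // 2) * (1 + pow(a, k // 2, m)) % m
        return (geom(a, k - 1) * a + 1) % m
    return (pow(a, k, m) * x + c * geom(a, k)) % m
```

With this primitive, thread t seeds itself at `lcg_jump(x0, t * chunk, a, c, m)` and then iterates sequentially within its chunk.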

International Symposium on Multimedia Applications and Processing

  • Design of Digital Watermarking System Robust to the Number of Removal Attacks
    145 Sergey Anfinogenov, pages 523 – 527. Show abstract Abstract. In this paper it is proved that the recently proposed heuristic zero-bit digital watermarking system based on local maxima embedding in the frequency domain is in fact resistant to a number of removal attacks. It is shown how the watermark can survive such transformations as shift, cropping, rescaling, rotation and JPEG transform. The theoretical basis of each transformation is given. It is also shown how the image Fourier amplitude spectrum is affected by image distortions and how the watermark can overcome those distortions and stay untouched.
  • A Robust Cattle Identification Scheme Using Muzzle Print Images
    121 Security in Multimedia Applications: Authentication and Watermarking Ali Ismail Awad, Hossam M. Zawbaa, Hamdi A. Mahmoud, Eman Hany Hassan Abdel Nabi, Rabie Hassan Fayed, Aboul Ella Hassanien, pages 529 – 534. Show abstract Abstract. Cattle identification receives great research attention as an important way to maintain livestock. Identification accuracy and processing time are two key challenges for any cattle identification methodology. This paper presents a robust and fast cattle identification scheme based on muzzle print images and local invariant features. The presented scheme compensates for some weaknesses of ear-tag and electrical-based traditional identification techniques in terms of accuracy and processing time. The proposed scheme uses the Scale Invariant Feature Transform (SIFT) for detecting the interest points used in image matching. For a more robust identification scheme, the Random Sample Consensus (RANSAC) algorithm has been coupled with the SIFT output to remove outlier points. The experimental evaluations prove the superiority of the presented scheme, as it achieves 93.3% identification accuracy in reasonable processing time, compared to the 90% identification accuracy achieved by some traditional identification approaches.
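The SIFT+RANSAC pairing rests on the generic RANSAC loop: sample a minimal set of correspondences, fit a model, count inliers, and keep the best model. A line-fitting toy (not the muzzle print pipeline; iteration count, tolerance and seed are illustrative) shows the loop:

```python
import random

def ransac_line(points, iters=200, tol=0.1, seed=1):
    # Robustly fit y = m*x + b: repeatedly sample 2 points, fit the exact
    # line through them, and keep the model with the most inliers.
    rng = random.Random(seed)
    best = (0, None)
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                       # degenerate sample, skip
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = sum(1 for x, y in points if abs(y - (m * x + b)) < tol)
        if inliers > best[0]:
            best = (inliers, (m, b))
    return best[1]
```

In the matching setting, the "model" is a geometric transform estimated from sampled SIFT correspondences, and outlier matches are discarded the same way.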
  • Logo identification algorithm for TV Internet
    24 Marta Chodyka, Volodymyr Mosorov, pages 535 – 540. Show abstract Abstract. Content inappropriate for children on Internet television is a serious problem in today's multimedia world. There are numerous methods used to control the content of transmitted television programmes. However, these well-known methods do not solve the above-mentioned problem completely. The paper presents a more effective method for the automatic identification of the provider’s logo based on an original image sequence analysis. The automatic identification of the provider’s logo can be used to block access to the video programmes of selected providers. The method has been tested on chosen on-line video transmissions, achieving over 98% correct identification.
  • Semantic Multi-layered Design of Interactive 3D Presentations
    416 Animation, Virtual Reality, 3D and Stereo Imaging, Human Computer Interaction and Interfaces in Multimedia Applications, Machine Learning, Data Mining, Information Retrieval in Multimedia Applications, Multimedia in Internet and Web Based Systems Jakub Flotyński, Krzysztof Walczak, pages 541 – 548. Show abstract Abstract. Dependencies between interactive 3D content elements are typically more complex than dependencies between standard web pages as they may relate to different aspects of the content---spatial, temporal, structural, logical and behavioural. The Semantic Web approach helps in making data understandable and processable for both humans and computers by providing common concepts for describing web resources. However, semantic concepts may be also used to improve the process of designing content. In this paper, a new approach to semantic multi-layered design of interactive 3D content is presented. The proposed solution provides a semantic representation of 3D content in multiple layers reflecting diverse aspects of 3D content. The presented solution conforms to well-established 3D content and semantic description standards and - therefore - may facilitate creation, dissemination and reuse of 3D content in a variety of application domains on the web.
  • Microformat and Microdata schemas for interactive 3D web content
    231 Jakub Flotyński, Krzysztof Walczak, pages 549 – 556. Show abstract Abstract. The paper presents new Microformat and Microdata schemas for creating descriptions of interactive 3D web content. Microformats and Microdata are increasingly popular solutions for creating lightweight attribute-based built-in semantic metadata of web content. However, although Microformats and Microdata enable basic description of media objects, they have not been intended for 3D content. Describing 3D components is more complex than describing standard web pages, as the descriptions may relate to different aspects of 3D content - spatial, temporal, structural, logical and behavioural. The main contributions of this paper are new Microformat and Microdata schemas for describing 3D web components and 3D scenes with metadata and semantic properties. The proposed schemas may be combined with X3D, a well-established 3D content description standard. Thanks to the use of standardized solutions, the presented approach facilitates widespread dissemination of 3D content for use in a variety of multimedia applications on the web.
  • Exploring inexperienced user performance of a mobile tablet application through usability testing
    124 Human Computer Interaction and Interfaces in Multimedia Applications Chrysoula Gatsou, Anastasios Politis, Dimitrios Zevgolis, pages 557 – 564. Show abstract Abstract. This paper explores inexperienced user performance through usability testing of three alternative prototypes of a mobile tablet application. One key factor in inexperienced users adopting mobile technology is the ease of use of mobile devices. The interface layout of one of the three prototypes was built on the basis of previous research conducted in collaboration with users. More specifically, our study involves five navigation tasks which novice users were required to complete with each of the three prototypes. Our results showed that participants displayed better task performance with prototype F1, which was created in collaboration with participants, than with prototypes F2 and F3, which both caused navigation problems.
  • Universal approach for sequential audio pattern search
    230 Audio, Image and Video Processing, Human Computer Interaction and Interfaces in Multimedia Applications, Machine Learning, Data Mining, Information Retrieval in Multimedia Applications, Multimedia File Systems and Databases: Indexing, Recognition and Retrieval Róbert Gubka, Michal Kuba, Roman Jarina, pages 565 – 569. Show abstract Abstract. This article deals with a universal method for sequential audio pattern search and sound recognition. Inspired by classical phoneme-based speech recognition and word spotting systems, where longer speech patterns are formed by sequences of basic speech units, we propose a methodology for creating a finite database of elementary sound models. These models can be arbitrarily concatenated into sequences, thus forming a model of the required acoustic pattern or sound event.
  • Dependence of Kinect sensors number and position on gestures recognition with Gesture Description Language semantic classifier
    88 Animation, Virtual Reality, 3D and Stereo Imaging, Audio, Image and Video Processing, Entertainment and games, Human Computer Interaction and Interfaces in Multimedia Applications, Machine Learning, Data Mining, Information Retrieval in Multimedia Applications Tomasz Hachaj, Marek R. Ogiela, Marcin Piekarczyk, pages 571 – 575. Show abstract Abstract. We have checked whether it is possible to increase the effectiveness of a standard tracking library (Kinect Software Development Kit) by fusing body joints gathered from different sensors positioned around the user. The proposed calibration procedure enables integration of skeleton data from a set of tracking devices into one skeleton, which eliminates many segmentation and tracking errors. The test set for our methodology consisted of 700 recordings of seven various Okinawa Shorin-ryu Karate techniques performed by a black belt instructor. In the case when the side Kinects were rotated by π/2 and −π/2 around the vertical axis relative to the central one, the number of unclassified Karate techniques dropped by 48% while the excessive misclassification error remained at the same level.
  • Automatic Identification of Broadcast News Story Boundaries using the Unification Method for Popular Nouns
    261 Zainab Ali Khalaf, Tan Tien Ping, pages 577 – 584. Show abstract Abstract. Herein we describe a latent semantic analysis (LSA) method for identifying broadcast news story boundaries. The proposed system uses the prosodic (pronounced) forms to identify story boundaries based on popular noun unification. Commonly used clustering methods employ LSA because of its excellent performance and because it is based on deep semantic rather than shallow principles. In this study, the LSA algorithm before and after unification was used to identify the boundaries of Malay spoken broadcast news stories. The LSA algorithm with the noun unification approach resulted in fewer errors and better performance than the LSA algorithm without noun unification.
  • Fingerprinting System for Still Images Based on the Use of a Holographic Transform Domain
    20 Security in Multimedia Applications: Authentication and Watermarking Valery Korzhik, Guillermo Morales-Luna, Alexander Kochkarev, Ivan Shevchuk, pages 585 – 590. Show abstract Abstract. We consider the watermarking method based on a holographic transform domain image proposed by A. Bruckstein. Our testing showed that it is not resistant to all the attacks declared by its inventor, even under the condition of a very high image quality just after WM embedding. Only a small part of the 120 bits embedded into the image has an acceptable error probability after extraction when some attacks occur. Therefore we propose to modify this system for fingerprinting, where only fixed bits are embedded into the most reliable places of the frequency mask. Systematic linear binary codes with a large minimal code distance are used in order to correct errors. Simulation showed that such a system provides sufficiently reliable tracing of ``traitors'' under most types of attacks intended to remove the WM, while keeping a good quality of the image just after embedding.
  • Real-time Implementation of the ViBe Foreground Object Segmentation Algorithm
    207 Audio, Image and Video Processing Tomasz Kryjak, Marek Gorgoń, pages 591 – 596. Show abstract Abstract. This paper presents a novel real-time hardware implementation of the ViBe (VIsual Background Extractor) background generation algorithm in a reconfigurable FPGA device. This method combines the advantages of typical recursive and non-recursive approaches and achieves very good foreground object segmentation results. In this work the issue of porting ViBe to an FPGA hardware platform is discussed, two modifications to the original approach are proposed and a detailed description of the implemented system is presented. This is, to the authors' knowledge, the first FPGA implementation of this algorithm.
  • Image Semantic Annotation using Fuzzy Decision Trees
    368 Machine Learning, Data Mining, Information Retrieval in Multimedia Applications Andreea Popescu, Bogdan Popescu, Marius Brezovan, Eugen Ganea, pages 597 – 601. Show abstract Abstract. Decision trees are among the methods most commonly used for learning and classification. The greatest advantage that fuzzy decision trees offer is that, unlike classical trees, they provide support for handling uncertain data sets. The paper introduces a new algorithm for building fuzzy decision trees and also offers comparative results with respect to other methods. We present a general overview of fuzzy decision trees and then focus on the newly introduced algorithm, pointing out that it can be a very useful tool for processing fuzzy data sets, as the comparative results show.
  • Architectural Redesign of a Distributed Execution Environment
    274 Distributed Multimedia Systems Cosmin M. Poteras, Mihai Mocanu, Marian Cristian Mihaescu, pages 603 – 610. Show abstract Abstract. This paper describes the architectural redesign of a distributed execution framework called State Machine Based Distributed System, which uses a state machine-based representation of processes in order to reduce application development time while providing safety and reliability. Initially the system was built on top of the .Net Framework employing static programming techniques and made use of a custom data storage. The new architecture is intended to take advantage of fast-growing technologies like dynamic languages and graph databases to further speed up application development and improve the dynamism of the execution model.
  • Color Classifiers for 2D Color Barcodes
    67 Marco Querini, Giuseppe F. Italiano, pages 611 – 618. Show abstract Abstract. 2D color barcodes have been introduced to obtain larger storage capabilities than traditional black and white barcodes. Unfortunately, the data density of color barcodes is substantially limited by the redundancy needed for correcting errors, which are due not only to geometric but also to chromatic distortions introduced by the printing and scanning process. The higher the expected error rate, the more redundancy is needed for avoiding failures in barcode reading, and thus, the lower the actual data density. Our work addresses this trade-off between reliability and data density in 2D color barcodes and aims at identifying the most effective algorithms, in terms of byte error rate and computational overhead, for decoding 2D color barcodes. In particular, we perform a thorough experimental study to identify the most suitable color classifiers for converting analog barcode cells to digital bit streams.
  • A Novel Portable Surface Plasmon Resonance Based Imaging Instrument for On-Site Multi-Analyte Detection
    437 Sara Rampazzi, Francesco Leporati, Giovanni Danese, Lucia Fornasari, Franco Marabelli, Nelson Nazzicari, Andrea Valsesia, pages 619 – 626. Show abstract Abstract. In the last decade the need for portable Surface Plasmon Resonance (SPR) biosensors capable of on-site simultaneous multiple assays has increased steadily. Several devices are available; however, they are affected by limitations in terms of cost, size, complexity and portability. A compact low-cost SPRi biosensor based on a novel method for multi-analyte detection is presented. The prototype consists of a nanohole array biochip integrated with compact optics and an elaboration system. A CMOS image sensor captures reflected light from the biochip surface irradiated by an 830 nm LED. The entire system is managed by an ARM9 processor. The biosensor was able to detect a ~10^-5 RIU change in the refractive index without analyte receptors at a glycerol concentration equal to 0.2%. Results are available in 14 seconds on an LCD display and immediately stored to an external SD memory. Preliminary experiments confirmed the biosensor's strong usability in a wide range of applications and fields.
  • A Score-Based Packet Retransmission Approach for Push-Pull P2P Streaming Systems
    249 Distributed Multimedia Systems Muge Sayit, Erdem Karayer, Kemal Deniz Teket, Yagiz Kaymak, Cihat Cetinkaya, Sercan Demirci, Geylani Kardas, pages 627 – 633. Show abstract Abstract. In this paper we propose an inference-based packet recovery technique which considers past scores indicating the retransmission success of the peers. Past scores are calculated by considering several parameters such as requested packet availability and round trip time. The importance of the packets to be retransmitted is also considered in the proposed model. In order to obtain comparable results, we also implement a different retransmission approach similar to the models proposed in the literature. The ns3 simulations show that the retransmission model increases the Peak Signal to Noise Ratio (PSNR) value even under high peer churn and a limited resource index. Furthermore, the score-based approach provides a decrease in reset counts and in the number of duplicate packets when compared to different retransmission approaches.
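
The ViBe algorithm implemented in hardware by Kryjak and Gorgoń above maintains, for every pixel, a small set of past background samples and a conservative random update policy. As a rough software illustration (grayscale only, simplified initialization, and none of the paper's FPGA-specific modifications), the core classification and update step might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

N, R, MIN_MATCHES, PHI = 20, 20, 2, 16  # classic ViBe parameters

def vibe_init(first_frame):
    """Fill each pixel's N-sample model from the first frame (simplified:
    we replicate the frame instead of sampling spatial neighbours)."""
    return np.repeat(first_frame[np.newaxis].astype(np.int16), N, axis=0)

def vibe_step(model, frame):
    """Classify one grayscale frame and update the model in place.

    A pixel is background if at least MIN_MATCHES of its N stored samples
    lie within radius R of the current value; background pixels then
    refresh one random sample with probability 1/PHI."""
    frame = frame.astype(np.int16)
    matches = (np.abs(model - frame) < R).sum(axis=0)
    foreground = matches < MIN_MATCHES
    # Conservative random update: only background pixels feed the model.
    update = (~foreground) & (rng.integers(0, PHI, frame.shape) == 0)
    slot = rng.integers(0, N)
    model[slot][update] = frame[update]
    return foreground
```

With this sketch, a static frame is classified entirely as background, while a pixel that suddenly deviates by more than R from all stored samples is flagged as foreground.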

Doctoral Symposium on Recent Advances in Information Technology

  • Inexact Newton method as a tool for solving Differential-Algebraic Systems
    120 Automatic Control and Robotics, Numerical Analysis, Scientific Computing Paweł Drąg, Krystyn Styczeń, pages 639 – 642. Show abstract Abstract. The inexact Newton method is commonly known for its ability to solve large-scale systems of nonlinear equations. In the paper the classical inexact Newton method is presented as a tool for solving differential-algebraic systems in the fully-implicit form F(dy/dt, y, t) = 0. An appropriate restatement of the DAEs using the Backward Euler method makes it possible to view the differential-algebraic system as a large-scale system of nonlinear equations. Because the choice of forcing terms in the inexact Newton method significantly affects the convergence of the algorithm, new variants of the inexact Newton method are presented and tested. The simulations were executed in the Matlab environment using the Wroclaw Centre for Networking and Supercomputing.
  • On some quality criteria of bipolar linguistic summaries
    434 Computational Intelligence, Data Mining and Data Visualization, Database Management Systems Mateusz Dziedzic, Janusz Kacprzyk, Sławomir Zadrożny, pages 643 – 646. Show abstract Abstract. The quality measures for bipolar linguistic summaries of data, as proposed in our previous work [1], are further developed. The summaries introduced in [2] are assumed to be an extension of the “classical” linguistic summarization (cf. [3], [4]), a human-consistent data mining technique revealing complex patterns present in data. This extension consists in using the “and possibly” connective to build a summary and introducing the notion of context to determine the validity of the summary. We present a more detailed description of the summaries' quality measures/criteria and report the results of more extensive computational experiments.
  • A computational support for the group consensus reaching process in the fuzzy environment
    282 Cognitive Science, Computational Intelligence, Expert Systems, Natural Language Processing Janusz Kacprzyk, Dominika Gołuńska, Andrzej Gorgoń, pages 647 – 650. Show abstract Abstract. In this paper we present an intelligent consensus reaching support system for a group of individuals under fuzzy preferences and fuzzy majority. Our solution is based on the idea of a soft degree of consensus proposed by Fedrizzi, Kacprzyk, Nurmi and Zadrożny, which is meant as the statement: “most of the individuals agree with most of the options”. Our new comprehensive model provides effective support for discussion guidance in the form of quantitative indices, i.e. sensitivity of individuals, option consensus degree and the cost of preference changes. These additional measures support and simplify the consensus reaching process and improve the degree of total agreement among decision makers.
  • Linguistic knowledge about temporal data in Bayesian linear regression model to support forecasting of time series
    295 Computational Intelligence, Data Mining and Data Visualization, Expert Systems, Natural Language Processing, Pattern Recognition Katarzyna Kaczmarek, Olgierd Hryniewicz, pages 651 – 654. Show abstract Abstract. Experts are able to predict sales based on approximate reasoning and subjective beliefs related to market trends in general, but also to imprecise linguistic concepts about time series evolution. Linguistic concepts are linked with demand and supply, but their dependencies are difficult to capture via traditional methods for crisp data analysis. There are data mining techniques that provide linguistic and easily interpretable knowledge about time series datasets, and there is a wealth of mathematical models for forecasting. Nonetheless, the industry still lacks tools that enable an intelligent combination of those two methodologies for predictive purposes. Within this paper we incorporate imprecise linguistic knowledge in the forecasting process by means of linear regression. Bayesian inference is performed to estimate its parameters and generate posterior distributions. The approach is illustrated by experiments on real-life sales time series from the pharmaceutical market.
  • Improving the accessibility of touchscreen-based mobile devices: Integrating Android-based devices and Braille notetakers
    132 Daniel Kocieliński, Jolanta Brzostek-Pawłowska, pages 655 – 658. Show abstract Abstract. The article presents the concept and pilot implementation of wireless (Bluetooth-based) integration of the Braille notetaker environment and the environment of touchscreen-based devices (such as smartphones) operating under the Android system. Advanced functions of Android-based devices are hardly accessible to the blind using a touchscreen; one aim of such integration is to enable accessing them with a notetaker. Another is to allow the blind who work with notetakers on a daily basis and use common touchscreen-based smartphones and tablets to write using the physical Braille keyboard of a notetaker as well as its editing functions; this would solve many problems encountered and prevent numerous errors made by the blind when using the virtual QWERTY keyboard of a touchscreen-based device. Pilot implementation of the concept included developing a communication protocol for a notetaker operated under Windows CE and an Android-based smartphone; services to be provided to notetakers by smartphones have been developed as well. The implemented services dealt with managing contacts and composing messages – operations that normally require considerable interaction with a QWERTY keyboard. Favourable results of initial research on the pilot implementation conducted among the blind indicate a need for further development of this concept.
  • A Hybrid Approach of System Security for Small and Medium Enterprises: combining different Cryptography techniques
    193 Vladescu Marius, Mateescu Georgiana, pages 659 – 662. Show abstract Abstract. Information protection is one of the most important issues in every domain, especially when we are talking about enterprises. Information safety can be translated into three key terms: integrity, availability and data protection. There are many means used to achieve the three objectives simultaneously. The most popular is cryptography, because it offers many techniques which nowadays are practically impossible to break. In this paper we want to prove this efficiency by comparing the different types of crypto algorithms and presenting their weaknesses and strengths. To maximize the benefits of all the crypto techniques, we propose a hybrid approach that combines three crypto algorithms.
  • Impact of Signalling Load on Response Times for Signalling over IMS Core
    386 Computer Networks Lubos Nagy, Jiri Hosek, Pavel Vajsar, Vit Novotny, pages 663 – 666. Show abstract Abstract. This article focuses on the performance evaluation of the response time for signalling through a home Internet Protocol based Multimedia Subsystem (IMS), separately for each of the IMS core nodes (Proxy-Call Session Control Function, Interrogating-CSCF, Serving-CSCF and Home Subscriber Server), and then on the investigation of the trend-line functions and their equations to describe these delays for various measured intensities of signalling load generated by the high-performance tool IxLoad. In this article, we have found the trend-line function of response times for each measured message. Thanks to the presented results, performance parameters like the delay in a selected IMS core node and their behaviour can be predicted and evaluated.
  • Creating a Serial Driver Chip for Commanding Robotic Arms
    53 Automatic Control and Robotics, Information Theory, Software Engineering Roland Szabó, Aurel Gontean, pages 667 – 670. Show abstract Abstract. In this paper we shall present the creation of a serial driver chip on an FPGA. We created this serial driver chip for the RS-232 interface and programmed it for a 115200 baud rate to be able to communicate with the Lynxmotion AL5 type robotic arms. This serial driver chip was made for bidirectional communication, to be able to send and receive SCPI (Standard Commands for Programmable Instruments) commands over the serial interface. If we create the layout of this chip we can create our own ASIC (Application-Specific Integrated Circuit), and this way we shall have a standalone chip which can control a robotic arm.
  • Fuzzy-Based Multi-Stroke Character Recognizer
    343 Computational Intelligence, Image Processing and Computer Animation, Pattern Recognition Alex Tormási, László T. Kóczy, pages 671 – 674. Show abstract Abstract. In this paper an extension of the FUzzy BAsed handwritten character Recognition (FUBAR) algorithm for multi-stroke character recognition will be presented. First the basic concept of the single-stroke version will be overviewed; in the second part of the paper the new version of the algorithm with multi-stroke symbol support will be introduced, which deploys the same algorithm overviewed in the first part and uses flat and hierarchical rule bases.
  • Image Recognition System for the VANET
    380 Computer Networks, Image Processing and Computer Animation Štefan Toth, Ján Janech, Emil Kršák, pages 675 – 678. Show abstract Abstract. This paper describes a system for the recognition of objects in a traffic scene from multiple moving vehicles. The system is based on query and image processing. It allows image recognition along with a spatial relation. A user or a machine makes a query in which descriptions of the searched objects are defined. The query is then sent to vehicles in order to be processed in real time. If any vehicle recognizes an object of interest given by the query, the answer is returned to the query author.
  • Simulation of energy consumption in a microgrid for demand side management by scheduling
    179 Computational Intelligence, Scientific Computing, Software Engineering Weronika Radziszewska, Zbigniew Nahorski, pages 679 – 682. Show abstract Abstract. Energy management systems (EMS) are necessary when smart grids and microgrids are considered. Simulation of energy consumption is very useful in planning and testing such systems. In this article we present the problems of simulating energy consumption and show the architecture of a very general load simulator. The simulator can generate time series of consumption from fixed profiles and also from defined rules describing the use of energy by the devices. The rules describe the probabilistic distribution of the device behaviour. The architecture of the implementation is also presented.
  • Evolutionary Nonlinear Data Transformation for Visualization and Classification Tasks
    430 Data Mining and Data Visualization, Machine Learning, Pattern Recognition Kamil Ząbkiewicz, pages 683 – 685. Show abstract Abstract. In this paper we propose a new approach to data set dimensionality reduction. We use the classical principal component analysis transformation. Instead of rejecting features we generate new ones by using a nonlinear feature transformation. The values of the transformation weights are changed evolutionarily by using genetic algorithms. Results show better classification rates in a smaller feature space. Visualization results also look better.
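
The Backward Euler restatement mentioned in the inexact Newton abstract of Drąg and Styczeń above turns each integration step of F(dy/dt, y, t) = 0 into a nonlinear system in the new state. A minimal sketch (plain Newton with a finite-difference Jacobian rather than the paper's inexact variants with forcing terms; the example DAE is illustrative, not from the paper) could look like:

```python
import numpy as np

def backward_euler_dae(F, y0, t0, t1, n_steps, newton_iters=20, tol=1e-10):
    """Integrate F(dy/dt, y, t) = 0 with Backward Euler; each step becomes
    a nonlinear system G(y_new) = F((y_new - y)/h, y_new, t_new) = 0,
    solved here by Newton's method with a finite-difference Jacobian."""
    h = (t1 - t0) / n_steps
    y, t = np.array(y0, float), t0
    for _ in range(n_steps):
        t_new = t + h
        G = lambda z: F((z - y) / h, z, t_new)
        z = y.copy()                      # predictor: previous state
        for _ in range(newton_iters):
            g = G(z)
            if np.linalg.norm(g) < tol:
                break
            # Finite-difference approximation of the Jacobian of G at z
            eps, n = 1e-8, z.size
            J = np.empty((n, n))
            for j in range(n):
                dz = np.zeros(n); dz[j] = eps
                J[:, j] = (G(z + dz) - g) / eps
            z = z - np.linalg.solve(J, g)
        y, t = z, t_new
    return y

# Example DAE: y1' = -y1 (differential part), y1 + y2 = 0 (algebraic part)
F = lambda yp, y, t: np.array([yp[0] + y[0], y[0] + y[1]])
```

For this example with consistent initial values [1, -1], the computed y1(1) approaches e^-1 as the step size shrinks, and the algebraic constraint y1 + y2 = 0 is satisfied to the Newton tolerance at every step.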

Information Systems Education & Curricula Workshop

  • Towards improved student placement and preparation methods on Information Technologies post-secondary education
    327 Ghadah A. Aldabbagh, Jaime Ramirez Castillo, Habib M. Fardoun, pages 689 – 693. Show abstract Abstract. In this article we present the results of a pilot programme for student placement at King Abdulaziz University, comprised of a preparation course, the GCE Ordinary Level test in Computing, and a post-course questionnaire. The aim of the research programme is to identify weaknesses in the student placement tests and set the road map for improved first-time entrant placement and preparation. A review of previous studies shows that common placement tests are far from giving strong predictions and should be complemented with other metrics, such as high school grades or social factors. Test outcomes show that placement test results per se do not yield enough data to predict student success. However, we found the test to be a quite helpful tool for revealing anomalies at the institutional and methodological level, such as very different outcomes among campuses of the same university, or remarkable difficulty in answering certain questions. In order to enhance student placement accuracy and preparation for university, these issues will need to be addressed in forthcoming research.
  • Reduction of the SEEQ Questionnaire
    29 Montserrat Corbalan, Inmaculada Plaza, Eva Hervas, Emiliano Aldabas-Jordi Zaragoza, Francisco Arcega, pages 695 – 701. Show abstract Abstract. Assessment of students and the evaluation of their satisfaction have been important elements in the improvement of teaching quality in all Higher Education areas. Specifically, student participation in Computer Science (CS) and Information Systems (IS) has been highly valued. Thus a large number of methodologies and standard tools regarding student evaluation have been developed. Specifically, the Students' Evaluation of Education Quality (SEEQ) questionnaire is a tool that is validated for international use. But its use leads to several problems, such as the low voluntary participation of students. To solve these problems, a short version of this questionnaire, developed using statistical tools, is proposed. After using the proposed new version, the voluntary participation of students increased. The reduction of the number of questions facilitates the analysis of data, improving the flow of information and feedback between professors and students.
  • Tutor Platform for Vocational Students Education
    353 Assessment of students, Curriculum organization and curriculum, Innovative teaching methods, Training for career and skills development Habib M. Fardoun, Antonio Paules Cipres, Abdullah Saad AL-Malaise AL-Ghamdi, pages 703 – 707. Show abstract Abstract. The current characteristics of technical education and dropout rates, set against students' motivation, remind us of the need for a system that allows students to work towards the ultimate goal of the training: "Incorporating vocational students into the workforce". In this paper we present a learning platform made for the strengthening of vocational students and college students; the goal is also to provide students, faculty and staff a system to control and acquire new qualifications based on the official curriculum and work experience. The improvement of the student's curriculum through the use of this platform is founded on strengthening and improving the areas that teachers determine necessary for each student. In the end we obtain a system that contains official curricula based on the professional qualifications and professional skills of workers.
  • New Subject to improve the Educational System: Through the Communication between Educational Institution-Company
    333 Assessment of students, Curriculum organization and curriculum, Innovative teaching methods, Social and environmental commitment, Training for career and skills development Habib M. Fardoun, Abdulfattah S. Mashat, Lorenzo C. Gonzaléz, pages 709 – 712. Show abstract Abstract. Once students finish their studies in the current educational system, there exists a big gap between the acquired knowledge and the knowledge needed to start a desired job. This causes certain confusion and the necessity of an adaptation period so that the new worker, the former student, can develop his work properly. For all of this, we propose in this paper the inclusion of a new subject into the educational system, which will serve as a nexus between institutions and companies. This subject focuses on the necessities of both: the companies, to hire new workers; and the students, to learn about what they are going to work on at the end of their studies. Thus, depending on the specific characteristics of the job, educational institutions and companies may communicate to establish the plan to follow. In this way companies will propose the basic milestones that students have to achieve to become workers with the required skills. As a result we obtain students who at the end of their studies have the knowledge needed to begin their working life without the problems related to ignorance of the work to be done.
  • Improving Learning Methods through Adding Student’s Judgment within Teacher’s curricula
    330 Adaptation to the European Higher Education, Assessment of students, Curriculum organization and curriculum, Innovative teaching methods, Quality and evaluation of teaching, Student participation in research Habib M. Fardoun, Daniyal M. Alghazzawi, Lorenzo C. Gonzaléz, pages 713 – 716. Show abstract Abstract. Nowadays, in university curricula, teachers do not address class teaching methods and their efficiency on students' education and learning level. We must take into account that a teacher is not only defined by his knowledge about a specific field but also by the way he passes it on. For that reason, we propose the inclusion of a section inside the teacher's curricula expressly dedicated to the students' opinion about the class teaching performance. This information is obtained through surveys and is displayed graphically with the goal of localizing the aspects which the teacher must improve and of maintaining a historic register that helps to check progression. In addition, it will serve the competent educational bodies in knowing the different skills of their employees.
  • IS (ICT) and CS in Civil Engineering Curricula: Case Study
    332 Assessment of students, Convergence between IS & CS in higher Education, Curriculum organization and curriculum, Innovative teaching methods, Merge from IS to CS and vice versa in higher Education, Quality and evaluation of teaching R. Robert Gajewski, Lech Własak, Marcin Jaczewski, pages 717 – 720. Show abstract Abstract. The paper presents a case study: Information Systems and Computer Science in Civil Engineering curricula. The introduction gives the historical background of the present role and position of IS, Information & Communication Technologies (ICT) and CS. Later, details of the course Information Technologies (IT) are presented, in which elements of IS (ICT) and CS are combined. The use of a spreadsheet and its Solver, as well as a Computer Algebra System (CAS), is stressed. Special attention is given to lectures, which provide the necessary theoretical background for classes. Additionally, some remarks are presented about the subject Computing in Civil Engineering (CCE), which is a natural successor of the IT course. The paper is illustrated by the results of questionnaires administered at the beginning and at the end of the semester.
  • Testing the perception of time, state and causality to predict programming aptitude
    440 Assessment of students José Paulo Leal, pages 721 – 726. Show abstract Abstract. The aim of the research presented in this paper is the development of a novel approach to predict programming aptitude. The existing programming aptitude tests rely on the past academic performance of students, on their psychological features or on a combination of both. The novelty of the proposed approach is that it attempts to measure student capabilities to manipulate abstract concepts that are related to programming, namely time, state and causality. These concepts were captured in OhBalls - a physical simulation of the path taken by a sequence of balls through an apparatus of conveyor belts and levers. An engine for this kind of simulation was implemented and deployed as a web application, creating a self-contained test that was applied to a cohort of first-year undergraduate students to validate the proposed approach. This paper describes the proposed type of programming aptitude test, a software engine implementing it and a validation experiment, discusses the results obtained so far and points out future research.
  • Drawer: an Innovative Teaching Method for Blended Learning
    232 Assessment of students, Innovative teaching methods Félix Albertos Marco, Víctor M.R. Penichet, José Antonio Gallud Lázaro, pages 727 – 734. Show abstract Abstract. During the last decade there has been a shift in the way the learning process is conducted. One of the main reasons is that technology is changing. Due to this fast movement, concepts like “class”, “workgroup” and “learning process” are changing too. Learning processes are going beyond the boundaries of what was known as the “class”. Face-to-face models get mixed with online environments where students are remotely connected through the Internet. This new approach is called blended learning, and it is aimed at improving learning as well as bringing learning where it was previously impossible or complicated. Nevertheless, one of the main issues is that teachers need innovative tools that support these different learning models. As a consequence, this work is focused on the development of a tool for dealing with the main issues found in blended learning scenarios. It is divided into three phases. First, the blended learning experiences and models of the last decade are reviewed. In the second phase, a tool called Drawer, supporting the main features of the design and use of a blended learning experience, is developed. In the last phase, an evaluation is made to assess the outcomes of the new tool.
  • Computer Science E-Courses for Students with Different Learning Styles
    285 Olga Mironova, Tiia Rüütmann, Irina Amitan, Jüri Vilipõld, Merike Saar, pages 735 – 738. Show abstract Abstract. E-learning is a contemporary teaching tool that has become popular and widely used in engineering education in recent years. This article presents the outcomes of a study on considering students' different learning styles in teaching information and communication technology using e-learning. Students were divided into two study groups. The reference group studied according to a provided learning model which included both theoretical educational material and practical assignments. Students of the test group were divided according to their learning styles using the Felder-Silverman model. Different relevant learning models, which included the same theoretical material and practical assignments, were designed for students of the test group based on their learning styles. The results of the study proved that learning materials designed taking into account students' different learning styles considerably improved the achievement of the learning outcomes. A detailed description and analysis of the study is presented in the article.
  • HEQAM: A Developed Higher Education Quality Assessment Model
    10 Amin Y. Noaman, Abdul Hamid M Ragab, Ayman G. Fayoumi, Ahmed M. Khedra, Ayman. I. Madbouly, pages 739 – 746. Show abstract Abstract. This paper presents a developed higher education quality assessment model (HEQAM) at King Abdulaziz University (KAU), because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education. Besides, there are shortcomings in the coverage of some current educational quality standards. A questionnaire developed to examine the quality criteria at KAU is investigated. The analytic hierarchy process is used to identify the priorities and weights of the criteria and their alternatives. The model is constructed of three levels including eight main objectives and 53 alternatives. It includes e-services criteria, which are among the recent university components, in addition to new sub-criteria for enhancing the model. It produces important recommendations to KAU higher authorities for achieving the demanded quality of services. It also helps KAU to achieve one of its strategic objectives: to be a paperless virtual university.
  • Computer Modelling of Cognitive Processes
    189 Innovative teaching methods, Quality and evaluation of teaching Nina Rizun, pages 747 – 750. Show abstract Abstract. A brief description is given of the author's results in developing a concept of computer modelling of cognitive processes (CP), on the basis of improving the methodology and expanding the area of use of computer-based testing technology in education. The fundamental heuristics for formalizing the following are presented: the concept of the degree of difficulty of test tasks (TT); the concept of the degree of confidence of an individual's CP; the concept of stability modes of an individual's CP and CP phases during the working time; and the concept of the target level of the test session. Heuristic algorithms for intellectual express and expanded analysis of TT quality are developed. An algorithm for obtaining an individual's cognitive profile for the formation and adequate interpretation of individual intellectual characteristics is offered. The concept of the technical implementation of the CP computer model in the informational learning environment is formulated.
  • Hands-On Exercises to Support Computer Architecture Students Using EDUCache Simulator
    318 Curriculum organization and curriculum, Innovative teaching methods Sasko Ristov, Blagoj Atanasovski, Marjan Gusev, Nenad Anchev, pages 751 – 758. Show abstract Abstract. The EDUCache simulator \cite{Atanasovski:2013} was developed as a learning tool for undergraduate students enrolled in the Computer Architecture and Organization course. It explains the details of the processor and the exploitation of its cache memory. This paper shows a set of laboratory exercises and several case studies with examples of how to use the EDUCache simulator in the learning process. These hands-on laboratory exercises can also be used in learning software performance engineering and to increase students' willingness to learn more hardware-based courses in their further studies.
  • Concept of competence management system for Polish National Qualification Framework in the Computer Science area
    216 Przemysław Różewski, Bartłomiej Małachowski, Piotr Dańczura, pages 759 – 765. Show abstract Abstract. This article analyses the literature on processing competences in education, as well as competence management systems (CMS) and their role in developing competences for students of the higher education cycle. The Bologna Process and its results are described later in the text, explaining the need for National Qualification Frameworks and the benefits that they can produce when implemented correctly. We focus on creating the basis for a competence management system for the Polish National Qualification Framework in the Computer Science area: how it should work and how it should be implemented.

2nd International Symposium on Frontiers in Network Applications, Network Systems and Web Services

  • Genetic Algorithms with Different Feature Selection Techniques for Anomaly Detectors Generation
    106 Anomaly and intrusion detection Amira Sayed A. Aziz, Ahmad Taher Azar, Mostafa A. Salama, Aboul Ella Hassanien, Sanaa El Ola Hanfy, pages 769 – 774. Show abstract Abstract. Intrusion detection systems have been around for quite some time, to protect systems from inside and outside threats. Researchers and scientists are concerned with how to enhance intrusion detection performance, to be able to deal with real-time attacks and detect them fast enough for a quick response. One way to improve performance is to use a minimal number of features to define a model in a way that it can be used to accurately discriminate normal from anomalous behaviour. Many feature selection techniques exist to reduce feature sets or extract new features out of them. In this paper, we propose an anomaly detector generation approach using a genetic algorithm in conjunction with several feature selection techniques, including principal components analysis, sequential floating selection, and correlation-based feature selection. The genetic algorithm was applied with the deterministic crowding niching technique, to generate a set of detectors from a single run. The results show that the sequential floating techniques with the genetic algorithm have the best results compared to the others tested, especially sequential floating forward selection, with a detection accuracy of 92.86% on the training set and 85.38% on the test set.
  • How to Develop a Biometric System with Claimed Assurance
    239 other theoretical and practical aspects of authentication systems and methods, security of authentication systems Andrzej Bialas, pages 775 – 780. Show abstract Abstract. The article concerns the process of developing biometric devices with a view to submitting them for certification in compliance with the ISO/IEC 15408 Common Criteria. The author points to the assurance paradigm, which shows that the source of assurance is a rigorous product development process along with methodical and independent evaluation in an accredited laboratory. The state of the art of certified biometric devices is discussed, with some focus on the insufficient support that developers get in this respect. The basic processes related to the Common Criteria methodology are described (IT security development, IT product development, IT product evaluation) and illustrated by elements of the security specifications of certified biometric devices. The author proposes that development patterns be used to prepare the evidence material, and that specialized tools supporting the development processes be used to deal with the basic difficulties encountered by the developers of biometric devices.
  • Real-Time Carpooling and Ride-Sharing: Position Paper on Design Concepts, Distribution and Cloud Computing Strategies
    455 Context-aware Web services, Mobile applications, Network and mobile GIS platforms and applications, Semantic Web services, Software agents for Web services composition Dejan Dimitrijević, Vladimir Dimitrieski, Nemanja Nedić, pages 781 – 786. Show abstract Abstract. Many carpool and ride-sharing solutions have been proposed and even developed in previous decades, but rarely have they been able to attain a global user base, at least not until recently. That was mostly because many of them were not initially designed to be scalable, leaving their users with a sub-par user experience as the user base grew; often their mobile or desktop client reach was not ubiquitous enough, leaving them available only to a small portion of mobile client devices and/or desktop browsers. This paper describes the design concepts and the distribution and cloud computing strategies the authors feel any future global carpool and ride-sharing solution could follow, making it scalable and ubiquitous enough to successfully reach and serve a global user base.
  • Emerging technologies for interactive TV
    176 Mobile applications, Service delivery platforms - architecture and applications, Standards for Web services, Technical and social aspects of Open API and open data, Telecommunication operators API exposition in Telco 2.0 model Marek Dąbrowski, pages 787 – 793. Show abstract Abstract. Advances in web services and open network interfaces enable the development of new user-oriented applications in different areas of digital life. Among them, digital entertainment is one of the fastest growing networked application domains. Technologies like CDNs (Content Delivery Networks) and HTTP streaming have opened the way for new models of TV and video consumption over the Internet. Thanks to the emergence of new networking technologies and cooperation paradigms, the distinction between traditional TV and the Web is slowly blurring, creating a new category of TV entertainment which is more interactive, is delivered over multiple screens, and invites users to participate and collaborate. This paper presents research on selected social and interactive TV services, focusing on innovations related to the incorporation of network interfaces and web protocols. Two use-case scenarios are investigated: one explores Telco 2.0 interfaces for interactive videos, and the other allows users to engage in social exchanges while watching live sports events. Evaluation results from a qualitative user study are presented for the second use case. A technical architecture is introduced, focused on seamless integration into a typical WebTV platform of a telco operator. As an important contribution of the paper, some new emerging network protocols and features (like Telco 2.0, WebRTC, WebSockets, and HTML5 video) are evaluated as potential enabling technologies for future social and interactive TV.
  • Communication in Distributed Database System in the VANET Environment
    325 Network-based computing systems, Wireless communications Ján Janech, Štefan Toth, pages 795 – 799. Show abstract Abstract. This paper describes the principles of data communication in the distributed database system AD-DB developed by the authors. The database system is designed to function properly in a network as complex and dynamic as a VANET. In that way, vehicles connected to the VANET can distribute traffic-related data to other vehicles.
  • Content Delivery Network Monitoring with Limited Resources
    366 Control of networks, High-speed network traffic processing, Network aspects of Cloud Computing, Service delivery platforms - architecture and applications Krzysztof Kaczmarski, Marcin Pilarski, Bogdan Banasiak, Christophe Kabut, pages 801 – 805. Show abstract Abstract. This article presents the results of designing a Content Delivery Network monitoring system for resource-limited applications. CDN monitoring is important both for content providers (media companies) and administrators (Internet Service Providers). It is a challenging task since network traffic may generate huge volumes of data which must be parsed and analysed in real time. This paper describes the design of a prototype system that uses a scalable Big Data solution with a small resource footprint, motivated by real-world use cases.
  • The control on-line over TCP/IP exemplified by communication with automotive network
    144 Control of networks, Wireless communications Grzejszczyk Elżbieta, pages 807 – 810. Show abstract Abstract. One of the more important stages in the development of wireless networks was the development of protocols, procedures and systems providing packet data transmission. Packet data transmission allows us to send sets of measurement data, as well as other information, over long distances, and thus to integrate with other available networks, for example the Internet. The aim of the present article is to demonstrate the implementation of communication algorithms for the networks in question, as exemplified by communication with a car's on-board network. The part of the article dealing specifically with the functionality of the discussed transmission types contains a description of remote control exemplified by low-power executive devices.
  • How to use the TPM in the method of secure data exchange using Flash RAM media
    105 Network security, Security issues in Cloud Computing, The applications of intelligent techniques in network systems Janusz Furtak, Tomasz Pałys, Jan Chudzikiewicz, pages 811 – 818. Show abstract Abstract. This document describes how to use the Trusted Platform Module (TPM) in a method of secure transmission of data stored on Flash RAM through an insecure transport channel. In this method the sender of the file specifies the recipient, and the recipient knows who the sender of the file is. The idea of a solution that uses symmetric and asymmetric encryption is described. The TPM is used to safely generate symmetric and asymmetric keys, and for their safe collection, storage and management, in order to protect files during transfer. The way of organizing data in a cryptographic keystore for users authorized to use the system for the secure transmission of files stored in Flash RAM is also described.
  • LocFusion API – Programming Interface for Accurate Multi-Source Mobile Terminal Positioning
    170 Context-aware Web services, Mobile applications, Network-based computing systems, Telecommunication operators API exposition in Telco 2.0 model Piotr Korbel, Piotr Wawrzyniak, Sebastian Grabowski, Dorota Krasińska, pages 819 – 823. Show abstract Abstract. The aim of this paper is to present a prototype LocNet API programming interface for indoor positioning systems and a prototype LocFusion API interface enabling the joint use of terminal positioning data from the mobile operator's GMLC and from the LocNet API. The use of data from complementary information sources can improve the accuracy of user terminal positioning in large buildings, where the coverage of satellite systems is weak.
  • Mobile Applications Aiding the Visually Impaired in Travelling with Public Transport
    228 Context-aware Web services, Mobile applications, Network-based computing systems Piotr Korbel, Piotr Skulimowski, Piotr Wasilewski, Piotr Wawrzyniak, pages 825 – 828. Show abstract Abstract. The paper presents a set of mobile applications aiding the visually impaired in using public transport. A user equipped with a modern smartphone with mobile data transmission and positioning capabilities can access location-related context information. Keeping up the connection with dedicated system servers gives the user access to additional services, e.g. it enables the use of a passenger information system and provides access to services facilitating navigation in urban areas. The paper describes the overall architecture of the system for guidance and public transport assistance of the visually impaired. Then, the details of the applications developed for Android-based smartphones are presented. The applications are mainly focused on aiding urban navigation and provide various ways of accessing data from the public transport passenger information system.
  • Towards networks of the future: SDN paradigm introduction to PON networking for business applications
    244 Paweł Parol, Michał Pawłowski, pages 829 – 836. Show abstract Abstract. The paper is devoted to the consideration of an innovative access network dedicated to B2B (Business To Business) applications. We present a network design based on a passive optical LAN architecture utilizing proven GPON technology. The major advantage of the solution is the introduction of the SDN paradigm to PON networking. Thanks to such an approach, the network configuration can be easily adapted to business customers’ demands and needs, which can change dynamically. The proposed solution provides a high level of service flexibility and supports sophisticated methods allowing user traffic to be forwarded in an effective way within the considered architecture.
  • Are Graphical Authentication Mechanisms As Strong As Passwords?
    187 Karen Renaud, Peter Mayer, Melanie Volkamer, Joe Maguire, pages 837 – 844. Show abstract Abstract. The fact that users struggle to keep up with all their (textual) passwords is no secret. Thus, one could argue that the textual password needs to be replaced. One alternative is graphical authentication. A wide range of graphical mechanisms have been proposed in the research literature, yet industry has not embraced these alternatives. Nowadays we use (textual) passwords several times a day to mediate access to protected resources and to ensure that accountability is facilitated. Consequently, the main aspect of interest to decision-makers is the strength of an authentication mechanism to resist intrusion attempts. Yet researchers proposing alternative mechanisms have primarily focused on the users’ need for superior usability, while the strength of the mechanisms often remains unknown to the decision makers. In this paper we describe a range of graphical authentication mechanisms and consider how much strength they exhibit in comparison to the textual password. As basic criteria for this comparison, we use the standard guessability, observability and recordability metrics proposed by De Angeli et al. in 2005. The intention of this paper is to provide a better understanding of the potential for graphical mechanisms to be equal, or superior, to the password in terms of meeting its most basic requirement, namely resisting intrusion attempts.
  • Tests of Smartphone Localization Accuracy Using W3C API and Cell-Id
    251 Context-aware Web services, Mobile applications, Network and mobile GIS platforms and applications, Service delivery platforms - architecture and applications, Standards for Web services, Technical and social aspects of Open API and open data, Telecommunication operators API exposition in Telco 2.0 model, Wireless communications Grzegorz Sabak, pages 845 – 849. Show abstract Abstract. Location based services (LBS) are considered very relevant to the users of mobile networks. Local events and facts related to the area nearby seem to be more important than those which happen in remote places. Localization data is used in all types of services: weather, traffic, tourist information, etc. One of its most important (and regulated by law) applications is providing a person's location in case of emergency.
  • Integration of context information from different sources: Unified Communication, Telco 2.0 and M2M
    337 Architecture, scalability and security of Open API solutions, Service delivery platforms - architecture and applications, Technical and social aspects of Open API and open data, Telecommunication operators API exposition in Telco 2.0 model Grzegorz Siewruk, Jarosław Legierski, Sebastian Grabowski, Marek Średniawa, pages 851 – 858. Show abstract Abstract. The paper presents an idea of a context-aware application, which collects context data from many different sources, stores them in a dedicated database and makes use of it to support flexible scenarios for end users. Using open APIs it integrates different types of context information provided by: Unified Communication system, APIs exposed by communication service providers and information from Machine to Machine (M2M) framework. Methods for recording and unifying different types of context data are proposed and their performance is compared with results for the most popular database structures. A context-aware contact list application for a mobile phone user is presented as an example illustrating the main ideas of the paper.
  • Mobile Payment System – Telco 2.0 application dedicated for payments
    363 Service delivery platforms - architecture and applications, Telecommunication operators API exposition in Telco 2.0 model Piotr Trusiewicz, Maciej Witan, Marcin Kuzia, pages 859 – 864. Show abstract Abstract. The following paper presents the Mobile Payment System, a prototype of an innovative method of paying for services using the mobile phone. The method is quite straightforward: a user wanting to access some online service supplies his/her cellular phone number to the web form and receives a token via a USSD (Unstructured Supplementary Service Data) message. Then, entering the token into the web form gives the user access to the desired content, and at this exact moment the charging is done. The due amount of money is simply added to the monthly bill in the case of postpaid phones, or subtracted from the available credit in the case of prepaid mobile phones. The functionality of sending USSD messages from the system to the subscriber's mobile phone was achieved by using the Telco 2.0 Web Services provided by Orange Labs.
  • Parking Reservation – application dedicated for car users based on telecommunications APIs
    215 Service delivery platforms - architecture and applications, Telecommunication operators API exposition in Telco 2.0 model Piotr Trusiewicz, Jarosław Legierski, pages 865 – 869. Show abstract Abstract. The main objective of this paper is to propose a simple, easy-to-implement and low-cost solution dedicated to parking lot reservation. The presented application uses Unstructured Supplementary Service Data (USSD) as a communication channel between the driver and the parking system. The USSD communication proposed in this paper is more efficient and more comfortable for the end user in comparison with the SMS used in many existing parking solutions. The system is integrated with the communication service provider's infrastructure using APIs for the telecommunication network exposed on the Internet by a Service Delivery Platform. The application can be launched on any phone and does not require Internet access on the mobile phone side.
  • Student Information Delivery Platform Using Telecommunications Open Middleware APIs
    426 Applications of SWS to E-business and E-government, Mobile applications, Telecommunication operators API exposition in Telco 2.0 model Piotr Wawrzyniak, Piotr Korbel, Anna Borowska-Terka, pages 871 – 874. Show abstract Abstract. The paper describes the architecture of a prototype networked student information delivery system. The main system functionalities include interactive access to lecture room timetables and group messaging. The system exploits modern mobile technologies to allow flexible usage scenarios. The use of open APIs of telecommunications service delivery platforms in combination with e-mail messaging provides diverse ways of delivering system information. The envisaged application scenario of the system is to provide ubiquitous access to up-to-date lecture room timetables and reliable ways of notifying the affected users about changes.
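Several of the papers above are algorithmic in nature. As one illustration, the deterministic crowding niching scheme mentioned in the first paper of this session (Aziz et al.) can be sketched as follows. This is an illustrative Python sketch only, with an invented bit-string detector encoding and a toy fitness function; it is not the authors' implementation:

```python
import random

def hamming(a, b):
    """Number of differing bits between two equal-length bit lists."""
    return sum(x != y for x, y in zip(a, b))

def fitness(detector, normal, anomalous):
    """Toy fitness: anomalous samples matched minus normal samples matched,
    where a detector matches a sample within Hamming distance 1."""
    matches = lambda s: hamming(detector, s) <= 1
    return sum(map(matches, anomalous)) - sum(map(matches, normal))

def deterministic_crowding(pop, normal, anomalous, generations=50, pm=0.05):
    """One GA run with deterministic crowding: each offspring competes with
    its most similar parent and replaces it only if at least as fit."""
    n = len(pop[0])
    for _ in range(generations):
        random.shuffle(pop)
        for i in range(0, len(pop) - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            cut = random.randrange(1, n)  # one-point crossover
            c1 = [b ^ (random.random() < pm) for b in p1[:cut] + p2[cut:]]
            c2 = [b ^ (random.random() < pm) for b in p2[:cut] + p1[cut:]]
            # pair each child with its most similar parent
            if hamming(c1, p1) + hamming(c2, p2) <= hamming(c1, p2) + hamming(c2, p1):
                contests = [(i, p1, c1), (i + 1, p2, c2)]
            else:
                contests = [(i, p1, c2), (i + 1, p2, c1)]
            for idx, parent, child in contests:
                if fitness(child, normal, anomalous) >= fitness(parent, normal, anomalous):
                    pop[idx] = child
    return pop
```

Because each offspring competes only with its most similar parent, several distinct niches (here, distinct detectors) can survive a single run, which is why the paper can generate a whole detector set at once.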
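The USSD token flow described in the Mobile Payment System paper (issue a one-time code out of band, charge only when it is entered back into the web form) can likewise be illustrated with a minimal sketch. The class and method names below are invented for illustration; a real deployment would push the token through the operator's USSD gateway rather than return it:

```python
import secrets
import time

class TokenPaymentGateway:
    """Illustrative sketch of a token-based payment flow: the service issues
    a short-lived one-time token, delivers it out of band (via USSD in the
    paper), and charges only when the token is redeemed."""

    def __init__(self, ttl_seconds=120):
        self.ttl = ttl_seconds
        self._pending = {}  # token -> (msisdn, amount, issued_at)

    def request_access(self, msisdn, amount):
        token = f"{secrets.randbelow(10**6):06d}"  # 6-digit one-time code
        self._pending[token] = (msisdn, amount, time.time())
        # a real system would now push the token over USSD instead
        return token

    def redeem(self, msisdn, token):
        entry = self._pending.pop(token, None)  # token is consumed either way
        if entry is None:
            return False
        owner, amount, issued = entry
        if owner != msisdn or time.time() - issued > self.ttl:
            return False
        self._charge(msisdn, amount)  # add to bill or subtract prepaid credit
        return True

    def _charge(self, msisdn, amount):
        self.last_charge = (msisdn, amount)
```

The one-time, short-lived token is what ties the anonymous web session to a billable phone number without requiring any app on the handset.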

2nd International Conference on Wireless Sensor Networks

  • Cloud Computing System Based on Wireless Sensor Network
    297 Applications of WSN, Data Allocation and Information Processing in Sensor Networks, Software, Applications and Programming of Sensor Network Wen-Yaw Chung, Pei-Shan Yu, Chao-Jen Huang, pages 877 – 880. Show abstract Abstract. This paper presents an integrated wireless sensor network (WSN) to monitor information from agriculture systems, such as temperature, humidity, pH value, etc. The purpose is to let clients obtain information about the agricultural sensor nodes more conveniently and quickly. The WSN collects the values of the various parameters sensed by the front-end sensors and delivers them to the host end. On the client side, the Internet can be used to request web services that store this big data in the distributed SQL databases of the proposed cloud system. In addition, this work presents the concept of cloud computing and services. The benefits of this system are that only basic computing hardware and reasonable storage capacities are needed, and that it is suitable for any smart device, allowing real-time farmland information to be monitored anywhere. Customers can make full use of the cloud service from any device able to access the Internet.
  • Approaches of Wireless Sensor Network Dependability Assessment
    233 Antonio Coronato, Alessandro Testa, pages 881 – 888. Show abstract Abstract. The extensive use of Wireless Sensor Networks (WSNs) in critical scenarios stresses the need to verify their dependability properties at design time, to prevent wrong design choices, and at runtime, in order to make a WSN more robust against failures that may occur during its operation. In the literature, several approaches have been proposed to evaluate the dependability of a WSN during its design and its operation. In this paper we present a survey of these techniques, reporting aspects and characteristics of several research studies. Moreover, by means of a comparison grid, we analyze the current state of the art of WSN dependability assessment approaches in order to identify the best performing ones and to discuss the ongoing challenges.
  • Analysis of the influence of radio beacon placement on the accuracy of indoor positioning system
    379 Applications of WSN, Performance, Simulation and Modeling of Sensor Network Krzysztof Piwowarczyk, Piotr Korbel, Tomasz Kacprzak, pages 889 – 894. Show abstract Abstract. This paper discusses factors influencing the accuracy of estimating the location of radio network terminals in an indoor environment. It introduces parameters that can be useful for describing the quality of localization of radio landmarks. The paper presents software for computer-aided placement of reference radio stations inside buildings and shows the results of exemplary simulations carried out using the proposed algorithms.
  • Development of Special Smartphone-Based Body Area Network: Energy Requirements
    263 Applications of WSN, Management, Energy and Control of Sensor Network Jana Púchyová, Michal Kochláň, Michal Hodoň, pages 895 – 900. Show abstract Abstract. In recent years, smart devices have become very popular among people of all ages around the world. Their usage in health applications is especially important. A special Body Area Network (BAN) for stress monitoring is currently being developed within the authors’ department. An Android-based smartphone is employed as the main control unit of the sensor network, built on a star architecture. Since the power consumption of the smartphone, as well as of each single sensor node, is one of the key limitations of the network, special attention has to be given to it. In this article, the energy requirements for data transmission within the network are analyzed in detail. For this purpose, a communication solution based on a 2.4 GHz proprietary RF transceiver is implemented.
  • SENTIOF: An FPGA Based High-Performance and Low-Power Wireless Embedded Platform
    78 Khurram Shahzad, Peng Cheng, Bengt Oelmann, pages 901 – 906. Show abstract Abstract. Traditional wireless sensor nodes are designed with low-power modules that offer limited computational performance and communication bandwidth and are therefore generally applicable to low-sample-rate intermittent monitoring applications. Nevertheless, high-sample-rate monitoring applications can be realized by designing sensor nodes that can perform high-throughput in-sensor processing while maintaining low-power characteristics. In this paper, a high-performance and low-power wireless hardware platform is presented. With its compact size and a modular structure enabling an integrated customized sensor layer, it can be used for a wide variety of applications. In addition, the flexibility provided through dynamically configurable interfaces and power management helps optimize performance and power consumption for different applications.
  • Wireless Indoor Positioning System for the Visually Impaired
    141 Applications of WSN Piotr Wawrzyniak, Piotr Korbel, pages 907 – 910. Show abstract Abstract. The paper presents a prototype radio network aiding the visually impaired to navigate in indoor areas. The main purpose of the system is to provide accurate and reliable location information as well as to enable access to location related context information. The nodes of the network operate in two modes providing basis for both rough and precise user position estimation. The data transmitted by the nodes are used to get access to additional services, e.g. to retrieve position related context information.
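Several contributions in this track (the beacon placement study and the indoor positioning system for the visually impaired) rely on estimating position from received signal strength. A common baseline for this, assumed here purely for illustration (the log-distance path-loss model with invented default parameters, not any particular paper's algorithm), is:

```python
def estimate_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.5):
    """Invert the log-distance path-loss model
    RSSI(d) = RSSI(1 m) - 10 * n * log10(d)
    to estimate the distance (in metres) to a beacon from its
    received signal strength. The reference level and exponent n
    are environment-dependent and must be calibrated."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

def weighted_centroid(beacons):
    """beacons: list of (x, y, rssi_dbm) readings. Nearer (stronger)
    beacons get a higher weight; a crude but common indoor baseline
    for rough position estimation."""
    weighted = [(x, y, 1.0 / estimate_distance(rssi)) for x, y, rssi in beacons]
    total = sum(w for _, _, w in weighted)
    return (sum(x * w for x, _, w in weighted) / total,
            sum(y * w for _, y, w in weighted) / total)
```

The sensitivity of such estimates to beacon geometry is exactly why the beacon placement paper studies where reference radio stations should be installed inside a building.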

4th International Workshop on Advances in Business ICT

  • A Hierarchical Approach for Configuring Business Processes
    311 Business Intelligence, Business Analytics, Information Systems in Enterprise Management, Information Technologies in Enterprise Management, Information Systems Mateusz Baran, Krzysztof Kluza, Grzegorz J. Nalepa, Antoni Ligęza, pages 915 – 921. Show abstract Abstract. Business Process models in the case of real life systems are often very complex. Hierarchization allows for managing model complexity by "hiding" process details into sub-levels. This helps to avoid inconsistencies and fosters reuse of similar parts of models. Configuration, in turn, gives the opportunity to keep different models in one configurable model. In the paper, we propose an approach for configuring Business Processes that relies on hierarchization for more expressive power and simplicity. Our goal is achieved by allowing arbitrary n-to-m relationships between tasks in the merged processes. The approach preserves similar abstraction level of subprocesses in a hierarchy and allows a user to grasp the high-level flow of the merged processes.
  • Simulation driven design of the German toll system – profiling simulation performance
    288 Advanced Technologies of Data Processing, Information Technologies in Enterprise Management, Information Systems Tommy Baumann, Bernd Pfitzinger, Thomas Jestädt, pages 923 – 926. Show abstract Abstract. Taking an existing large-scale simulation model of the German toll system we identify the typical workload by profiling the runtime behavior. Crucial performance hot spots are identified and related to the real-world application to analyze and evaluate the observed efficiency. In a benchmark approach we compare the observed performance to different simulation frameworks.
  • Moving Trend Based Filters Design in Frequency Domain
    444 Advanced Technologies of Data Processing, Business-oriented Time Series Data Mining, Analysis, and Processing Jan T. Duda, Tomasz Pełech-Pilichowski, pages 927 – 930. Show abstract Abstract. An original approach to the design of digital moving trend based filters (MTF), based on Bode plot analysis, is proposed, aimed at seasonal time series decomposition and prediction. A number of polynomials of different orders are considered for use in the MTF as the LS approximation formula. The Bode plots of the MTF are shown, and the best filter is selected. Results of a seasonal time series decomposition and prediction with the best MTF are presented and compared to the classical MTF calculations (involving the linear LS approximation).
  • Incorporating Text Analysis into Evolution of Social Groups in Blogosphere
    356 Business Applications of Social Networks, Business Data Mining and Knowledge Discovery, Business Intelligence, Business Analytics Bogdan Gliwa, Anna Zygmunt, Stanisław Podgórski, pages 931 – 938. Show abstract Abstract. Data reflecting social and business relations often has the form of a network of connections between entities (called a social network). In such a network, important and influential users can be identified, as well as groups of strongly connected users. Finding such groups and observing their evolution is becoming an increasingly important research problem. One of the significant problems is to develop a method incorporating not only information about the connections between entities but also information obtained from the texts written by the users. The method presented in this paper combines social network analysis and text mining in order to understand group evolution.
  • Towards Rule-oriented Business Process Model Generation
    420 Business Rules, Information Systems in Enterprise Management, Information Technologies in Enterprise Management, Information Systems Krzysztof Kluza, Grzegorz J. Nalepa, pages 939 – 946. Show abstract Abstract. Attribute-Relationship Diagrams (ARD) aim at capturing relations, especially dependency relation, between the specified attributes. This paper describes work-in-progress research concerning process and rules integration, which takes advantage of the ARD method and allows for generating executable models. The paper examines the possibility of generating the rule-oriented BPMN model and enriching process models with rules from the ARD diagram.
  • The Set of Time Structures for Economic Phenomena Description
    8 Maria Mach-Król, pages 947 – 949. Show abstract Abstract. The paper describes some possible time structures for describing and analyzing economic phenomena. The author claims that a simple linear time structure is not enough for this task and therefore proposes to use more complex structures.
  • Assessment of Business Intelligence Maturity in the Selected Organizations
    139 Information Systems in Enterprise Management Celina Olszak, pages 951 – 958. Show abstract Abstract. The main purpose of this paper is to assess the level of Business Intelligence (BI) maturity in organizations. The research questions I ask in this study are: (1) what possibilities do BI systems offer for different organizations, and (2) how can BI maturity in organizations be measured and evaluated? The study was based on: (1) a critical analysis of the literature, (2) an observation of different BI initiatives undertaken in various organizations, and (3) semi-structured interviews conducted in Polish organizations in 2012. The interviews, conducted in 20 Polish enterprises, were held with executives, senior members of staff, and ICT specialists. The remainder of the paper is organized as follows. Firstly, the idea of BI is described. Next, the issue of BI maturity models is discussed. Finally, Gartner's Maturity Model for Business Intelligence and Performance Management is used to assess the level of BI in the surveyed organizations.
  • Towards a Better Understanding of Context-Aware Applications
    198 Business Applications of Social Networks, Business Intelligence, Business Analytics, Business Rules, Information Systems in Enterprise Management, Information Technologies in Enterprise Management, Information Systems, Service Oriented Architectures (SOA) Emilian Pascalau, Grzegorz J. Nalepa, Krzysztof Kluza, pages 959 – 962. Show abstract Abstract. With the new technological advances and the strong move towards the Future Internet and the Internet as a Platform, a new environment is emerging. This environment is generative, social, strongly interactive and collaborative, so users play a fundamental role in it. Business applications are simplifying, webifying and getting more user-centric. In this environment, context and context-awareness play a fundamental role, as context gives meaning to and accurately describes the situation of a user. This paper introduces the basis for a new research methodology that aims to address and visualize the topic of context and context-awareness from a holistic point of view, by means of text mining and text clustering.
  • Rapid Application Prototyping for Functional Languages
    336 Martin Podloucký, pages 963 – 969. Show abstract Abstract. This work addresses the problem of automated graphical user interface generation for functional programs in relation to rapid application prototyping. First, an analysis of the current state of the field of automated GUI generation is performed. Based on the analysis, the concept of a functionally structured user interface (FSUI) is introduced. A meta-data system for code annotation is then specified for the Clojure programming language, and a transformation from this system to the FSUI data model is implemented. Finally, a graphical layer for displaying the actual interface is implemented in Clojure.
  • Assessment of the EPQ probability parameter for scientific articles publishing
    301 Information Technologies in Enterprise Management, Information Systems Rafał Rumin, Piotr Potiopa, pages 971 – 976. Show abstract Abstract. This work presents an analysis of the evaluation of articles submitted for publication in academic journals, based on additional parameters not resulting from the essential value of the research work. Currently, the majority of article verification algorithms are oriented towards selecting works that potentially have a stronger influence on the international position of the journal. For that purpose, editorial offices, and also reviewers, apply multi-criteria parametric evaluations, and the accepted parameters are often very subjective. The presented work attempts to identify the criterion functions used, i.e. the defining evaluation parameters. These parameters were divided into categories, and their preliminary verification based on a statistical analysis of articles already published in individual journals is proposed. Each parameter has an attributed weight function, which allows its impact on the total evaluation of the article to be defined, and also allows the formula to be adapted to any academic journal. The weight functions will be determined using neural networks or genetic algorithms, aiming at their individual adaptation to a particular journal.
  • Fuzzy Multi-attribute Evaluation of Investments
    414 Information Forensics and Security, Information Management, Risk Assessment and Bogdan Rębiasz, Bartłomiej Gaweł, Iwona Skalna, pages 977 – 980. Show abstract Abstract. This paper proposes a practical framework for modelling the project portfolio selection problem with fuzzy parameters as a fuzzy multi-attribute decision-making problem. A two-step evaluation model that combines the fuzzy AHP and fuzzy TOPSIS methods is used to solve this problem. The proposed approach is illustrated by an empirical study of a real case from the steel industry involving fifteen criteria and ten projects. The case shows the effectiveness and feasibility of the proposed evaluation procedure.
  • Increase in the Competitiveness of SMEs using Business Intelligence in the Czech-Polish border areas
    160 Information Technologies in Enterprise Management, Information Systems Milena Tvrdíková, pages 981 – 984. Show abstract Abstract. The paper discusses a system of tools that support knowledge-based management. These tools are referred to as Business Intelligence; Competitive Intelligence is also discussed. Both work with selected or modified data which become bearers of comprehensive information about company processes and the real-world impact on their progress. The basic principles of these applications are also defined. The paper presents the results of a questionnaire survey conducted among SMEs in the Czech-Polish border area. The subject of evaluation is the use of these tools, the use of cloud innovations of information systems, and their inclusion in future plans to update information systems. The results show considerable interest in the use of Business Intelligence applications. The questionnaires also confirm a lack of awareness among business owners and managers in the region with regard to cloud computing possibilities.
  • Implementation of the Big Data concept in organizations – possibilities, impediments and challenges
    161 Janusz Wielki, pages 985 – 989. Show abstract Abstract. This paper is devoted to the analysis of the Big Data phenomenon. It is composed of seven parts. In the first, the growing role of data and information and their rapid increase in the new socio-economic reality are discussed. Next, the notion of Big Data is defined and the main sources of data growth are characterized. In the following part of the paper the most significant possibilities linked with Big Data are presented and discussed. The next part is devoted to the characterization of tools, techniques and the most useful data in the context of Big Data initiatives. In the following part of the paper the success factors of Big Data initiatives are analyzed, followed by an analysis of the most important problems and challenges connected with Big Data. In the final part of the paper, the most significant conclusions and suggestions are offered.

Agent Day

  • Learning sensors usage patterns in mobile context-aware systems
    180 data-intensive computing, various application of multi-agent systems Szymon Bobek, Krzysztof Porzycki, Grzegorz J. Nalepa, pages 993 – 998. Show abstract Abstract. Context-aware mobile systems have gained remarkable popularity in recent years. Mobile devices are equipped with a variety of sensors and have become computationally powerful, which allows for real-time fusion and processing of the data they gather. However, most existing frameworks for context-aware systems are dedicated to static, centralized architectures, and those designed for mobile devices focus mainly on limited CPU and memory resources, which nowadays is no longer a major issue. Mobile platforms require the context modelling language and inference engine to be simple and lightweight, yet powerful enough to support not only simple context identification tasks but also more complex reasoning. These requirements, combined with the large number of sensors and the CPU power available on mobile devices, result in high energy consumption. The original contribution of this paper is a proposal of an intelligent middleware for mobile context-aware frameworks that is able to learn sensor usage habits and minimize the energy consumption of the system.
  • System Design and Implementation Decisions for ParaMoise Organizational Model
    436 scalability, extendability, resilience in complex systems, stochastic and structural modeling of complex systems Mateusz Guzek, Grégoire Danoy, Pascal Bouvry, pages 999 – 1005. Show abstract Abstract. ParaMoise is a novel organisational model that makes it possible to specify the organisation and reorganisation of parallel and concurrent systems. Workflows, locks and multiple organisation managers are the entities that differentiate this model from its antecedent, the Moise+ framework. All these entities must be efficiently designed and implemented to ensure the practical usability of the theoretically formulated model. The main challenge here is the distributed synchronisation of workflows and locks that maximises the performance of the system. This paper presents and analyses different workflow and lock management approaches that can be used to achieve this goal: from basic centralised or middleware-based solutions towards truly decentralised coordination mechanisms.
  • Using the Evaluation Nets Modeling Tool Concept as an Enhancement of the Petri Net Tool
    242 multi-agent computation and simulation, multi-agent management, scheduling, load-balancing, various application of multi-agent systems Michał Niedźwiecki, Krzysztof Rzecki, Krzysztof Cetnarowicz, pages 1007 – 1012. Show abstract Abstract. Petri net modeling has well-known algorithms, so it is easy to develop computer tools to build, edit, and analyse these nets. These tools are designed so that it is possible to add extensions that provide additional functionality, such as support for evaluation nets. Evaluation nets are not as popular as Petri nets, but it turns out that, in modeling some problems, they make the analysis very clear and intuitive. Unfortunately, there is a lack of mathematical tools and computer programs for them, which greatly limits their use. Fortunately, under certain assumptions, an evaluation net can be converted into a Petri net. This article presents an idea of how to extend an existing Petri net computer program to draw evaluation nets and convert them into Petri nets, in order to reuse existing tools for Petri net analysis. Evaluation nets are well suited for modeling negotiation protocols between two parties represented by servers or software agents. This article provides an example of such a protocol presented in three versions: a UML sequence diagram, a Petri net and an evaluation net.
  • Analyzing Meme Propagation in Multimemetic Algorithms: Initial Investigations
    412 nature-inspired, evolutionary and memetic computing Rafael Nogueras, Carlos Cotta, pages 1013 – 1019. Show abstract Abstract. Multimemetic algorithms (MMAs) are a subclass of memetic algorithms in which memes are explicitly attached to genotypes and evolve alongside them. We analyze the propagation of memes in MMAs with spatial structure. For this purpose we propose an idealized selecto-Lamarckian model that features only selection and local improvement, and study under which conditions good, high-potential memes can proliferate. We compare population models with panmictic and toroidal-grid topologies. We show that the increased takeover time induced by the latter is essential to improving the chances for good memes to express themselves in the population by improving their hosts, hence enhancing their survival rates.
  • Fair and truthful multiagent resource allocation for conference moderation
    290 Adam Połomski, pages 1021 – 1027. Show abstract Abstract. Multiuser voice conferencing platforms are more and more popular. Internet bandwidth is becoming widely accessible, which makes voice over IP an everyday tool. Being able to communicate with multiple people at the same time can be beneficial, but on the other hand it increases the need for coordination mechanisms. Determining a moderation scheme that is fair and efficient is not a trivial problem to solve. We define conference moderation as a multiagent resource allocation problem and introduce a process based on Vickrey auctions to solve it. The concept of a co-owned communication channel is the basis of our definition of fairness.
  • Verifying data integration agents with deduction-based models
    113 multi-agent computation and simulation, multi-agent management, scheduling, load-balancing, various application of multi-agent systems Radosław Klimek, Łukasz Faber, Marek Kisiel-Dorohinicki, pages 1029 – 1035. Show abstract Abstract. The paper shows how an agent-based system can be subjected to formal verification using a deductive approach. A particular system for gathering open source intelligence is considered, which is built on a framework for data integration. Techniques allowing for automatic extraction of logical specifications are described, with emphasis on pattern-based and rule-based approaches. An example illustrates how the proposed method works in a scenario with iterated agent tasks combining these two approaches.
  • Agent Based System for Assistance at Industrial Process Control with Experience Modeling
    226 multi-agent computation and simulation, various application of multi-agent systems Gabriel Rojek, pages 1037 – 1040. Show abstract Abstract. The problem of automatic or computer-aided control remains unsolved in many areas of real production processes. In such cases the only solution is to employ a human operator who uses his experience and knowledge to manually control the parameters of the process. Such an approach has many disadvantages related to the characteristics of human work, which is the main motivation for the research presented here. As a solution, a methodology is proposed that follows the decision processes of a human operator using his experience. In order to assess the capabilities of the proposed methodology, a test system is designed and implemented with the use of agent technology.
  • Agent-based Architecture and Situation-based Scenario for Consistency Management
    452 multi-agent computation and simulation, various application of multi-agent systems Pham Phuong Thao, Mourad Rabah, Pascal Estraillier, pages 1041 – 1046. Show abstract Abstract. During interactions, system actors may face misunderstandings when their local visions contain inconsistent data about the same fact. Misunderstandings in interaction are likely to reduce interaction performance (deviation or deadlock) or even affect overall system behavior. In this paper, we present an agent-based architecture and a scenario-structuring approach to deal with such misunderstandings and with consistency. It is based on the notion of a “situation”, an elementary building block dividing the interactions between actors into contextual scenes. This model supports not only scenario execution but consistency management as well. In order to organize and control the interactions, a “situation” contextualizes the interaction and activity of the system's actors, and includes prevention and tolerance mechanisms to deal with misunderstandings and their causes. We also present simulation experiments on an Online Distance Learning case study.
  • Agent-based Resource Management in Tsunami Modeling
    439 data-intensive computing, multi-agent management, scheduling, load-balancing, various application of multi-agent systems Alexander Vazhenin, Yutaka Watanobe, Kensaku Hayashi, Michał Drozdowicz, Maria Ganzha, Marcin Paprzycki, Katarzyna Wasielewska, Paweł Gepner, pages 1047 – 1052. Show abstract Abstract. The complexity and versatility of tsunami modeling requires designing an open modular software system with a high level of reusability and interoperability of its components, as well as flexible resource management. In this paper we investigate how to integrate the tsunami modeling software with an agent-based resource management infrastructure.

11th Conference on Advanced Information Technologies for Management

  • Advancements in Cloud Computing for Logistics
    001 Business Process Management and Management Systems (BPM and BPMS), Decision Support Systems and data mining, Knowledge-based and intelligent systems in management, Management Information Systems (MIS) Uwe Arnold, Jan Oberländer, Björn Schwarzbach, pages 1055 – 1062. Show abstract Abstract. An adequate integrated ICT infrastructure and services are a prerequisite for keeping pace with the rapid rise of complexity and service levels in logistics. Recent studies indicate a high attractiveness and impact perspective of cloud computing for logistics service providers within a few years as a way to cope with growing IT capacity demands. This paper gives a comprehensive overview of R&D related to cloud computing for logistics. Among these efforts, the EU project LOGICAL is presented in detail, since it combines different aspects and benefits of cloud computing for the logistics sector. A generic system of cloud computing use cases in logistics and the corresponding needs for a logistics cloud architecture are discussed and compared with the implementation status of the LOGICAL cloud. Special attention is given to the problem of incompatible data and service interfaces. Instead of following the single-window, single-document concept, a semi-automated on-demand interface creation service is presented as an intermediate alternative for the practising logistics sector.
  • Integrated Model of a Social Navigation System with Self-adaptive Feedback Control Mechanism
    369 Business Process Management and Management Systems (BPM and BPMS), Decision Support Systems and data mining, Knowledge-based and intelligent systems in management, Management Information Systems (MIS) Vangel V. Ajanovski, pages 1063 – 1070. Show abstract Abstract. This paper presents a model of a navigation system in a public information system that can be used to improve the structure and content of the information repository via self-organization capabilities based on social interaction. The primary goal of this model is to establish a generic and adaptive social-based self-structuring navigation system. To achieve this goal, the model integrates the concepts of social navigation, interaction and self-adaptivity in a feedback control loop. The model focuses on self-adaptivity and includes elements of social navigation in all parts of the system, which enables implementations based on this model to achieve social adaptability, based on the actions of users both individually and as a social environment, in every aspect of the functioning of the system. The introduced feedback control loop enables further autonomous improvements of the organization of the information.
  • Concept of Platform for Hybrid Composition, Grounding and Execution of Web Services
    190 Cloud computing, SOA, Web services Lev Belava, pages 1071 – 1077. Show abstract Abstract. This paper presents a concept of a software platform and a method of hybrid composition of web services and hybrid grounding of abstract composition plans. The paper also describes the architecture of the implemented platform and its modules.
  • Analysis of the importance of business process management depending on the organization structure and culture
    443 Business Process Management and Management Systems (BPM and BPMS), Concepts and methods of business informatics, IT projects & IT projects management, Strategies and methodologies of IT implementation Witold Chmielarz, Marek Zborowski, Aneta Biernikowicz, pages 1079 – 1086. Show abstract Abstract. The present survey mainly aims at analysing the determinants of the possibilities of streamlining processes in an organization. The early fragments of the study are devoted to a theoretical analysis of the determinants of process management and its connection with project management. Then the assumptions of the survey on the impact of organizational structure and culture on the possibilities of applying business process management are presented. The verification of the theoretical deliberations and survey assumptions is included in the last part of the article, which presents the initial results of the survey and the resulting conclusions.
  • Process-based evaluation and comparison of OTS software alternatives
    224 Maria Jesus Faundes, Hernan Astudillo, Bernhard Hitpass, pages 1087 – 1094. Show abstract Abstract. Many Off-The-Shelf Software (OTSS) assessment techniques have been proposed, most of them using criteria related to standard quality models. However, these techniques are not as useful for evaluating and comparing alternative OTSS as solutions to specific process-driven organizational changes. This article proposes PBEC-OTSS (Process-Based Evaluation and Comparison of OTSS), a technique for evaluating and comparing OTSS with regard to their impact on the organization, based on process models and using fuzzy decision-making systems. The technique was compared with an ad-hoc approach (systemized from the literature) in an experimental study with IT professionals, some new to BPM and some experts; the experts obtained similarly good results with either approach, but the novice professionals obtained better results with PBEC-OTSS than with the ad-hoc approach. These results suggest that organizations can improve their Business/IT alignment with this technique even if no process experts are available.
  • Multi-attribute Auctions and Negotiations with Verifiable and Not-verifiable Offers
    373 Concepts and methods of business informatics, Management Information Systems (MIS) Gregory (Grzegorz) E. Kersten, Tomasz Wachowicz, Margaret Kersten, pages 1095 – 1102. Show abstract Abstract. Comparative studies of auction and negotiation exchange mechanisms have typically compared the outcomes obtained from the two mechanisms. Their results are inconclusive. The question which this paper aims to address is the viability of outcome-based comparisons. Such comparisons assume that both mechanisms produce the same types of outcomes but that their values differ. An argument can be made that this is not necessarily the case. Based on several experiments with multi-attribute auctions and two formats of multi-bilateral negotiations, the paper argues that the two mechanisms produce some outcomes which are comparable and other outcomes which are qualitatively different. A surprising finding of our experiments is that the outcomes of the non-verifiable negotiations were more similar to the outcomes of the reverse auctions than to those of the verifiable negotiations, despite the fact that the latter employ rules taken from the auction mechanism.
  • Verification of ArchiMate process specifications based on deductive temporal reasoning
    183 Business Process Management and Management Systems (BPM and BPMS), IT projects & IT projects management, Management Information Systems (MIS), Strategies and methodologies of IT implementation Radosław Klimek, Piotr Szwed, pages 1103 – 1110. Show abstract Abstract. Formal verification of business models has recently become an intensively researched area. Applying formal methods in this field requires overcoming several problems. Firstly, business analysts and designers rarely have enough skills and motivation to manually build abstract, formal specifications; hence, there is a need for tools that automatically translate business models into a form suitable for formal verification. Moreover, the notations and languages used to describe enterprises usually have no clear semantics. Finally, the verification itself must be supported by an efficient tool. In this paper we investigate the application of formal, deduction-based techniques to the automated verification of behavioral descriptions embedded within ArchiMate models. We describe a set of rules that governs the translation of processes specified in the ArchiMate language into Linear Temporal Logic (LTL) formulas. The translation step is performed by software we developed as a plugin for the popular Archi modeler. Formal verification of business process properties is achieved with another tool, an LTL prover based on the semantic tableaux technique. The application of the method is discussed on a small, yet illustrative, example of a taxi service.
  • Design of Financial Knowledge in Dashboard for SME Managers
    384 Business Intelligence methods and tools, Business-oriented ontologies, topic maps, Knowledge-based and intelligent systems in management Jerzy Korczak, Helena Dudycz, Miroslaw Dyczkowski, pages 1111 – 1118. Show abstract Abstract. The article presents an approach to developing the economic and financial knowledge used in the Intelligent Dashboard for Managers. The content of the knowledge is focused on essential concepts related to the management of micro, small and medium enterprises. Knowledge-based functions, not previously available in commercial systems, increase the quality, effectiveness, and efficiency of the decision-making process. The Intelligent Dashboard for Managers contains six ontologies describing the areas of Cash Flow at Risk, Comprehensive Risk Measurement, Early Warning Models, Credit Scoring, Financial Market, and General Financial Knowledge. The ontology design process and examples of topic maps and their usage in financial data analysis are presented here.
  • Risk avoiding strategy in multi-agent trading system
    345 Agent-based systems, Cloud computing, SOA, Web services, Concepts and methods of business informatics, Decision Support Systems and data mining Jerzy Korczak, Marcin Hernes, Maciej Bac, pages 1119 – 1126. Show abstract Abstract. The authors of this paper present an approach to trading strategy design for a multi-agent system which supports investment decisions on the stock market. The individual components of the system, its functionalities, and the mechanism for assessing the individual agents are briefly described. The main component, the supervisor agent, uses a consensus method as its strategy to reduce the level of investment risk. This method coordinates the work of the agents and, on the basis of the decisions they provide, presents trading advice to the investor. The strategy was tested on FOREX quotes, namely on the EUR/USD pair. The results of the research are described, and directions for the further development of the platform are provided in the conclusion.
  • Optimising Web-Based Information Retrieval Methods for Horizon Scanning Using Relevance Feedback
    136 Marco A. Palomino, Tim Taylor, Geoff McBride, Hugh Mortimer, Richard Owen, Michael Depledge, pages 1127 – 1134. Show abstract Abstract. Horizon scanning is being increasingly regarded as an instrument to support strategic decision making. It requires the systematic examination of information to identify potential threats, emerging issues and opportunities to improve resilience and decrease risk exposure. Horizon scanning can use the Web to augment the acquisition of information, though this involves a search for novel and emerging issues without knowing them beforehand. To optimise such a search, we propose the use of relevance feedback, which involves human interaction in the retrieval process so as to improve results. As a proof-of-concept demonstration, we have carried out a horizon scanning exercise suggested by RAL Space. Our implementation of relevance feedback was able to maintain the retrieval of relevant documents constant over the length of the experiment, without any decrements, which represents an improvement over previous studies where relevance feedback was not considered.
  • Software Implementation of Common Criteria Related Design Patterns
    210 Dariusz Rogowski, pages 1135 – 1140. Show abstract Abstract. Writing evidence documents for evaluation and certification processes according to the Common Criteria security standard is a very difficult, time-consuming and complex task. Nowadays there are only a few limited solutions based on templates and software tools which can efficiently support developers in preparing evaluation deliverables. This paper describes the results of an R&D project whose aim was to work out a computer-aided tool with built-in design patterns. Firstly, design patterns covering all security assurance requirements were prepared in a paper version. Secondly, they were verified and validated by developers in order to make amendments and improvements. The conclusions were used as the source of functional requirements for a computer-aided tool. As a result, a complete computer system was designed which implements the design patterns, a knowledge base, the evaluation methodology, and additional external supporting software. That solution facilitates and speeds up the development of the evidence documentation.
  • IT Security Threats in Cloud Computing Sourcing Model
    192 Cloud computing, SOA, Web services, IT governance, efficiency and effectiveness, IT projects & IT projects management, Strategies and methodologies of IT implementation Artur Rot, Małgorzata Sobińska, pages 1141 – 1144. Show abstract Abstract. New information technologies are developing at an amazing speed, significantly affecting the functioning of organizations. Due to the development of new technologies, especially mobile ones, borders in the functioning of modern organizations diminish and models of running business change. Almost all organizations are involved in some way in sourcing activities, and each of them develops a sourcing relationship that suits its particular needs. This article discusses the different kinds of outsourcing models applied in contemporary management, with particular emphasis on cloud computing. The main aim of the article is to present the most important risks related to the introduction of management models based on the most recent IT technologies, e.g. cloud computing, and to emphasize the role of appropriate IT security management in times of globalization and organization virtualization.
  • Modeling the Bullwhip Effect in a Multi-Stage Multi-Tier Retail Network by Generalized Stochastic Petri Nets
    419 Enterprise information systems (ERP, CRM, SCM, etc.), Knowledge-based and intelligent systems in management, Strategies and methodologies of IT implementation Bidyut Sarkar, Agostino Cortesi, Nabendu Chaki, pages 1145 – 1152. Show abstract Abstract. The bullwhip effect (BWE) refers to the accumulation of stock flowing up and down a supply chain. It reduces the operating efficiency of the chain and blocks operating resources. Some common causes of the BWE are demand order variations, long lead times, competence defects between supply chain links, lack of communication among links in the chain, etc. There have been efforts to overcome these issues. However, very little work has been reported based on formal representation and analysis of resource flow in the supply chain system. In this work, a novel framework based on a Generalized Stochastic Petri net (GSPN) model is proposed to handle this issue in a distributed scenario. The analysis of the stochastic nets allows the bottlenecks in the supply chain echelons to be identified, along with customer relationship management (CRM). This is used to rebuild the infrastructure with the end objective of reducing the BWE.
  • The postulates of consensus determining in financial decision support systems
    217 Jadwiga Sobieska-Karpińska, Marcin Hernes, pages 1153 – 1156. Show abstract Abstract. This article addresses the problem of defining consensus-determining postulates in financial decision support systems. The consensus-determining methods and functions are characterized in the first part. Next, the general postulates for consensus estimation and their characteristics are presented. The final part of the article suggests new postulates pertaining to financial decisions and the possibility of their use in practical solutions. The application of these postulates can, as a consequence, make the process of making financial decisions more flexible and significantly reduce the risk involved in financial decisions.
  • The DDMKCC Decision Support Architecture in the Light of Case Studies
    438 Business Intelligence methods and tools, Decision Support Systems and data mining, Knowledge-based and intelligent systems in management, Management Information Systems (MIS) Stanisław Stanek, Jolanta Wartini Twardowska, Zbigniew Twardowski, pages 1157 – 1164. Show abstract Abstract. What makes the development of decision support systems (DSS) particularly challenging is the change dynamics of the design space, the instability of initial specifications, and the lack of an adequate model of the decision making process. Facing these, one can appreciate a methodology that can drive the designer’s creative effort within a particular decision context. The paper aims to outline the origin and the evolution of research on the DSS architecture commenced by Sprague and Carlson and carried on under the auspices of the International Federation for Information Processing (IFIP) and the International Society for Decision Support Systems (ISDSS). In particular, the paper presents insights, findings, recommendations and conclusions derived from case studies conducted in domestic medium-sized and large enterprises.
  • The Structure of Agility from Different Perspectives
    227 Business-oriented ontologies, topic maps, Concepts and methods of business informatics, IT governance, efficiency and effectiveness Roy Wendler, pages 1165 – 1172. Show abstract Abstract. Agility as a term is widely known today. However, a common understanding of what agility means and what it consists of is missing. Many frameworks have been developed to date, but they are very heterogeneous in content and structure. This paper approaches that issue by conducting a systematic comparison of 28 available agility frameworks from the domains of agile manufacturing, agile software development, agile organization, and agile workforce. Altogether, 33 concepts related to agility were identified. The results of the comparison show that even within the examined domains a lack of consensus is obvious. In addition, the utilized concepts are ambiguous and overlapping, so the interdependencies between the identified concepts were analyzed in detail. This revealed five recurring “clusters” that each combine several concepts with similar content; yet despite the number of available frameworks, none of them reflects these clusters directly. Hence, the study shows that the factors behind the construct of agility are not yet fully uncovered.
  • Measuring the information society in Poland – dilemmas and a quantified image
    204 IT governance, efficiency and effectiveness, Strategies and methodologies of IT implementation Ewa Ziemba, Rafał Żelazny, pages 1173 – 1180. Show abstract Abstract. This paper focuses on the measurement of the information society in Poland. The aim of the paper is twofold. The first objective is to present a coherent picture of measurement methods for the information society. The second is to show the findings of measuring the information society in Poland. Firstly, the paper presents the available methods of information society measurement and a core set of internationally agreed information society indicators. Secondly, the measurement of the information society in Poland is performed with the application of two methods: measuring the influence of ICT on GDP and measuring the ICT Development Index. Finally, a discussion is undertaken in order to establish a framework for the development of quantitative measurement methods for the information society in Poland.
  • The outcomes of the research in areas of application and impact of software agents societies to organizations so far. Examples of implementation in Polish companies.
    168 Mariusz Żytniewski, Radosław Kowal, Andrzej Sołtysik, pages 1181 – 1187. Show abstract Abstract. The development of information management systems stimulates the search for new forms of supporting the business processes which take place in an organization. One solution that can be applied here is software agents, which can support the activities of employees and customers by promoting information and knowledge about the organization. Such solutions have been commercially available for several years in the form of interface agents, but there is insufficient research on their modeling, applications and impact on the organization and its environment. The purpose of this paper is to present theoretical research in this area regarding companies providing such solutions in Poland.

2nd Workshop on Information Technologies for Logistics

  • Product Swapping and Transfer Sales Between Suppliers in a Balanced Network
    271 Logistics process modeling, including influence of warehouse automatic, Optimization of logistics processes Ikbal Ece Dizbay, Omer Ozturkoglu, pages 1191 – 1194. Show abstract Abstract. In this paper we present a preliminary, deterministic mathematical model of a cooperative supply chain network of suppliers and customers. We consider horizontal cooperation among suppliers such that they can swap their orders to reduce their transportation costs and purchase products from each other to reduce their shortage costs. Hence, the objective is to examine the potential swap and horizontal purchasing operations between suppliers under perfect information sharing. Assuming a balanced single-period network, in which the total capacity of the suppliers is greater than or equal to the total demand of the customers, we conduct an empirical analysis for six suppliers and eight customers. The analysis suggests that many suppliers benefit from order swapping and lateral purchasing.
  • Rule-based Approach For Supplier Evaluation
    156 Andrzej Macioł, Stanisław Jędrusik, Bogdan Rębiasz, pages 1195 – 1202. Show abstract Abstract. This paper presents a concept of using rule-based reasoning systems for the evaluation and classification of suppliers. The problem of supplier selection is widely discussed in the literature. Most authors apply multi-criteria evaluation methods, mainly the Analytic Hierarchy Process (AHP) algorithm and related ones, to solve it. In this paper it is demonstrated that a suitably expressive rule management system can be used as an effective tool for supplier evaluation. In the presented work we applied the Rebit system, developed at the AGH University of Science and Technology. An example of the evaluation of a supplier of primary charging materials for a metal processing enterprise is presented. It is shown how individual evaluation criteria are grouped into sets of independent rules and how tools can be used to enhance knowledge acquisition.
  • Applying Big Data and Linked Data Concepts in Supply Chains Management
    269 Innovations in information systems supporting logistics and its management Silva Robak, Bogdan Franczyk, Marcin Robak, pages 1203 – 1209. Show abstract Abstract. One of the contemporary problems, and at the same time a big opportunity, in business networks of supply chains is the vast amount of data arising there. The data may be utilized by decision support systems in logistics; nevertheless, there is often an information integration problem: information interchange across independently designed data systems is hampered by incompatibilities. Networked supply chains will need appropriate IT architectures to support the cooperating business units in utilizing structured and unstructured big data, as well as mechanisms to integrate data in heterogeneous supply chains. In this paper we analyze the capabilities of big data technology architectures combined with cloud computing and Linked Data for business process management in supply chains, in order to cope with unstructured near-time data and the data silo problem. We present our approach on a 4PL (Fourth-party Logistics) integrator business process example.
  • A hybrid approach to supply chain modeling and optimization
    90 Artificial intelligence systems and decision support systems in logistics, Optimization of logistics processes Paweł Sitek, Jarosław Wikarek, pages 1211 – 1218. Show abstract Abstract. The paper presents the concept, and an outline of the implementation, of a hybrid approach to supply chain modeling and optimization. In this approach, the integration of two environments, mathematical programming (MP) and logic programming (LP), is proposed; in particular, integer programming (IP) is connected with constraint logic programming (CLP). The idea of the proposed approach is to use the strengths of each environment for modeling and optimizing supply chain issues, exploiting the different ways in which each environment treats and processes optimization constraints. This is particularly important for models with many constraints summing discrete decision variables and an objective function of a similar nature. In order to verify the proposed approach, optimization models and their implementations in a traditional mathematical programming environment and in the hybrid environment are presented.
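The two-phase idea behind such hybrid approaches, CLP-style constraint propagation to prune variable domains followed by an IP-style optimal search over the reduced space, can be sketched in a few lines. This is a toy illustration with invented data, not the authors' implementation:

```python
# Toy illustration of a hybrid CLP + IP style solver (invented example).
# Phase 1 (CLP-like): propagate constraints to shrink variable domains.
# Phase 2 (IP-like): search the reduced domains for the cost optimum.

from itertools import product

def propagate(domains, total_demand):
    # Bound propagation: each supplier must ship at least the demand
    # that the other suppliers cannot cover with their full capacity.
    changed = True
    while changed:
        changed = False
        for i, dom in enumerate(domains):
            others_max = sum(max(d) for j, d in enumerate(domains) if j != i)
            lo = max(min(dom), total_demand - others_max)
            new_dom = [v for v in dom if v >= lo]
            if new_dom != dom:
                domains[i] = new_dom
                changed = True
    return domains

def solve(domains, total_demand, unit_costs):
    domains = propagate([list(d) for d in domains], total_demand)
    best = None
    for assignment in product(*domains):  # search the reduced space only
        if sum(assignment) == total_demand:
            cost = sum(q * c for q, c in zip(assignment, unit_costs))
            if best is None or cost < best[0]:
                best = (cost, assignment)
    return best

# Two suppliers with capacities 0..5 and 0..3, total demand 6, unit costs 2 and 1.
print(solve([range(6), range(4)], 6, [2, 1]))  # -> (9, (3, 3))
```

Propagation first tightens each supplier's lower bound from the others' capacities, so the subsequent exhaustive search enumerates far fewer assignments than the raw domains would allow; this division of labor is the essence of the hybrid scheme.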

19th Conference on Knowledge Acquisition and Management

  • Inconsistency Handling in Collaborative Knowledge Management
    423 Knowledge engineering and software engineering, Knowledge representation models Weronika T. Adrian, Antoni Ligęza, Grzegorz J. Nalepa, pages 1221 – 1226. Show abstract Abstract. One of the challenges of knowledge management is handling inconsistency. Traditionally, it was often perceived as an indication of invalid data or behavior and as such was to be avoided or eliminated. However, there are also numerous situations where inconsistency is a natural phenomenon or carries useful information. In order to decide how to manage inconsistent knowledge, it is thus important to recognize its origin, its aspects, and its influence on the behavior of the system. In this paper, we analyze a case of collaborative knowledge management with hybrid knowledge representation. This serves as a starting point for a discussion of various types of inconsistency and approaches to handling them. We analyze the sources, interpretation, and possible treatments of the identified types of inconsistency, and discuss practical use cases to illustrate selected approaches.
  • Internet as the Source for Acquiring the Medical Information
    348 Methods and tools for knowledge acquisition Magdalena Czerwinska, pages 1227 – 1234. Show abstract Abstract. The purpose of the present paper is to discuss the results of research conducted in 2012 to determine the role of the Internet as a source of medical information in a group of students. The obtained results confirm the working hypothesis about the significant role of the Internet as a source of medical information, and that searching for medical information on the Internet is the principal manifestation of the use of ICT solutions in health protection (as one of the elements of e-health). It was examined what kind of information patients search for, which Internet sources they use, why they search for medical information on the Internet, and what barriers they encounter. Details are included in this paper.
  • Corporate Amnesia in the Micro Business Environment
    260 Stephen J. Hall, Clifford De Raffaele, pages 1235 – 1239. Show abstract Abstract. Corporate amnesia is a phenomenon that has persistently threatened the livelihood of business organizations and their success in commercial activity. Several substantial studies of this phenomenon have been undertaken, focused primarily on large corporations and small to medium sized organizations. This vulnerability is, however, ever more present and significant within the smallest of businesses. In the micro enterprise, the impact of corporate amnesia is felt when even a single member of staff is absent for any lengthy period of time or vacates their post altogether. Although 80% of the workforce in major economies is directly engaged in micro enterprise business, minimal research has addressed this commercial model, even though the competitive benefits that can potentially be realized by addressing corporate amnesia are significant. To this end, this paper identifies the main causes of corporate amnesia within the micro business environment and proposes a suitable framework for the enterprise to effectively facilitate the adoption of Knowledge Management and realize the associated competitive benefits.
  • Knowledge conflicts in Business Intelligence systems
    283 Business Intelligence environment for supporting knowledge management, Knowledge representation models Marcin Hernes, Kamal Matouk, pages 1241 – 1246. Show abstract Abstract. This paper addresses the problem of knowledge conflicts appearing in Business Intelligence systems. The first part presents the structure of this class of systems in the context of knowledge creation. Next, a formal definition of the knowledge structure of Business Intelligence, necessary for comparing such knowledge, is elaborated. The characteristics, sources, and examples of knowledge conflicts are presented in the final part of the article. Detecting and resolving this type of conflict is necessary because it allows the user to receive correct reports from the system as the results of analyses; on the basis of these reports the user can take decisions that lead to satisfactory benefits.
  • One approach to the classification of business knowledge diagrams: practical view
    40 Methods and tools for knowledge acquisition Dmitry Kudryavtsev, Tatiana Gavrilova, Irina Leshcheva, pages 1247 – 1253. Show abstract Abstract. Diagrams are an effective and popular tool for visual knowledge structuring. Managers also often use them to acquire and transfer business knowledge. There are many currently available diagrams and visual modeling languages for managerial needs; unfortunately, the choice between them is frequently error-prone and inconsistent. This situation raises the following questions. Which diagrams or visual modeling languages are the most suitable for a specific type of business content? Which domain-specific diagrams are the most suitable for visualizing particular elements of an organizational ontology? In order to provide the answers, the paper suggests a lightweight specification of diagram and knowledge content types, based on competency questions and ontology design patterns. The proposed approach provides a classification of qualitative business diagrams.
  • Knowledge Management as Foundation of Smart University
    352 Distance learning and knowledge sharing, Knowledge management and e-government, Managerial knowledge evolution Katarzyna Marciniak, Mieczysław Owoc, pages 1255 – 1260. Show abstract Abstract. Functioning in the era of knowledge forces organizations to manage this valuable resource carefully. Very frequently the activities of organizations depend on the application of knowledge; sometimes it even means "to be or not to be" for the enterprise. Nowadays, to fulfill their business goals it becomes fundamental for institutions to use intelligent systems supporting comprehensive management of the organization. Such support increases the efficiency and effectiveness of the running business. As we live in an age of international integration, in which the world economy is tending towards a knowledge-based economy (KBE), universities are also forced to change the way they function. It is important for modern universities to be not only education centers but, above all, successfully prospering knowledge-based organizations. Such an approach provides higher competitiveness for the institution and makes its functioning more useful for the economy of the region. Implementing a comprehensive and intelligent IT solution within a university, and providing educational services personalized to the needs of the market, will allow universities to become the type of institution called “smart”. The aim of the paper is to explain why university centers should evolve into knowledge-based institutions. The paper is organized as follows. After a short introduction concerning the research context, the concepts of Knowledge Management and Smart Universities are presented. In the main section, real examples of Knowledge Management System implementations and of Smart Universities are investigated in order to identify and describe the roles of Knowledge Management Systems in this area. This allows conclusions to be formulated on the intersection of the two investigated approaches.
  • Scalable Web Monitoring System
    248 Methods and tools for knowledge acquisition Andrzej Opaliński, Wojciech Turek, Krzysztof Cetnarowicz, pages 1261 – 1267. Show abstract Abstract. Publicly available Web search engines suffer from several limitations, which significantly reduce their usability in particular cases. The most important limitations are out-of-date information, a very simple query language, and a limited number of results. In many cases, users of the Internet are interested in finding new information which appears on a particular Web portal. In this paper, a system for monitoring Web sites is presented. The system can continuously analyze the content of specified Web pages using advanced text processing algorithms. It actively notifies the user when the required information is found in newly added content. It can be deployed on a single PC as well as on a cluster of computers, providing good scalability. The paper presents the abstract architecture of the system, details of the implementation, and the results of real-life experiments.
  • Business Intelligence as a service in a cloud environment
    299 Business Intelligence environment for supporting knowledge management Maciej Pondel, pages 1269 – 1271. Show abstract Abstract. Business Intelligence should be described as a way of managing a company rather than as a set of functionalities in computer software. Acquiring real profit requires enterprise management to understand the value of data and the way data describe business processes. By being aware of the business and measuring its performance, we are able to improve processes and make the whole business more effective. Achieving business improvement requires an efficient Business Intelligence system as a combination of software, hardware, communication infrastructure, and services for data preparation, integration, and delivery to the system. In this paper the author considers whether a service-oriented approach and cloud computing can make BI implementation more efficient.
  • Knowledge Acquisition for New Product Development with the Use of an ERP Database
    315 Business Intelligence environment for supporting knowledge management, Data mining and knowledge discovery from databases and data warehouses, Knowledge engineering and software engineering, Management of enterprise knowledge versus personal knowledge, Methods and tools for knowledge acquisition Marcin Relich, pages 1273 – 1278. Show abstract Abstract. Nowadays, a considerable number of enterprises develop new products using an Enterprise Resource Planning (ERP) system. One of the modules of a typical ERP system concerns project management. The functionalities of this module include defining resources, company calendars, the sequence of project tasks, task durations, etc., in order to obtain a project schedule. These parameters can be defined by employees according to their knowledge, or they can be derived from data on previously completed projects. The paper investigates using an ERP database to identify critical factors, i.e. variables that significantly influence new product development. Project duration and cost are estimated by a fuzzy neural system that uses data on completed projects stored in an ERP system.
  • Preliminaries for Dynamic Competence Management System building
    213 Przemysław Różewski, Bartłomiej Małachowski, Jarosław Jankowski, Marcin Prys, Piotr Dańczura, pages 1279 – 1285. Show abstract Abstract. Competence management systems are an important addition to knowledge management systems. Competencies can be processed during the identification, assessment, and acquisition processes because a certain set of tools exists for testing competencies and estimating their levels. In this paper, we focus on the analysis of the concept of a Dynamic Competence Management System. The system takes into account competence changes caused by the passage of time and by the competence diffusion process in a project group.
  • Outsourcing of knowledge in change and renewal processes
    44 Małgorzata Sobińska, Jakub Mierzyński, pages 1287 – 1291. Show abstract Abstract. This paper is an attempt to present the concept of organizational transformation with the use of strategic renewal theory oriented towards organizational learning. It is indicated that enterprise renewal processes constitute the basis for organizational changes enabling evolutionary development towards implementing enterprise learning mechanisms. A new aspect of this presentation is the discussion of the benefits that the use of external resources (outsourcing tools) might bring to the organization in enhancing renewal processes.
  • Student Response to Educational Games – An Empirical Study
    55 Methods and tools for knowledge acquisition Urszula Świerczyńska-Kaczor, Jacek Wachowicz, pages 1293 – 1299. Show abstract Abstract. In this article we explore students’ experience with digital educational games. We analyze and discuss the factors which determine a student’s perception of the educational benefits of game-based learning. The research is structured by the main question: how are variables such as player satisfaction, game features (e.g. the perceived quality of the educational content, the design), and enhancement of the player’s educational process interconnected? The study shows that games as an educational tool are assessed very favorably by undergraduate students of business and economics. Moreover, game features are correlated with educational benefits and player satisfaction, and player satisfaction is in turn linked with enhanced learning.

Techniques and Applications for Mobile Commerce

  • Social Network Framework for Deaf and Blind People based on Cloud Computing
    197 Mahmoud El-Gayyar, Hany F. ElYamany, Tarek Gaber, Aboul Ella Hassanien, pages 1301 – 1307. Show abstract Abstract. Most governments and civil society organizations work hard to encourage disabled people, especially blind and deaf persons, to join the normal community and practice regular daily life activities. Indeed, information technology, with its modern methodologies such as mobile and cloud computing, has an impressive role in enhancing intercommunication between people with different disabilities and other persons on the one side, and among disabled people who share the same impairment on the other. However, the systems proposed so far for the Arabic region are quite limited in number, and no system has yet been proposed for connecting blind and deaf people in a direct conversation in the Arabic region. In this paper, we propose a comprehensive framework built upon three main modern technologies, mobile devices, cloud resources, and social networks, to provide seamless communication between blind and deaf people, especially those living in Arabic countries. Moreover, it is designed to facilitate communication with other people in various directions by using recent methodologies such as time-of-flight cameras and social networks. The main modules and components of the suggested framework and its possible scenarios are fully analyzed and described.
  • Tracking the node path in wireless ad-hoc network
    212 Integration of mobile solutions in urban infrastructure, Languages and methods for mobile based information systems, Measurement, control, and evaluation of urban infrastructure by mobile solutions Artur Sierszeń, Łukasz Sturgulewski, Agnieszka Kotowicz, pages 1309 – 1313. Show abstract Abstract. This article provides an insight into the topic of ad-hoc routing protocols, namely proactive and reactive protocols. It depicts the general concept of how these protocols find a path between two nodes in a network, and it presents an evaluation of methods for tracking the node path in a wireless ad-hoc network by investigating the available mobile routing protocols. The main focus was put on the observation of the throughput and the average end-to-end delay in a network, using the OMNeT++ environment for the simulation. Three protocols were chosen for the final testing: Ad-hoc On-demand Distance Vector (AODV), Optimized Link State Routing (OLSR), and Dynamic Source Routing (DSR).
  • User Positioning System for Mobile Devices
    211 Concepts and methods for evaluating problems and the usefulness of mobile techno, Integration of mobile solutions in urban infrastructure, Languages and methods for mobile based information systems, Measurement, control, and evaluation of urban infrastructure by mobile solutions Artur Sierszeń, Łukasz Sturgulewski, Karol Ciążyński, pages 1315 – 1318. Show abstract Abstract. In recent years, the Global Positioning System (GPS) has become the standard for location and navigation for a huge number of people all over the world. This system is unquestionably one of the most significant developments of the twentieth century. GPS is employed in a great variety of applications, from car navigation and cellular phone emergency positioning to aeronautic positioning. Despite the fact that it plays an essential role in today’s world, GPS has some limitations. The main disadvantage is its inability to operate inside buildings because of the loss of signal from the satellites. During the last decade, interest in location-based services has significantly increased. This is related to the existence of ubiquitous computers and the context awareness of mobile devices. Information about position plays a great role in the fields of security, logistics, and convenience nowadays. Thus, it is necessary to fill the gap at the point where the Global Positioning System does not perform satisfactorily.
  • Development of a Mobile Application for People with Panic Disorder as augmentation for an Internet-based Intervention
    272 Enabling technologies for ubiquitous systems Stefan Kleine Stegemann, Lara Ebenfeld, Dirk Lehr, Matthias Berking, Burkhardt Funk, pages 1319 – 1325. Show abstract Abstract. Smartphone technology has recently gained attention in the field of E-Mental Health research and mobile applications for measuring health-related aspects as well as mobile mental health interventions have emerged. However, little work has been done on leveraging mobile technology in combination with internet-based interventions. We argue, that mobile applications can not only enrich mental health treatments but also foster the commercial success of E-Mental Health applications. To this end, we have developed GET.ON PAPP, a mobile application for panic disorder that integrates an internet-based treatment into daily life. In this work, we present the development and structure of GET.ON PAPP and a perspective for its evaluation.
  • Vertoid: Exploring the Persuasive Potential of Location-aware Mobile Cues
    257 Analysis and Design of mobile web-service based applications, Integration of mobile solutions in urban infrastructure, Pedagogical and social issues in mobile learning Paweł Woźniak, Andrzej Romanowski, pages 1327 – 1330. Show abstract Abstract. This paper presents the design, implementation, and user study of Vertoid, a mobile system for providing context-aware cues that help users limit domestic greenhouse-gas emissions. We have designed an Android-based mobile application that provides users with tips on simple eco-friendly actions in relevant locations. We then conducted a medium-term field study to evaluate the system. Our study shows that while context-aware cues have the potential to be a useful way to deliver customised content, they may also introduce unnecessary distractions. Based on the results of our study, we discuss how location awareness can be used to support persuasive systems and outline several design considerations for providing context-aware cues.

4th International Workshop Automating Test Case Design, Selection and Evaluation

  • Requirements on automatically generated random test cases
    206 Thomas Arts, Alex Gerdes, Magnus Kronqvist, pages 1335 – 1342. Show abstract Abstract. Developing, for example, a simple booking web service with modern tools can be a matter of a few weeks work. Testing such a system should not need to take more time than that. Automatically generating tests from specified properties of the system using the tool QuickCheck provides professional developers with the required test efficiency. But how good is the quality of these automatically generated tests? Do they cover the cases that one would have written in manual tests? The quality depends on the specified properties and data generators and so far there has not been an objective way to evaluate the quality of these QuickCheck generators. In this paper we present a method to assess the quality of QuickCheck test data generators by formulating requirements on them. Using this method we can give feedback to developers of such data generators in an early stage. The method supports developers in improving data generators, which may lead to an increase of the effectiveness in testing while maintaining the same efficiency.
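The idea of placing requirements on random test-data generators can be illustrated independently of QuickCheck. The sketch below is plain Python with an invented generator and invented coverage requirements, not the Erlang generators discussed in the paper: it checks a sorting property on generated data and then reports whether the generator ever produced the "interesting" inputs a manual tester would write by hand.

```python
import random

def gen_list(rng, max_len=10, max_val=100):
    # A hypothetical random generator for integer lists.
    return [rng.randrange(max_val) for _ in range(rng.randrange(max_len + 1))]

def coverage(samples):
    # Requirements on the generator: did it ever hit the cases
    # a manual tester would consider essential?
    return {
        "empty": any(len(s) == 0 for s in samples),
        "has_duplicates": any(len(s) != len(set(s)) for s in samples),
        "len_ge_5": any(len(s) >= 5 for s in samples),
    }

rng = random.Random(42)                      # fixed seed for reproducibility
samples = [gen_list(rng) for _ in range(200)]

# Property under test: sorting is idempotent and preserves length.
for s in samples:
    assert sorted(sorted(s)) == sorted(s)
    assert len(sorted(s)) == len(s)

print(coverage(samples))
```

A generator that never produces, say, the empty list fails the `empty` requirement, and the property suite would silently skip that corner case; this is the kind of feedback on generator quality the paper formalizes.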
  • A method for selecting environments for software compatibility testing
    125 Łukasz Pobereżnik, pages 1343 – 1348. Show abstract Abstract. Modern software is developed to work with multiple software and hardware architectures, to cooperate with various peer components, and to be installable in many different configurations. In order to test it, all possible working environments need to be created. This requires software and hardware resources like servers, networks, and software licenses, and most importantly the man-hours of qualified engineers who will have to configure and maintain them. Because resources are usually limited, we have to choose the set of configurations with the highest impact on the quality of the software under test. In this paper we present a method of measuring the effectiveness of a given software environment at discovering defects in software by introducing an environment sensitivity measure. We also show how it can be used in a simple algorithm that selects the best configurations by using only a selected subset of them and progressively modifying it throughout the software development process.
  • An Evaluation of Data Race Detectors Using Bug Repositories
    310 Evaluation of testing techniques and tools on real systems, not only toy problem, techniques and tools for automating test case design Jochen Schimmel, Korbinian Molitorisz, Walter F. Tichy, pages 1349 – 1352. Show abstract Abstract. Multithreaded software is subject to data races. A large number of data race detectors exist, but they are mainly evaluated on academic examples. In this paper we present a study in which we applied data race detectors to real applications. In particular, we want to show whether these tools can be used to locate data races effectively at an early stage of software development. We therefore tracked 25 data races in bug repositories back to their roots, created parallel unit tests, and executed 4 different data race detectors on these tests. We show that a combination of all detectors finds 92% of the contained data races, whereas the best single data race detector finds only about 50%.
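For readers unfamiliar with the defect class, a data race in its simplest form is an unsynchronized read-modify-write on shared state. The sketch below is illustrative only and unrelated to the detectors evaluated in the paper; it contrasts the racy pattern with its lock-protected fix.

```python
import threading

counter = 0
lock = threading.Lock()

def racy_increment(n):
    # Unsafe: "counter += 1" is a read-modify-write, so two threads can
    # both read the old value and one update is lost -- a data race.
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n):
    # Safe: the lock makes the read-modify-write atomic.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

def run(worker, n=100_000, threads=4):
    global counter
    counter = 0
    ts = [threading.Thread(target=worker, args=(n,)) for _ in range(threads)]
    for t in ts:
        t.start()
    for t in ts:
        t.join()
    return counter

print("safe:", run(safe_increment))   # always 400000
print("racy:", run(racy_increment))   # may lose updates on some interpreters
```

Whether the racy version actually loses updates depends on the interpreter's scheduling, which is exactly why such defects survive into bug repositories and why dynamic detectors that observe unsynchronized accesses, rather than waiting for a wrong result, are valuable.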
  • Test City metaphor as support for visual testcase analysis within integration test domain
    360 Evaluation of testing techniques and tools on real systems, not only toy problem Artur Sosnówka, pages 1353 – 1358. Show abstract Abstract. The majority of formal description of software testing in industry is conducted at the system or acceptance level; however, most formal research has been focused on the unit level. This paper presents formal test selection and analysis criteria for system and integration tests based on visualization analysis of low-level test cases. The visual analysis for low-level test case selection is based on inputs from an available test management system. The paper presents a use case for the Test City visual metaphor as a basis for analyzing the testware of an industrial test project.

International Workshop on Cyber-Physical Systems

  • Modelling Java Concurrency: An Approach and a Uppaal Library
    17 Applications of CPS, CPS Education, Validation and Verification Franco Cicirelli, Angelo Furfaro, Libero Nigro, Francesco Pupo, pages 1361 – 1368. Show abstract Abstract. The current trend in multi-core and many-core computing architectures, and the growing acceptance of Java for building embedded real-time computing systems, make the development of concurrent and timed systems a necessity in order to properly exploit the available computing resources. This work argues that, to effectively cope with the fundamental correctness issues of such systems, the use of formal tools is mandatory so as to avoid misuse of the synchronization primitives, which can easily lead to the well-known problems of deadlock, starvation, and so forth. This paper proposes an original approach to the modeling and exhaustive verification of Java-based concurrent systems which relies on the popular UPPAAL model checker. More precisely, a library of UPPAAL timed automata (TA) reproducing the semantics of the major Java concurrency and synchronization mechanisms was developed. All of this fosters a modeling process driven by implementation aspects, thus favoring a smooth transition from specification down to implementation. The library includes such common control structures as semaphores and monitors, both classic and Java-specific. The paper describes the developed TA library and shows its practical use by means of examples. Finally, an indication of on-going and future work directions is drawn in the conclusions.
  • Synthesis of Implementable Control Strategies for Lazy Linear Hybrid Automata
    115 Applications of CPS, Control Systems Luigi Di Guglielmo, Sanjit A. Seshia, Tiziano Villa, pages 1369 – 1376. Show abstract Abstract. In the last few years, hybrid automata have been widely applied in the modeling and verification of hybrid systems, but their related formal verification techniques usually rely on unimplementable assumptions to which a concrete control strategy cannot adhere. For this reason, once a hybrid model of the system has been proved correct with respect to the desired properties, it would be valuable to derive a correct-by-construction implementable control strategy for such a model. This work discusses a new methodology, and a corresponding tool-chain, for synthesizing an implementable control strategy for the class of hybrid automata named Lazy Linear Hybrid Automata (LLHA). LLHA model the discrete-time behavior of control systems containing finite-precision sensors and actuators interacting with their environment under bounded delays.
  • Towards deductive-based support for software development processes
    347 Validation and Verification Radosław Klimek, pages 1377 – 1380. Show abstract Abstract. The work relates two initial disciplines of the Rational Unified Process (RUP), i.e. Business Modeling and Requirements Engineering, to support them in an integrated way through deductive-based formal verification using temporal logic. On the other hand, Cyber-Physical Systems (CPS), which should be an effective orchestration of computations and physical processes, need careful development and formal verification to ensure they influence software reliability, trustworthiness and cost in a positive way. A method for building both business models and requirements models, including their logical specifications, is proposed and presented step by step. Applying the presented concepts bridges the gap between the benefits of deductive reasoning for correctness analysis and the difficulties in obtaining complete logical specifications.
  • Studying Interrelationships of Safety and Security for Software Assurance in Cyber-Physical Systems: Approach Based on Bayesian Belief Networks
    214 Applications of CPS, Control Systems Andrew J. Kornecki, Nary Subramanian, Janusz Zalewski, pages 1381 – 1387. Show abstract Abstract. The paper discusses mutual relationships of safety and security properties in cyber-physical systems (CPS). Generally, safety impacts the system’s environment, while the environment impacts the security of a CPS. Very frequently, safety and security of a CPS interact with each other either synergistically or conflictingly. Therefore, a combined evaluation of safety and security that considers their interrelationships is required for proper assessment of a CPS. Bayesian Belief Networks (BBN) can be used for this evaluation, where factors related to safety and security of a CPS are assumed to be randomly distributed. The result of this evaluation is an assessment that is non-deterministic in nature but gives a very good approximation of the actual extent of safety and security in a CPS. Using a case study of a SCADA-based system in oil pipeline control, the authors developed a BBN approach for assessing mutual impacts of security and safety violations. This approach is compared with the Non-Functional Requirements (NFR) approach, used previously, which is largely qualitative in nature. This study demonstrates that the BBN approach can significantly complement other techniques for joint assessment of safety and security in CPS.
  • Object-oriented Approach to Timed Colored Petri Net Simulation
    284 Interoperability, Scalability/Complexity, Validation and Verification Michał Kowalski, Wojciech Rząsa, pages 1389 – 1392. Show abstract Abstract. This paper presents the object-oriented design of a library for modeling and simulating Timed Colored Petri Net models. The approach makes it possible to integrate TCPN models with crucial parts of larger applications implemented in object-oriented languages. The formal models can be tightly joined with applications, allowing the latter to interpret states of the formal model in their domain of responsibility. This approach allows less error-prone and more pervasive use of formal methods to improve the quality of software created with imperative languages.
  • Interactive Verification of Cyber-physical Systems: Interfacing Averest and KeYmaera
    62 Validation and Verification Xian Li, Kerstin Bauer, Klaus Schneider, pages 1393 – 1400. Show abstract Abstract. Verification is one of the essential topics in research on cyber-physical systems. Due to the combination of discrete and continuous dynamics, most verification problems are undecidable and need to be dealt with by various kinds of abstraction techniques. As systems grow larger and larger, most verification problems are difficult even for purely discrete systems. One way to address this problem is the use of interactive verification. Recently, this approach has also been considered by cyber-physical verification tools like KeYmaera and other classical theorem provers.
    Important requirements for the interactive verification are a precise and readable modeling language as well as the possibility to decompose the system into smaller subsystems. Here, tools like KeYmaera and PVS still need further improvement. On the other hand, these modeling aspects are both addressed within the language Quartz as it provides a complete programming language for cyber-physical systems with standard data types and programming statements as well as a precise compositional semantics that is well-suited for compositional verification.
    In this paper, we take advantage of two different tools, the Averest system and KeYmaera, for the interactive verification of cyber-physical systems. This way, we combine the modeling and verification capabilities of Averest with the verification capability of KeYmaera, in order to provide a basis for a powerful tool set for the interactive verification of cyber-physical systems.
  • Inter-Domain Requirements and their Future Realisability: The ARAMiS Cyber-Physical Systems Scenario
    63 Applications of CPS, CPS Education, Interoperability Birgit Penzenstadler, Jonas Eckhardt, Wolfgang Schwitzer, Maria Victoria Cengarle, Sebastian Voss, pages 1401 – 1406. Show abstract Abstract. Systems whose functionality and services span multiple, interconnected application domains have become known as cyber-physical systems (CPS) and currently receive much attention in research and practice. So far, CPS still come with a variety of development-process-related and technical challenges. These challenges include the interaction between the different domain-specific systems and possible conflicts between their requirements, as well as the choice of appropriate modelling concepts.
  • Safety Analysis of Autonomous Ground Vehicle Optical Systems: Bayesian Belief Networks Approach
    354 Applications of CPS Daniel Reyes-Duran, Elliot Robinson, Andrew J. Kornecki, Janusz Zalewski, pages 1407 – 1413. Show abstract Abstract. Autonomous Ground Vehicles (AGV) require diverse sensor systems to support the navigation and sense-and-avoid tasks. Two of these systems are discussed in the paper: dual camera-based computer vision (CV) and laser-based detection and ranging (LIDAR). Reliable operation of these optical systems is critical to safety since potential faults or failures could result in mishaps leading to loss of life and property. The paper identifies basic hazards and, using Fault Tree (FT) analysis, the causes and effects of these hazards as related to LIDAR and CV systems. A Bayesian Belief Network (BN) approach supported by automated tool is subsequently used to obtain quantitative probabilistic estimation of system safety.
  • Towards the Applicability of Alf to Model Cyber-Physical Systems
    137 Interoperability Alessandro Gerlinger Romero, Klaus Schneider, Maurício Gonçalves Vieira Ferreira, pages 1415 – 1422. Show abstract Abstract. Systems engineers use SysML as a vendor-independent language to model Cyber-Physical Systems. However, SysML does not provide an executable form to define behavior, although this is needed to detect critical issues as soon as possible. The Action Language for Foundational UML (Alf) integrated with SysML can offer some degree of precision. In this paper, we present an Alf specialization that introduces the synchronous-reactive model of computation to SysML, through the definition of semantics that are not explicitly constrained: timing, concurrency, and inter-object communication. The proposed specialization is well-suited for safety-critical systems because it is deterministic. We study one example already modeled in the literature to compare these approaches with ours. The initial results show that the proposed specialization helps to cope with complexity, provides better composition, and enables deterministic behavior definition.
  • Improving security in SCADA systems through firewall policy analysis
    43 Cyber-security Ondrej Rysavy, Jaroslav Rab, Miroslav Sveda, pages 1423 – 1428. Show abstract Abstract. Many modern SCADA networks are connected to both the company’s enterprise network and the Internet. Because these industrial systems often control critical processes, cyber-security requirements become a priority in their design.
  • Development of a Cyber-Physical System for Mobile Robot Control using Erlang
    246 Applications of CPS, Control Systems Szymon Szomiński, Konrad Gądek, Michał Konarski, Bogna Błaszczyk, Piotr Anielski, Wojciech Turek, pages 1429 – 1436. Show abstract Abstract. The design of mobile robot control systems is a huge challenge, which requires solving issues related to concurrent hardware access and providing high availability. Existing solutions in the domain are based on technologies using low-level languages and the shared-memory concurrency model, which seems unsuitable for the task. In this paper, a different approach to the problem of building a cyber-physical system for mobile robot control is presented. It is based on the Erlang language and technology, which supports lightweight processes and fault tolerance mechanisms, and uses a message-passing concurrency model with built-in inter-process communication. The created system uses a new, open-source robotic platform, which had been designed for scientific and educational purposes. The integrated system has been tested in several scenarios, proving its flexibility, durability and high performance.

Performance of Business Database Applications

  • On Redundant Data for Faster Recursive Querying Via ORM Systems
    305 Automated database tuning, Extending the capabilities of object relational mappings Aleksandra Boniewicz, Piotr Wiśniewski, Krzysztof Stencel, pages 1439 – 1446. Show abstract Abstract. Persistent data of most business applications contain recursive data structures, i.e. hierarchies and networks. Processing such data stored in relational databases is not straightforward, since the relational algebra and calculus do not provide adequate facilities. Therefore, it is not surprising that the initial SQL standards did not contain recursion either. Although it was introduced by SQL:1999, even now it is implemented in only a few database management systems. In particular, one of the most popular DBMSs (MySQL) still does not support recursive queries. Numerous classes of queries can be accelerated using redundant data structures. Recursive queries form such a class. In this paper we consider four materialization solutions that speed up recursive queries. Three of them belong to the state of the art, while the fourth one is the contribution of this paper. The latter method assures that the required redundant storage is linearithmic; the other methods do not guarantee such a limitation. We also present a thorough experimental evaluation of all these solutions using data of various sizes, up to a million records. Since all these methods require writing complex code if applied directly, we have prototyped their integration into the Hibernate object-relational mapping system. This way all the peculiarities are hidden from application developers. Architects can simply choose the appropriate materialization method and record their decisions in configuration files. All necessary routines and storage objects are then generated automatically by the ORM layer.
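The trade-off between SQL:1999 recursion and redundant materialized structures can be sketched independently of any ORM. The following self-contained sketch (the employee hierarchy, table layout, and names are illustrative assumptions, not the paper's schema) contrasts a recursive query with a plain lookup against a precomputed transitive closure:

```python
import sqlite3

# Illustrative hierarchy stored relationally; both approaches below answer
# "all subordinates of employee 1".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, boss INTEGER)")
rows = [(1, None), (2, 1), (3, 1), (4, 2), (5, 4)]
conn.executemany("INSERT INTO emp VALUES (?, ?)", rows)

# SQL:1999 recursion (needs a DBMS that supports WITH RECURSIVE).
subs = conn.execute("""
    WITH RECURSIVE sub(id) AS (
        SELECT id FROM emp WHERE boss = 1
        UNION ALL
        SELECT e.id FROM emp e JOIN sub s ON e.boss = s.id)
    SELECT id FROM sub ORDER BY id""").fetchall()

# Redundant structure: a materialized transitive closure answers the same
# question with an ordinary, non-recursive SELECT.
conn.execute("CREATE TABLE closure (anc INTEGER, descn INTEGER)")
def materialize(emp_rows):
    parent = dict(emp_rows)
    pairs = []
    for node in parent:
        a = parent[node]
        while a is not None:          # walk up to every ancestor
            pairs.append((a, node))
            a = parent[a]
    return pairs
conn.executemany("INSERT INTO closure VALUES (?, ?)", materialize(rows))
subs2 = conn.execute(
    "SELECT descn FROM closure WHERE anc = 1 ORDER BY descn").fetchall()
assert subs == subs2  # both list employees 2, 3, 4, 5
```

The closure table is the kind of redundant storage the abstract discusses: queries become trivial, at the cost of extra space and maintenance on updates.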
  • Java Interface for Relaxed Object Storage
    268 Column-oriented DBMS, Indexing techniques Michal Danihelka, Michal Kopecký, Petr Švec, Michal Žemlička, pages 1447 – 1454. Show abstract Abstract. Most development tools manipulate objects by changing the values of their attributes. If an object should change more radically, problems arise. The amount of available information can vary from instance to instance and can be collected incrementally. It can happen that no class suitable for all known attributes exists, so even moving an instance to another class can be complicated. We can create an exhaustive number of classes to cover all predicted variants, but still other combinations of data can occur. To handle this situation, which often appears when processing heterogeneous and mutable data, the model of relaxed objects was invented. It is based on the idea that object classes should be defined loosely in the form of conditions - presumptions on data content or availability - and that instances should implicitly belong to all classes whose conditions are currently met. Methods associated with such classes ensure that each instance is provided with all currently executable methods, and its behavior changes dynamically with changes of its content. The paper describes the Java-based object interface for this model, its effectiveness, and the domain index suitable for efficient data searching.
  • Approximate Assistance for Correlated Subqueries
    278 Column-oriented DBMS, Usage of fuzzy sets and rough sets in databases Marcin Kowalski, Dominik Ślęzak, Piotr Synak, pages 1455 – 1462. Show abstract Abstract. We discuss some enhancements of approximate SQL extensions available in Infobright's database technology. We explain how these new enhancements can speed up execution of complex correlated subqueries, which are quite popular in advanced database applications. We compare our research to the state-of-the-art solutions in the area of analytic databases. We also show in what sense our technology follows the principles of rough sets and granular computing.
  • Performance Antipatterns of One to Many Association in Hibernate
    322 Extending the capabilities of object relational mappings Patrycja Węgrzynowicz, pages 1463 – 1469. Show abstract Abstract. Hibernate is the most popular ORM framework for Java. It is a straightforward and easy-to-use implementation of the Java Persistence API. However, its simplicity of usage often proves deceptive to developers and leads to serious performance issues in Hibernate-based applications. This paper presents five performance antipatterns related to the usage of one-to-many associations in Hibernate. These antipatterns focus on the problems of the owning side of collections, the Java types and annotations used in mappings, as well as the processing of collections. Each antipattern consists of the description of a problem along with a sample code, negative performance consequences, and the recommended solution. Performance is analyzed in terms of the number and complexity of issued database statements. The code samples illustrate how the antipatterns decrease performance and how to implement the mappings to speed up execution times.
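The cost metric used above, the number of issued database statements, can be illustrated without Hibernate itself. The sketch below (plain sqlite with a hypothetical parent/child schema, not the paper's sample code) simulates the well-known "N+1 select" effect that naive lazy loading of a one-to-many collection typically produces, and the single-join remedy that a fetch-join style mapping corresponds to:

```python
import sqlite3

# Hypothetical one-to-many schema: 5 parents, 4 children each.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE child (id INTEGER PRIMARY KEY, pid INTEGER)")
conn.executemany("INSERT INTO parent VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO child VALUES (?, ?)",
                 [(i, i % 5) for i in range(20)])

statements = 0
def run(sql, *args):
    """Execute a statement while counting how many were issued."""
    global statements
    statements += 1
    return conn.execute(sql, args).fetchall()

# Antipattern: one query for the parents, then one per parent for its
# children -- N+1 statements in total.
parents = run("SELECT id FROM parent")
for (pid,) in parents:
    run("SELECT id FROM child WHERE pid = ?", pid)
n_plus_1 = statements                  # 1 + 5 = 6 statements

# Remedy: a single join fetches everything in one statement.
statements = 0
run("SELECT p.id, c.id FROM parent p LEFT JOIN child c ON c.pid = p.id")
assert statements == 1 and n_plus_1 == 6
```

With 5 parents the difference is 6 statements versus 1; with thousands of parents the gap is what turns a mapping choice into a performance issue.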

4th Workshop on Advances in Programming Languages

  • Magnify - a new tool for software visualization
    392 Program analysis, optimization and verification, Programming tools and environments Cezary Bartoszuk, Grzegorz Timoszuk, Robert Dąbrowski, Krzysztof Stencel, pages 1473 – 1476. Show abstract Abstract. Modern software systems are inherently complex. Their maintenance is hardly possible without precise, up-to-date documentation. It is often tricky to document dependencies among software components by only looking at the raw source code. We address these issues by researching new software analysis and visualization tools. In this paper we focus on software visualization. Magnify is our new tool that performs static analysis and visualization of software. It parses the source code, identifies dependencies between code units and records all the collected information in a repository based on a language-independent graph-based data model. Nodes of the graph correspond to program entities of disparate granularity: methods, classes, packages etc. Edges represent dependencies and hierarchical structure. We use colours to reflect quality, sizes to display the importance of artefacts, and the density of connections to portray coupling. This kind of visualization gives a bird's-eye view of the source code. It is always up to date, since the tool generates it automatically from the current revision of the software. In this paper we discuss the design of the tool and present visualizations of sample open-source Java projects of various sizes.
  • Conjunction, Sequence, and Interval Relations in Event Stream Processing
    109 Samujjwal Bhandari, Susan D. Urban, pages 1477 – 1482. Show abstract Abstract. The conjunction operator can be augmented with temporal constraints to define an arbitrary pattern of events in event stream processing (ESP). However, using temporal constraints to specify patterns can be complex. This research has defined an operator hierarchy, where the top of the hierarchy defines the conjunction operator and the leaves of the hierarchy define more specific semantics associated with a sequence of events. The use of the specialized operators simplifies pattern expression and makes the sequence semantics clear. Furthermore, in an experimental study, patterns using operators from the hierarchy outperform patterns expressed using the conjunction operator with temporal constraints in run-time performance and memory requirements, further validating the usefulness of the operator hierarchy.
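The distinction the hierarchy draws can be sketched with two toy operators (illustrative only, not the paper's actual operator set): conjunction asks merely that all event types occur, while a specialized sequence operator additionally enforces ascending temporal order, which is the semantics that would otherwise need explicit temporal constraints:

```python
# Events are (type, timestamp) pairs.
def conjunction(pattern, events):
    """All pattern types occur, in any temporal order."""
    present = {t for t, _ in events}
    return all(p in present for p in pattern)

def sequence(pattern, events):
    """Pattern types occur in ascending temporal order."""
    # Classic subsequence test over events sorted by timestamp: the shared
    # iterator makes each 'any' resume where the previous match stopped.
    it = iter(sorted(events, key=lambda e: e[1]))
    return all(any(t == p for t, _ in it) for p in pattern)

events = [("B", 1), ("A", 2)]
assert conjunction(["A", "B"], events)   # order does not matter
assert not sequence(["A", "B"], events)  # A would have to precede B
assert sequence(["B", "A"], events)
```

Expressing "B then A" via `sequence` is both shorter and clearer than attaching a timestamp comparison to a conjunction pattern.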
  • Visual Programming of MPI Applications: Debugging and Performance Analysis
    173 Stanislav Böhm, Marek Běhálek, Ondřej Meca, Martin Šurkovský, pages 1483 – 1490. Show abstract Abstract. Our research is focused on the simplification of parallel programming for distributed memory systems. Our overall goal is to build a unifying framework for creating, debugging, profiling and verifying parallel applications. The key aspect is a visual model inspired by Colored Petri Nets. In this paper, we will present how to use the visual model for debugging and profiling as well. The presented ideas are integrated into our open source tool Kaira.
  • pLERO: Language for Grammar Refactoring Patterns
    163 Automata theory and applications, Compiling techniques, Grammarware and grammar based systems, Language theory and applications Ján Kollár, Ivan Halupka, Sergej Chodarev, Emília Pietriková, pages 1491 – 1498. Show abstract Abstract. Grammar-dependent software development and grammarware engineering have recently received considerable attention. As a significant cornerstone of grammarware engineering, grammar refactoring is nevertheless still weakly understood and practiced. In this paper, we address this issue by introducing pLERO, a formal specification language for preserving the knowledge of grammar engineers, which complements mARTINICA, a universal approach to automated refactoring of context-free grammars. With respect to other approaches, the advantage of mARTINICA lies in refactoring on the basis of a user-defined refactoring task, rather than a fixed objective of the refactoring process. To explain the unified refactoring process, this paper also provides a brief insight into grammar refactoring operators, which provide universal refactoring transformations for specific context-free grammars. To preserve knowledge concerning the refactoring process, we propose a formalism based on patterns, a well-proven way of knowledge preservation in a variety of domains, such as software architectures.
  • Incremental JIT Compiler for Implicitly Parallel Functional Language
    339 Compiling techniques, Language concepts, design and implementation, Virtual machines and just-in-time compilation Petr Krajča, pages 1499 – 1506. Show abstract Abstract. We present a novel method for automatic parallelization of functional programs which combines interpretation and just-in-time compilation. We propose an execution model for a Lisp-based programming language whose runtime environment is able to identify portions of code worth running in parallel and to spawn new threads of execution. Furthermore, in order to achieve better performance, the runtime environment dynamically identifies expressions worth compiling and compiles them into native code.
  • Reconstruction of Instruction Idioms in a Retargetable Decompiler
    153 Compiling techniques, Practical experiences with programming languages, Program analysis, optimization and verification, Program generation and transformation Jakub Křoustek, Fridolín Pokorný, pages 1507 – 1514. Show abstract Abstract. Machine-code decompilation is a reverse-engineering discipline focused on reverse compilation. It performs an application recovery from binary executable files back into the high level language (HLL) representation. One of its critical tasks is to produce an accurate and well-readable code. However, this is a challenging task since the executable code may be produced by one of the modern compilers that use advanced optimizations. One type of such an optimization is usage of so-called instruction idioms. These idioms are used to produce faster or even smaller executable files. On the other hand, decompilation of instruction idioms without any advanced analysis produces almost unreadable HLL code that may confuse the user of a decompiler. In this paper, we present a method of instruction-idioms detection and reconstruction back into a readable form with the same meaning. This approach is adapted in an existing retargetable decompiler developed within the Lissom project. The implementation has been tested on several modern compilers and target architectures. According to our experimental results, the proposed solution is highly accurate on the RISC (Reduced Instruction Set Computer) processor families, but it should be further improved on the CISC (Complex Instruction Set Computer) architectures.
  • Declarative Specification of References in DSLs
    342 Domain-specific languages, Language concepts, design and implementation, Metamodeling and modeling languages, Specification languages Dominik Lakatoš, Jaroslav Porubän, Michaela Bačíková, pages 1515 – 1522. Show abstract Abstract. The occurrence of identifiers and references in computer languages is a common issue. The same applies to domain-specific languages, whose popularity is increasing and whose design process therefore needs support. This paper analyses the problem of identifiers and references in computer languages. Current methods use an imperative approach to supporting references in languages; therefore a language designer is required to write reference resolving manually. The method proposed in this paper perceives references and identifiers as language patterns, which can be specified in a declarative manner with much less knowledge about the problem of resolving references in computer languages.
  • SimpleConcepts: Support for Constraints on Generic Types in C++
    302 Language concepts, design and implementation Reed Milewicz, Marjan Mernik, Peter Pirkelbauer, pages 1523 – 1528. Show abstract Abstract. Generic programming plays an essential role in C++ software through the use of templates. However, both the creation and use of template libraries is hindered by the fact that the language does not allow programmers to specify constraints on generic types. To date, no proposal to update the language to provide concepts has survived the committee process. Until that time comes, as a form of early support, this paper introduces SimpleConcepts, an extension to C++11 that provides support for concepts, sets of constraints on generic types. SimpleConcepts features are parsed according to an island grammar and source-to-source translation is used to lower concepts to pure C++11 code.
  • Concern-oriented Source Code Projections
    313 Programming paradigms (aspect-oriented, functional, logic, object-oriented, etc., Programming tools and environments Matej Nosáľ, Jaroslav Porubän, Milan Nosáľ, pages 1529 – 1532. Show abstract Abstract. The quality of a source code structure is a matter of point of view: one programmer might consider one structure the best, another might not. A concrete structure can help with program understanding in certain situations. Therefore we propose dynamic structuring, which allows assigning multiple structures to one source code to aid program comprehension. Concern-oriented source code projections facilitate this dynamic structuring, expressed by custom metadata, and provide multiple views of the source code that reflect the logical structures provided by the dynamic structuring. This way, in a specific situation, a programmer can get a structure (via a view) that best meets his/her current needs.
  • Teaching Programming through Problem Solving: The Role of the Programming Language
    344 Language concepts, design and implementation, Practical experiences with programming languages Nikolaos S. Papaspyrou, Stathis Zachos, pages 1533 – 1536. Show abstract Abstract. In this short paper, we advocate the importance of problem solving for teaching "Introduction to Programming", instead of merely teaching the syntax and semantics of a programming language. We focus on the role of the programming language used for an introductory course. For this purpose we propose CAL, a C-like algorithmic language, which is essentially a well-defined and behaved subset of C with a small number of modest, "educational" extensions. We present the design rationale for CAL, its main features, syntax and illustrative examples.
  • Compilation to Quantum Circuits for a Language with Quantum Data and Control
    349 Compiling techniques, Language concepts, design and implementation, Programming paradigms (aspect-oriented, functional, logic, object-oriented, etc. Yannis Rouselakis, Nikolaos S. Papaspyrou, Yiannis Tsiouris, Eneia N. Todoran, pages 1537 – 1544. Show abstract Abstract. In this paper we further investigate nQML, a functional quantum programming language that follows the "quantum data and control" paradigm. We define a semantics for nQML, which translates programs to quantum circuits in the category FQC of finite quantum computations, following the approach of Altenkirch and Grattage's QML. This semantics, which coincides with the denotational semantics for nQML over density matrices and unitary transformations, serves as a compiler from nQML programs to quantum circuits. We also provide an implementation of this compiler, written in Haskell, as well as an interpreter for quantum circuits.
  • Grammar-Driven Development of JSON Processing Applications
    287 Domain-specific languages, Grammarware and grammar based systems, Markup languages (XML), Programming tools and environments Antonio Sarasa-Cabezuelo, José-Luis Sierra, pages 1545 – 1552. Show abstract Abstract. This paper describes how to use conventional parser generation tools for the development of JSON processing applications. According to the resulting grammar-driven development approach, JSON processing applications are architected as syntax-directed translators. Thus, the core part of these components can be described in terms of translation schemata and can be automatically generated by using suitable parser generators. It makes it possible to specify critical parts of the application (those interfacing with JSON documents) by using high-level, grammar-oriented descriptions, as well as to promote the separation of JSON processing concerns from other application-specific aspects. In consequence, the production and maintenance of JSON processing applications is facilitated (especially for applications involving JSON documents with intricate nested structures, as well as for applications in which JSON formats are exposed to frequent changes and evolutions in their surface structures). This paper illustrates the approach with JSON-P as the generic JSON processing framework, with ANTLR as the parser generation tool, and with a case study concerning the development of a player for simple man-machine dialogs shaped in terms of JSON documents.
  • Alvis Language with Time Dependence
    235 Formal semantics and syntax, Metamodeling and modeling languages, Model-driven engineering languages and systems, Visual programming languages Marcin Szpyrka, Piotr Matyasik, Michał Wypych, pages 1553 – 1558. Show abstract Abstract. The paper presents the semantics of the time version of the Alvis modelling language. Alvis combines the possibilities of formal model verification with the flexibility and simplicity of practical programming languages. The considered time Alvis language is suitable for formal verification of real-time systems. The paper contains a description of the Alvis time model, states and transitions between states, and snapshot reachability graphs that represent model state spaces in the form of directed graphs.
  • Relaxing Queries to Detect Variants of Design Patterns
    317 Program analysis, optimization and verification, Programming tools and environments Patrycja Węgrzynowicz, Krzysztof Stencel, pages 1559 – 1566. Show abstract Abstract. Design patterns codify general solutions to frequently encountered design problems. They also facilitate writing robust and readable code. Their usage happens to be particularly profitable if the documentation of the resulting system is lost, inaccurate or out of date. In reverse engineering, detection of instances of design patterns is extremely helpful as it aids grasping high-level design ideas. However, the actual instances of design patterns can diverge from their canonical textbook templates. Useful pattern detection tools should thus be able to identify not only orthodox implementations but also their disparate variants. In this paper, we present a method to generate queries to detect canonical instances of design patterns. Next, we show a systematic technique to relax these queries so that they also cover variant implementations of patterns. We discuss our proof-of-concept implementation of this approach in our prototype tool D-CUBED. Finally, we also report the results of an experimental comparison of D-CUBED and state-of-the-art detectors.
  • FAL: A Forensics Aware Language for Secure Logging
    127 Domain-specific languages, Formal semantics and syntax, Grammarware and grammar based systems, Language concepts, design and implementation, Languages and tools for trustworthy computing, Program generation and transformation Shams Zawoad, Marjan Mernik, Ragib Hasan, pages 1567 – 1574. Show abstract Abstract. Trustworthy system logs and application logs are crucial for digital forensics. Researchers have proposed different security mechanisms to ensure the integrity and confidentiality of logs. However, applying current secure logging schemes on heterogeneous formats of logs is tedious. Here, we propose FAL, a domain-specific language (DSL) through which we can apply a secure logging mechanism on any format of logs. Using FAL, we can define log structure, which represents the format of logs and ensures the security properties of a chosen secure logging scheme. This log structure can be later used by FAL to serve two purposes: it can be used to store system logs securely, and it will help application developers for secure application logging by generating required source code.
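FAL itself is a DSL for describing log structures; the integrity property that the underlying secure logging scheme provides can be sketched separately. The following toy HMAC chain (the key handling and the concrete scheme are simplified assumptions, not FAL's actual mechanism) shows why altering or deleting any entry breaks verification of everything after it:

```python
import hashlib
import hmac

KEY = b"demo-key"  # illustrative; a real scheme manages keys carefully

def append(log, message):
    """Append an entry whose tag chains to the previous entry's tag."""
    prev = log[-1][1] if log else b"\x00" * 32
    tag = hmac.new(KEY, prev + message.encode(), hashlib.sha256).digest()
    log.append((message, tag))

def verify(log):
    """Recompute the chain; any tampered or dropped entry is detected."""
    prev = b"\x00" * 32
    for message, tag in log:
        expected = hmac.new(KEY, prev + message.encode(),
                            hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return False
        prev = tag
    return True

log = []
for msg in ["login alice", "read report", "logout alice"]:
    append(log, msg)
assert verify(log)
log[1] = ("read nothing", log[1][1])   # tamper with one entry
assert not verify(log)
```

A language like FAL would let such a scheme be applied uniformly to heterogeneous log formats by generating the storage and verification code from the declared log structure.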
  • Dynamic loop reversal - the new code transformation technique
    19 Compiling techniques, Program analysis, optimization and verification, Program generation and transformation Ivan Šimeček, Pavel Tvrdík, pages 1575 – 1582. Show abstract Abstract. In this paper, we describe a new source code transformation called dynamic loop reversal that can increase temporal and spatial locality. We also describe a formal method for predicting the cache behaviour and evaluate the accuracy of the model by measurements on a cache monitor. The comparisons of the numbers of measured cache misses and the numbers of cache misses estimated by the model indicate that the model is relatively accurate and can be used in practice.
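The locality argument behind loop reversal can be illustrated with a toy LRU cache model (the model, sizes, and miss counts are illustrative assumptions, not the authors' formal method): when an array larger than the cache is swept twice in the same direction, nothing survives between sweeps, whereas reversing the second sweep reuses exactly the elements touched last:

```python
from collections import OrderedDict

class LRU:
    """Tiny fully associative LRU cache model counting misses."""
    def __init__(self, size):
        self.size, self.lines, self.misses = size, OrderedDict(), 0
    def touch(self, addr):
        if addr in self.lines:
            self.lines.move_to_end(addr)          # hit: refresh recency
        else:
            self.misses += 1
            self.lines[addr] = True
            if len(self.lines) > self.size:
                self.lines.popitem(last=False)    # evict least recent

N, CACHE = 100, 10        # array much larger than the cache

def sweep(cache, order):
    for addr in order:
        cache.touch(addr)

plain, alternating = LRU(CACHE), LRU(CACHE)
sweep(plain, range(N))
sweep(plain, range(N))                  # forward, forward
sweep(alternating, range(N))
sweep(alternating, reversed(range(N)))  # forward, then reversed

assert plain.misses == 2 * N                  # no reuse between sweeps
assert alternating.misses == 2 * N - CACHE    # last CACHE elements reused
```

In this toy setting the reversed second sweep saves exactly one cache's worth of misses; the paper's contribution is a formal model that predicts such effects for real cache hierarchies.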
TeXnical Editor: Aleksander Denisiuk
E-mail:
Phone/fax: +48-89-5246089