Polish Information Processing Society

Federated Conference on Computer Science and Information Systems

September 18–21, 2011. Szczecin, Poland

Proceedings

ISBN 978-83-60810-34-4
IEEE Catalog Number CFP1185N-USB

Complete FedCSIS Proceedings (PDF, 67.968 MB)

Preface

6th International Symposium “Advances in Artificial Intelligence and Applications”

  • A Web Statistics based Conflation Approach to Improve Arabic Text Retrieval
    Farag Ahmed, Andreas Nürnberger, pages 3–9. Keywords: n-gram, conflation approach, mutual information.
    Abstract. We present a language-independent approach to conflation that does not depend on predefined rules or prior knowledge of the target language. The proposed unsupervised method is based on an enhancement of the pure n-gram model, which is used to group related words based on a revised string similarity measure. In order to detect and eliminate terms that are created by this process but are most likely not relevant for the query (“noisy terms”), an approach based on mutual information scores computed from web-based co-occurrence statistics is proposed.
  • Improving the predictiveness of ICU medical scales by the method of pairwise comparisons
    Mohammed Alqarni, Yassen Arabi, Tamar Kakiashvili, Mohammed Khedr, Waldemar W. Koczkodaj, J. Leszek, A. Przelaskowski, K. Rutkowski, pages 11–17. Keywords: medical scale, expert system, consistency-driven, pairwise comparisons, inconsistency analysis.
    Abstract. This study demonstrates how to improve the predictability of one of the commonly used ICU severity-of-illness scales, namely APACHE II, by using the consistency-driven pairwise comparisons (CDPC) method. From a conceptual view, there is little doubt that not all items have exactly equal importance or contribution in predicting the mortality risk of patients admitted to ICUs. Computing new weights for all individual items is a considerable step forward, since it is reasonable to assume that not all individual items contribute equally to predicting mortality risk. The obtained improvement in predictability is 1.6% (from 70.9% to 72.5%), and the standard error decreased from 0.046 to 0.045. This should be taken as an indication of the right way to go.
  • A rough k-means fragile watermarking approach for image authentication
    Lamiaa M. El Bakrawy, Neveen I. Ghali, Aboul ella Hassanien, Tai-hoon Kim, pages 19–23. Keywords: Watermarking, Rough Sets, Rough K-means clustering.
    Abstract. In the past few years, various fragile watermarking systems have been proposed for image authentication and tamper detection. In this paper, we propose a rough k-means fragile watermarking approach with a block-wise dependency mechanism which can detect any alterations made to the protected image. Initially, the input image is divided into blocks of equal size in order to improve image tamper localization precision. By combining image local properties with the human visual system, authentication data are acquired. By computing the class membership degree of each image block property, data are generated by applying rough k-means clustering to establish the relationship between all image blocks and to cluster them. The embedded watermark is carried by the least significant bits (LSBs) of each pixel within each block. The effectiveness of the proposed scheme is demonstrated through a series of simulations and experiments. Experimental results show that the proposed approach can embed the watermark without causing noticeable visual artifacts and not only achieves accurate tamper detection in images, but also recovers tampered regions effectively. In addition, the results show that the proposed approach can effectively thwart different attacks, such as the cut-and-paste attack and the collage attack, while sustaining superior tamper detection and localization accuracy.
  • Sparse PCA for gearbox diagnostics
    Anna Bartkowiak, Radosław Zimroz, pages 25–31. Keywords: principal component analysis, sparse PCA, reduction of dimensionality, sparse regression, vibration data, gearbox diagnostics.
    Abstract. The paper presents our experience in using sparse principal components (PCs) (Zou, Hastie and Tibshirani, 2006) for visualization of gearbox diagnostic data recorded for two bucket wheel field excavators, one in a bad and the other in a good state. The analyzed data had 15 basic variables. Our result is that two sparse PCs, based on 4 basic variables, yield a display similar to that of the classical first two PCs using all fifteen basic variables. Visualization of the data in Kohonen's SOMs confirms the conjecture that a smaller number of variables reproduces quite well the overall structure of the data. Specificities of the applied SPCA method are discussed.
  • CWJess: Implementation of an Expert System Shell for Computing with Words
    Elham S. Khorasani, Shahram Rahimi, Patel Purvag, Daniel Houle, pages 33–39. Keywords: Computing with Words, Expert System Shell, knowledge representation, Jess, fuzzy logic.
    Abstract. Computing with Words (CW) is an emerging paradigm in knowledge representation and information processing. It provides a mathematical model to represent the meaning of imprecise words and phrases in natural language and to perform reasoning on perceptual knowledge. This paper describes a preliminary extension to Jess, CWJess, which allows reasoning in the framework of Computing with Words. The resulting inference shell significantly enhances the expressiveness and reasoning power of expert systems and provides a Java API which allows users to express various types of fuzzy concepts, including fuzzy graphs, fuzzy relations, fuzzy arithmetic expressions, and fuzzy quantified propositions. CWJess is fully integrated with Jess and utilizes the Jess Rete network to perform a chain of reasoning on fuzzy propositions.
  • Visual Exploration of Cash Flow Chains
    Jerzy Korczak, Walter Łuszczyk, pages 41–46. Keywords: banking chain representation, visual exploration of databases, optimization of the visual representation of a graph, AML systems.
    Abstract. The paper proposes a new method for interactive visual exploration of chains of banking transactions to facilitate the discovery of patterns of money laundering operations. Facilitation mainly consists of displaying and annotating selected groups of operations in a graph showing the chains of transactions between accounts. It is shown how the volume of the chains surveyed can be reduced, programmatically and interactively, and how the analysis can be limited to the most important transactions. In order to improve the readability of the transaction graph, an evolution-based algorithm has been designed to optimize the visual representation of the graph of the transaction chain. The prototype system was verified on a real database of banking transactions. The conducted experiments have shown that user interaction with the system considerably accelerates the search process and enriches the functionality of the system.
  • Logical Inference in Query Answering Systems Based on Domain Ontologies
    Juliusz Kulikowski, pages 47–53. Keywords: query answering systems, domain ontologies, relations, relational equations.
    Abstract. This paper describes a proposal of using ontological models as a basis for the design of Query Answering Systems. It is assumed that the models are presented in the form of relations described on some classes of items (ontological concepts) specified by taxonomic trees. The sufficient and necessary conditions for obtaining replies to queries as solutions of relational equations, based on the data provided by ontological databases, are analyzed. Simple examples illustrate the basic concepts of practical realization of Query Answering Systems based on domain ontologies.
  • Competitive and self-contained gene set analysis methods applied for class prediction
    Henryk Maciejewski, pages 55–61. Keywords: sample classification, feature selection, pathway analysis, gene expression microarrays.
    Abstract. This paper compares two methodologically different approaches to gene set analysis applied to the selection of features for sample classification based on microarray studies. We analyze competitive and self-contained methods in terms of the predictive performance of features generated from the most differentially expressed gene sets (pathways) identified with these approaches. We also observe the stability of the features returned. We use the features to train several classifiers (e.g., SVM, random forest, nearest shrunken centroids). We generally observe smaller classification errors and better stability of features produced by the self-contained algorithm. This comparative study is based on the leukemia data set published in [S. Chiaretti, et al., “Gene expression profile of adult T-cell acute lymphocytic leukemia identifies distinct subsets of patients with different response to therapy and survival,” Blood, vol. 103, 2004, pp. 2771–2778].
  • Knowledge patterns for conversion of sentences in natural language into RDF graph language
    Rostislav Miarka, Martin Žáček, pages 63–68. Keywords: natural language processing, knowledge patterns, RDF graph language.
    Abstract. This paper describes knowledge patterns for the conversion of sentences in natural language into the RDF graph language. When creating a knowledge base in the RDF graph language from sentences expressed in natural language, one must convert the words of the sentences into nodes and arcs in RDF graphs. For this conversion, it is important to know which members of a sentence particular words represent. In this paper, knowledge patterns are proposed as a tool for the conversion of sentences. To capture knowledge patterns, one can use the extended RDF graph language; for the representation of knowledge patterns, a further extension of this language is proposed. The paper contains four examples of knowledge patterns and their use.
  • Graph Mining Approach to Suspicious Transaction Detection
    Krzysztof Michalak, Jerzy Korczak, pages 69–75. Keywords: money laundering detection, suspicious transaction detection, graph mining, inexact graph matching, graph structure learning.
    Abstract. Suspicious transaction detection is used to report banking transactions that may be connected with criminal activities. Obviously, perpetrators of criminal acts strive to make the transactions as innocent-looking as possible. Because activities such as money laundering may involve complex organizational schemes, machine learning techniques based on the analysis of individual transactions may perform poorly when applied to suspicious transaction detection. In this paper, we propose a new machine learning method for mining transaction graphs. The proposed method builds a model of subgraphs that may contain suspicious transactions. The model used in our method is parametrized using fuzzy numbers which represent parameters of the transactions and of the transaction subgraphs to be detected. Because money laundering may involve transferring money through a variable number of accounts, the model representing transaction subgraphs is also parametrized with respect to some structural features. In contrast to some other graph mining methods, in which graph isomorphisms are used to match data to the model, our method performs a fuzzy matching of graph structures.
  • Growing Hierarchical Self-Organizing Map for searching documents using visual content
    Paweł Myszkowski, Bartłomiej Buczek, pages 77–81. Keywords: hierarchical clustering, GHSOM, image clustering, image search, document search.
    Abstract. This paper presents a document search model based on visual content, using the hierarchical clustering algorithm GHSOM. The proposed model is described in terms of its learning and searching phases. Experiments on benchmark image sets (e.g., ICPR, Flickr) and a purpose-built document set are described. First results and directions for further research are given.
  • Automatic Image Annotation by Image Fragment Matching
    Mariusz Paradowski, Andrzej Śluzek, pages 83–89. Keywords: Image recognition, Image matching, Object localization.
    Abstract. The automatic image annotation problem is one of the most important sub-problems of computer vision. It is strongly related to, and goes beyond, image recognition. The key goal of annotation is to assign a set of words from a given dictionary to a previously unseen image. In this paper, we address two key problems related to image annotation: the first is the low precision of generated answers; the second is the automatic localization of image fragments related to annotations. The proposed method utilizes image fragment matching to precisely localize near-duplicate visual content of images.
  • How to predict future in a world of antibody-antigen chromosomes
    Zbigniew Pliszka, Olgierd Unold, pages 91–96. Keywords: Artificial immune system, Genetic algorithm, Binary coding, Hadamard representation.
    Abstract. The paper deals with a representation of antibody-antigen chromosomes. The proposed new binary decoding allows us to prove the dependence between subsequent generations of chromosomes using quick and simple operations on chromosome indices instead of processing the binary strings. Some formal properties of the immune system are expressed based on this representation.
  • The Fuzzy Genetic Strategy for Multiobjective Optimization
    Krzysztof Pytel, pages 97–101. Keywords: genetic algorithm, fuzzy logic, multiobjective optimization.
    Abstract. This paper presents the idea of fuzzy control of evolution in a genetic algorithm (GA) for multiobjective optimization. The genetic algorithm uses a Fuzzy Logic Controller (FLC), which manages the process of selecting individuals for the parents' pool and mutating their genes. The FLC modifies the probability of selection and of mutation of individuals' genes, so the algorithm achieves improved convergence while maintaining suitable genetic variety of individuals. We adopted the well-known LOTZ problem as a benchmark for the experiments, in which we investigated the operating time and the number of fitness function calls needed to finish the optimization. We compared the results of the elementary algorithms with those of the modified algorithm, which adjusts the probability of selection and mutation of individuals. Good results were obtained during the experiments.
  • New property for rule interestingness measures
    Izabela Szczęch, Salvatore Greco, Roman Słowinski, pages 103–108. Keywords: Rule interestingness measures, Properties of measures, Confirmation.
    Abstract. The paper considers interestingness measures for the evaluation of rules induced from data with respect to two properties: the property of Bayesian confirmation and property Ex1, concerning the behavior of measures in the case of entailment or refutation of the conclusion by the rule's premise. We demonstrate that property Ex1, even though created for confirmation measures, does not fully reflect the concept of confirmation. We propose a modification of this property, called weak Ex1, that deploys the concept of confirmation in its broader sense and allows one to escape paradoxes that might appear when using measures satisfying the original Ex1 property.
  • The Add-Value of Cases on WUM Plans Recommendation
    Cristina Wanzeller, Orlando Belo, pages 109–116. Keywords: Web Usage Mining, Case Based Reasoning, Clickstream Analysis, Data Mining Plans Generation.
    Abstract. Web Usage Mining is nowadays extremely useful to a diverse and growing number of users, from all types of organizations trying hard to reach the goals of their Web sites. However, inexperienced users in particular face several difficulties in developing and applying this kind of mining process. One crucial and challenging task is selecting proper mining methods to deal with clickstream data analysis problems. We have been engaged in designing, developing and implementing a case-based reasoning system specifically devoted to assisting users in knowledge discovery from clickstream data. The system's main aim is to recommend the mining plans best suited to the nature of the problem under analysis. In this paper we present this system, giving emphasis to the retrieval of similar cases using a preliminarily constructed case base.
  • Problem of website structure discovery and quality valuation
    Dmitrij Żatuchin, pages 117–122. Keywords: website structure, quality metric, automated structure discovery, website model, graph energy.
    Abstract. Navigation, as part of an interface, has always been an important issue in the design process. Because the information architecture and navigation of current websites are very complex, especially for e-commerce websites and information portals, it is very hard to analyze or redesign a structure manually. In order to automate website structure analysis, a model of the structure must be defined. During the study of the subject it was also found that there is a lack of a quality estimator which would allow the quality of the structure to be evaluated at various moments. Observation of structure quality makes it possible to analyze and decide when the structure should be changed, based on decision rules or thresholds calculated for the analyzed period of time. The main aim of this study is to describe a model for website structure representation, derive the quality estimator, and define and solve the problem of website structure discovery and quality valuation utilizing the proposed metric. Finally, an experiment employing the proposed methods is presented.

International Workshop on Artificial Intelligence in Medical Applications

  • Identification of Patient Deterioration in Vital-Sign Data using One-Class Support Vector Machines
    Lei Clifton, David Clifton, Peter Watkinson, Lionel Tarassenko, pages 125–131. Keywords: support vector machine, novelty detection, one-class classification, parameter optimisation, partial AUC.
    Abstract. Adverse hospital patient outcomes are often preceded by periods of physiological deterioration that are evident in the vital signs, such as heart rate, respiratory rate, etc. Clinical practice currently relies on periodic, manual observation of vital signs, which typically occurs every 2 to 4 hours in most hospital wards, and so patient deterioration may go unidentified. While continuous patient monitoring systems exist for those patients who are confined to a hospital bed, the false alarm rate of conventional systems is typically so high that the alarms generated by them are ignored. This paper explores the use of machine learning methods for automatically identifying patient deterioration, using data acquired from continuous patient monitors. We compare generative and discriminative techniques (a probabilistic method using a mixture model, and a support vector machine, respectively). It is well known that parameter tuning affects the performance of such methods, and we propose a method to optimise parameter values using the “partial AUC.” We demonstrate the performance of the proposed method using both synthetic data and patient vital-sign data collected from a recent observational clinical study.
  • Data Mining Research Trends in Computerized Patient Records
    Payam Homayounfar, Mieczyslaw Owoc, pages 133–139. Keywords: data mining, computerised patient records, healthcare systems.
    Abstract. Over the last decades, research on Data Mining has made great progress. Computerized Patient Records (CPR), as part of Hospital Information Systems, have also improved in terms of usability, content coverage and diffusion rate, and the number of Health Care Organizations using CPR is growing. This creates a need for techniques and models that provide solutions for decision making based on the data stored from different sources in CPR. This paper provides an overview of current research trends in data mining and shows their impact on the medical domain and on CPR.
  • A Bezier Curve Approximation of the Speech Signal in the Classification Process of Laryngopathies
    Krzysztof Pancerz, Jaroslaw Szkola, Jan Warchol, pages 141–146. Keywords: computer-based clinical decision support, recurrent neural networks, laryngopathies, Bezier curves, approximation.
    Abstract. The research concerns computer-based clinical decision support for laryngopathies. The classification process is based on a speech signal analysis in the time domain using recurrent neural networks. In our experiments, we use a modified Elman-Jordan neural network. In the preprocessing step, the original signal is approximated using Bezier curves, and then the neural network is trained. The Bezier curve approximation reduces the amount of data to be learned and removes noise from the original signal.
  • Validation of Data Categorization Using Extensions of Information Systems: Experiments on Melanocytic Skin Lesion Data
    Krzysztof Pancerz, Grzegorz Owsiany, Łukasz Piątek, pages 147–151. Keywords: extensions of information systems, data categorization, rough sets, melanocytic skin lesions.
    Abstract. The purpose of data categorization is to group similar cases (items, examples, objects, etc.) together under a common label so that information can be acted upon in aggregate form. Sometimes this process is carried out arbitrarily by an expert: for each case, the expert determines a class (group) to which the case is classified. In the paper, we propose a method for validating the categorization process. The method is based on extensions of information systems defined in terms of rough sets. The usefulness of the proposed method is shown for the data used in the synthesis of images of melanocytic skin lesions.
  • Interval based attribute evaluation algorithm
    Mostafa Salama, Nashwa El-Bendary, Aboul Ella Hassanien, Kenneth Revett, Aly A. Fahmy, pages 153–156. Keywords: Feature extraction, Classification, ChiMerge.
    Abstract. Attribute values may be either discrete or continuous. Attribute selection methods for continuous attributes have to be preceded by a discretization method to act properly, and the resulting accuracy or correctness depends greatly on the discretization method. This paper, however, proposes an attribute selection and ranking algorithm that does not require such a technique. The proposed algorithm is based on the hypothesis that the smaller the overlap between the intervals of values for each class label, the greater the importance of the attribute. This hypothesis was validated by comparing the results of the proposed algorithm with those of other attribute selection algorithms. The comparison between the different attribute selection algorithms is based on the characteristics of relevant and irrelevant attributes and their effect on classification performance. The results show that the proposed attribute selection algorithm leads to better classification performance than the other methods. The test is applied to medical data sets that represent real-life continuous data.
  • Medical Image Segmentation Using Information Extracted from Deformation
    Kai Xiao, Neven Ghali, Aboul Ella Hassanien, pages 157–163. Keywords: pattern recognition segmentation, Medical image analysis, Deformation.
    Abstract. Deformation of normal structures in medical images has usually been considered an undesired and challenging issue to be tackled in segmentation and registration tasks. By treating deformation as useful information, this paper proposes an approach that utilizes the correlation between lateral ventricular deformation and tumors to improve tumor segmentation accuracy in human magnetic resonance (MR) images. With this information, comparative experiments using pattern recognition segmentation methods show improved tumor segmentation accuracy in some image cases.
  • Discovering similarities for the treatments of liver specific parasites
    Pinar Yildirim, Kagan Ceken, Osman Saka, pages 165–168. Keywords: Biomedical Text Mining, Clustering Analysis, Liver, Parasite.
    Abstract. Medline articles are rich resources for discovering hidden knowledge about the treatment of liver-specific parasites. Knowledge acquisition from these articles requires complex processes depending on biomedical text mining techniques. In this study, named entity recognition and hierarchical clustering techniques were used for advanced drug analyses. Drugs were extracted from articles belonging to specific time periods, and hierarchical clustering was applied to parasite and drug datasets. The hierarchical clustering results revealed that some parasites are similar in terms of treatment while others differ. Our results also showed that there have not been major changes in the treatment of liver-specific parasites over the past four decades and that there are problems associated with the development of new drugs. Both pharmaceutical initiatives and healthcare providers should investigate the major drawbacks and develop strategies to overcome these problems.
  • Reliability Analysis of Healthcare System
    Elena Zaitseva, Vitaly Levashenko, Miroslav Rusin, pages 169–175. Keywords: Reliability, Multi-State System, Importance Analysis, Logical Differential Calculus, Healthcare system.
    Abstract. Modern systems are complex and include different types of components, such as software, hardware and human factors. Reliability is a principal property of such systems. Importance analysis is one of the approaches used in reliability engineering; its application to a healthcare system is considered in this paper. Importance reliability analysis allows estimating the influence of every healthcare system component on the system's reliability, functioning and failure.

1st International Workshop on Advances in Semantic Information Retrieval

  • Fuzzy Cognitive Maps Theory for the Political Domain
    Sameera Alshayji, Nahla Elzant Elkadhi, Zidong Wang, pages 179–186. Keywords: Fuzzy Logic, Protégé, Ontology, Based Ontology, Governmental System.
    Abstract. The acceleration of regional and international events contributes to the increasing challenges in political decision making, especially decisions to strengthen bilateral economic relationships between friendly nations; this is obviously one of the critical decisions. Typically, such decisions are influenced by certain factors and variables that are based on heterogeneous and vague information. A serious problem that the decision maker faces is the difficulty of building efficient political decision support systems (DSS) with heterogeneous factors. The basic concept is that of a linguistic variable whose values are words rather than numbers, and therefore closer to human intuition. Fuzzy logic is based on natural language and is tolerant of imprecise data; furthermore, fuzzy cognitive mapping (FCM) is particularly applicable in soft knowledge domains like political science. In this paper, an FCM scheme is proposed to demonstrate the causal interrelationships between certain factors and to provide insight into and a better understanding of the interdependencies of these factors.
  • Building a Model of Disease Symptoms Using Text Processing and Learning from Examples
    Marek Jaszuk, Grazyna Szostek, Andrzej Walczak, Leszek Puzio, pages 187–194. Keywords: natural language processing, semantic model, disease symptoms, clusterization.
    Abstract. The paper describes a methodology for building a semantic model of disease symptoms. The fundamental techniques used for creating the model are text analysis and learning from examples. The text analyser is used for extracting a set of symptom descriptions. The descriptions are the foundation for delivering a user interface necessary for collecting patient cases. Given the cases, a semantic model is built through clusterisation and statistical analysis of the cases. This approach to creating the model eliminates the need for direct model manipulation, because the meaning is retrieved from associations with diseases instead of a purely linguistic interpretation of symptom descriptions. Detection of synonyms is also completely automated.
  • Query Expansion: Term Selection using the Semantic Relatedness Measure
    Vitaly Klyuev, Yannis Haralambous, pages 195–199. Keywords: semantic similarity, query expansion, information retrieval.
    Abstract. In this paper, we investigate the effectiveness of a semantic relatedness measure in a cross-lingual retrieval task. This measure combines the Wikipedia-based Explicit Semantic Analysis measure, the WordNet path measure and the mixed collocation index. In our experiments, we utilized the open source search engine Terrier as a tool to index and retrieve data. We tested the proposed techniques on the NTCIR data collection. Our experiments demonstrated promising results.
  • LTIMEX: Representing the Local Semantics of Temporal Expressions
    Paweł Mazur, Robert Dale, pages 201–208. Keywords: temporal expression, local semantics, LTIMEX, semantic representation, annotation scheme, temporal tagging, TIMEX2, TimeML.
    Abstract. Semantic information retrieval requires that we have a means of capturing the semantics of documents; and a potentially useful feature of the semantics of many documents is the temporal information they contain. In particular, the temporal expressions contained in documents provide important information about the time course of the events those documents describe. Unfortunately, temporal expressions are often context-dependent, requiring the application of information about the context in order to work out their true values. We describe a representational formalism for temporal information that captures what we call the local semantics of such expressions; this permits a modularity whereby the context-independent contribution to meaning can be computed independently of the global context of interpretation, which may not be immediately or easily available. Our representation, LTIMEX, is intended as an extension to the widely-used TIMEX2 and TimeML representations.
  • Dependency-Based Rules for Grammar Checking with LanguageTool
    Maxim Mozgovoy, pages 209–212. Keywords: grammar checking, natural language parsing, text processing.
    Abstract. This paper describes a possible extension of the well-known open source grammar checking software LanguageTool. The proposed extension allows developers to write grammar rules that rely on dependency trees supplied by a natural language parser. Such rules are indispensable for the analysis of word-word links in order to handle a variety of grammar errors, including improper use of articles, incorrect verb government, and wrong word form agreement.
  • Preserving pieces of information in a given order in HRR and GA$_c$
    Agnieszka Patyk-Łońska, pages 213–220. Keywords: distributed representations, geometric algebra, HRR, BSC, word order, trajectory, associations, bag of words.
    Abstract. Geometric Analogues of Holographic Reduced Representations (GA HRR, or GA$_c$, the continuous version of the discrete GA described in [A. Patyk, “Geometric Algebra Model of Distributed Representations”, in Geometric Algebra Computing in Engineering and Computer Science, E. Bayro-Corrochano and G. Scheuermann, eds. Berlin: Springer, 2010. Preprint arXiv:1003.5899v1 (cs.AI)]) employ role-filler binding based on geometric products. Atomic objects are real-valued vectors in $n$-dimensional Euclidean space, and complex statements belong to a hierarchy of multivectors. A property of GA$_c$ and HRR studied here is the ability to store pieces of information in a given order by means of trajectory association. We describe the results of an experiment: finding the alignment of items in a sequence without precise knowledge of the trajectory vectors. The paper ends with remarks on prospective applications of geometric algebra to quantum algorithms.
  • Some tests on geometric analogues of Holographic Reduced Representations and Binary Spatter Codes
    Agnieszka Patyk-Łońska, Marek Czachor, Diederik Aerts, pages 221–228. Keywords: distributed representations, geometric algebra, HRR, BSC, scaling.
    Abstract. Geometric Analogues of Holographic Reduced Representations (GA HRR) employ role-filler binding based on geometric products. Atomic objects are real-valued vectors in $n$-dimensional Euclidean space, and complex statements belong to a hierarchy of multivectors. The paper reports a battery of tests aimed at comparing GA HRR with Holographic Reduced Representations (HRR) and Binary Spatter Codes (BSC). Firstly, we perform a test of GA HRR analogous to the one proposed by Plate in [T. Plate, Holographic Reduced Representation: Distributed Representation for Cognitive Structures. CSLI Publications, Stanford, 2003]. Plate's simulation involved several thousand 512-dimensional vectors stored in clean-up memory. The purpose was to study the efficiency of HRR, but also to provide a counterexample to claims that role-filler representations do not permit one component of a relation to be retrieved given the others. We repeat Plate's test on a continuous version of GA HRR, GA$_c$ (as opposed to its discrete version described in [A. Patyk, “Geometric Algebra Model of Distributed Representations”, in Geometric Algebra Computing in Engineering and Computer Science, E. Bayro-Corrochano and G. Scheuermann, eds. Berlin: Springer, 2010. Preprint arXiv:1003.5899v1 (cs.AI)]) and compare the results with the original HRR and BSC. The object of the test is to construct statements concerning multiplication and addition. For example, “$2\cdot 3=6$” is constructed as $times_{2,3} = times + operand \ast (num_2 + num_3) + result \ast num_6$. To look up this vector, one then constructs a similar statement with one of the components missing and checks whether it points correctly to $times_{2,3}$. We concentrate on comparing the recognition percentage of the three models for comparable data sizes, rather than on the time taken to achieve a high percentage. Results show that the best models for storing and recognizing multiple similar statements are GA$_c$ and Binary Spatter Codes, with recognition percentages well above 90%.

Workshop on Computational Optimization

  • Task Scheduling with Restricted Preemptions
    Tomasz Baranski, pages 231–238. Keywords: scheduling, preemptions, heuristic.
    Abstract. One of the basic problems in task scheduling is finding the shortest schedule for a given set of tasks. In this paper we analyze a restricted version of the general preemptive scheduling problem, where tasks can only be divided into parts at least as long as a given parameter $k$. We introduce a heuristic scheduling method, TSRP3. The number of processors $m$, the number of tasks $n$ and the task lengths $p$ are assumed to be known. If $n \geq 2m$ and $k$ is sufficiently small, TSRP3 finds the shortest possible schedule with $O(m)$ divisions in polynomial time.
  • A Branch-and-Cut-and-Price Algorithm for a Fingerprint-Template Compression Application
    Andreas Chwatal, Corinna Thöni, Karin Oberlechner, Günther Raidl, pages 239–246. Keywords: Branch-and-Cut-and-Price, Column Generation, Combinatorial Optimization, Data Compression.
    Abstract. In this work we present a branch-and-cut-and-price algorithm for the compression of fingerprint minutiae templates, in order to embed them into passport images by watermarking techniques as an additional security feature. For this purpose the minutiae data, which is the set of characteristic points of the fingerprint, is encoded by a spanning tree whose edges are encoded efficiently by a reference to an element of a dictionary (template arc) and a small correction. Our proposed integer linear programming algorithm creates meaningful template arcs from a huge set of possible ones on demand in the pricing procedure. Cutting planes are separated in order to obtain connected subgraphs from which spanning trees can then be easily deduced. Our computational experiments confirm the superior performance of the algorithm compared with previous approaches.
  • Improved asymptotic analysis for SUMT methods
    Jean-Pierre Dussault, pages 247–253. Keywords: SUMT, Penalty and barrier methods, Asymptotic convergence.
    Abstract. We consider the SUMT (Sequential Unconstrained Minimization Technique) method using extrapolations to link successive unconstrained sub-problems. The case where the extrapolation is obtained by a first-order Taylor estimate and Newton's method is used as a correction in this predictor-corrector scheme was analyzed in [Du95]. It yields a two-step super-linear asymptotic convergence with limiting order $\frac{4}{3}$ for the logarithmic barrier and order two for the quadratic loss penalty. We explore both lower-order variants (approximate extrapolation and correction computations) and higher-order variants using Taylor estimates of second order and beyond. First, we address inexact solutions of the linear systems arising within the extrapolation and Newton correction steps. Depending on the inexactness allowed, the asymptotic convergence order is reduced, more severely so for interior variants. Second, we investigate the use of higher-order path-following strategies in those methods. We consider the approach based on a high-order expansion of the so-called central path, somewhat reminiscent of Chebyshev's third-order method and its generalizations. The use of a higher-order representation of the path yields spectacular improvement in the convergence properties, even more so for the interior variants.
  • Numerical Assessment of Finite Difference Time Domain (FDTD) and Complex-Envelope Alternating-Direction-Implicit Finite-Difference-Time-Domain (CE-ADI-FDTD) Methods
    Gebriel Gannat, pages 255–260. Keywords: assessment, performance, FDTD, CE-ADI-FDTD, Photonics.
    Abstract. A thorough numerical assessment of the FDTD and CE-ADI-FDTD methods has been carried out based on a basic single-mode plane optical waveguide structure. The structure parameters have been modified and the impact on the performance of both numerical methods is investigated.
  • Performing Conjoint Analysis within a Logic-based Framework
    Adrian Giurca, Ingo Schmitt, Daniel Baier, pages 261–268. Keywords: conjoint analysis, quantum logic, information retrieval, machine learning, decision rules, user preferences.
    Abstract. Conjoint Analysis (CA) is heavily used in many different areas, from mathematical psychology, economics and marketing to sociology, transportation and medicine, in trying to understand how individuals evaluate products and services, as well as in predicting behavioral outcomes, by using statistical methods and techniques. Nowadays there is not much agreement about best practice, which in turn has led to many flavors of CA being proposed and applied. The goal of this paper is to offer a solution for performing Adaptive Conjoint Analysis inside CQQL, a quantum-logic-based information framework. We describe an algorithm to compute a logical CQQL formula capturing user preferences and use this formula to derive decision rules.
  • Extending the definition of beta-consistent biclustering for feature selection
    Antonio Mucherino, pages 269–274. Keywords: supervised classifications, feature selection, beta-consistent biclustering, combinatorial optimization, variable neighborhood search.
    Abstract. Consistent biclusterings of sets of data are useful for solving feature selection and classification problems. The problem of finding a consistent biclustering can be formulated as a combinatorial optimization problem, and it can be solved by employing a recently proposed VNS-based heuristic. In this context, the concept of beta-consistent biclustering has been introduced for dealing with noisy data and experimental errors. However, the given definition of beta-consistent biclustering is coherent only when sets containing non-negative data are considered. This paper extends the definition of beta-consistent biclustering to negative data and shows, through computational experiments, that the new definition allows better classifications to be performed on well-known test problems.

International Workshop on Advances in Business ICT

  • Formal Verification of Business Processes Represented as Role Activity Diagrams
    Amelia Badica, Costin Badica, pages 277–280. Keywords: business process, formal model, formal verification, process algebra, temporal logic.
    Abstract. Formal modeling and verification are established software engineering methods with applications to the modeling and verification of software specifications. These methods are currently well supported by model checking technologies. Business process modeling is an important activity during the requirements analysis and specification of a business software system. Checking qualitative aspects of business processes is usually required for quality assurance, as well as for compliance with non-functional requirements. In this paper we focus on formal models of business processes represented as role activity diagrams and show how these models can be formally checked using available formal modeling and verification tools based on process algebras and temporal logics.
  • Virtualization as an approach in the development of IT system implementation process
    Iwona Chomiak-Orsa, Wiesława Gryncewicz, Maja Leszczyńska, pages 281–285. Keywords: virtualization, IS implementation, IS development.
    Abstract. Virtual administration of IT system implementation processes is now possible in small and micro-companies, characterized by relative simplicity and marked recurrence of business processes. The popularity of this approach to implementation is largely due to the wide availability of IT solutions offering remote administration of authorized IT resources. The virtual form of implementation offers a significant reduction of both cost and time compared with the traditional approach. Consequently, it seems reasonable to expect further development of this trend, addressing larger economic entities and servicing more complex IT systems.
  • An architecture of a Web recommender system using social network user profiles for e-commerce
    Damian Fijałkowski, Radosław Zatoka, pages 287–290. Keywords: recommender system, service-oriented architecture, social network.
    Abstract. In this paper we propose a concept of a web e-commerce system that collects data from the social network profiles of its users and uses it in the process of making recommendations. This architecture modeling approach was developed within a project for a mashup Web application that integrates with the Facebook API. We describe which data could be obtained from Facebook, propose a way to store it, and suggest how the information from a user profile could improve the effectiveness of an e-commerce recommender system.
  • Geospatial presentation of purchase transactions data
    Maciej Grzenda, Krzysztof Kaczmarski, Mateusz Kobos, Marcin Luckner, pages 291–296. Keywords: spatial data, temporal data, business intelligence.
    Abstract. This paper presents a simple automatic system for small and medium-sized Internet companies selling goods. The system combines temporal sales data with its geographical location and presents the resulting information on a map. Such an approach to data presentation should facilitate understanding of the sales structure. This insight might be helpful in generating ideas for improving the sales strategy, and consequently the revenues of the company. The system is flexible and generic: it can be adjusted to process and present the data within different levels of administrative division areas, using different hierarchies of sold goods. While describing the system, we also present its prototype, which visualizes the data in an interactive way on a three-dimensional map.
  • Explaining MCDM acceptance: a conceptual model of influencing factors
    Martina Maida, Konradin Maier, Nikolaus Obwegeser, Volker Stix, pages 297–303. Keywords: Technology Acceptance, Multi Criteria Decision Making, Decision Support.
    Abstract. The number of newly developed Multi-Criteria Decision Making (MCDM) methods has grown considerably in the last decades. Although their theoretical foundations are solid, there is still a lack of acceptance and application in the practical field. The objective of this research is the development of a conceptual model of factors that influence MCDM acceptance, serving as a starting point for further research. For this purpose, a broad, diversified literature survey was conducted in the discipline of technology adoption and related topics (like human-computer interaction), with a special focus on MCDM acceptance. The constructs collected within the literature survey were classified based on a qualitative approach, which yielded a conceptual model structuring the identified factors according to individual, social, technology-related, task-related and facilitating aspects.
  • A Context-Aware Mobile Accessible Electric Vehicle Management System
    Nils Masuch, Marco Luetzenberger, Sebastian Ahrndt, Axel Hessler, pages 305–312. Keywords: Mobile environments, Mobile commerce, Distributed system, Web-based services, Electric vehicles, Charging station.
    Abstract. In the coming years, the German traffic situation will undergo a challenging addition. Major car manufacturers have scheduled the year 2013 as the cutoff for electric mobility. Yet, current studies indicate that range limitations and insufficient charging infrastructure endanger the acceptance of electric vehicles (EV). This is regrettable not only for the producers, but also for less obvious parties such as local energy providers, which consider electric vehicles a remedy to one of their most severe problems, managing the grid load balance. In this paper we introduce a mobile accessible EV management system which accounts for the mobility of the user and also integrates web-based (commercial) services of third parties. Our objective is to counter the limitations of electric mobility and to facilitate all of its (business) perspectives. We want to render electric mobility a success and support its trendsetting character.
  • NotX Service Oriented Multi-platform Notification System
    Filip Nguyen, Jaroslav Škrabálek, pages 313–316. Keywords: information system, SOA, NotX, CEP, SMS, voice, phone, mail, notification, enterprise, Java, ActiveMQ, JMS, JEE, J2EE.
    Abstract. This report describes NotX, a service-oriented system that applies the ideas of CEP and SOA to build a highly reusable, flexible, platform- and protocol-independent solution. NotX is capable of notifying the users of a superior information system via various engines, currently an SMS engine, a voice synthesizer (call engine) and a mail engine. Its adaptable design makes it possible to easily extend NotX with new capabilities; engines are added as plug-ins written in Java, and there are plans to further extend NotX with Facebook, Twitter and content management system engines. The design of NotX also allows users to be notified in their own language, with the full localization support that is necessary to bring value in today's market. Most importantly, the core design of NotX allows it to run under heavy load comprising thousands of notification requests per second via various protocols (currently Thrift, Web Services and a Java client). NotX is thus designed to be used by state-of-the-art enterprise applications that by default require certain properties of their external systems, such as scalability, reliability and fail-over.
  • Commonality in Various Design Science Methodologies
    Lukasz Ostrowski, Markus Helfert, pages 317–320. Keywords: Design Science Methodology, Commonality in Design Science, Design Science Artefact.
    Abstract. Based on a review of the foremost literature, the paper discusses various design science research methodologies along with their case studies. It concentrates on the activities (tools, methods, actions) that are used while constructing an artefact per se. We have identified common activities occurring across “design” steps which were not indicated in the corresponding methodologies. Combining them and drawing on this finding, we propose the concept of a reference model, which gives more insight and additionally reduces the high level of abstraction in design science.
  • A Hybrid Algorithm for Detecting Changes in Diagnostic Signals Received From Technical Devices
    Tomasz Pełech-Pilichowski, pages 321–327. Keywords: Event detection, time series processing, distance measures, short-term prediction.
    Abstract. In this paper, a hybrid two-level algorithm for detecting changes in diagnostic signals received from multiple technical devices is presented. The research is aimed at identifying changes, deviations or patterns (events) which occur in one selected signal, through concurrent processing of diagnostic signals (the proposed algorithm is adjusted to omit concurrent and time-lagged changes). In the first stage, detection is based on non-stationarity detection with short-term prediction comparison. In the second stage, a dedicated distance-like measure is employed. Detection results obtained for sample random signals, including simulated large deviations, are presented.
  • Adapting Scrum for Third Party Services and Network Organizations
    Lukasz D. Sienkiewicz, Leszek A. Maciaszek, pages 329–336. Keywords: Network Organization, Third Party Services, Holon, Holarchy, Scrum, software development model, Agile, Key Performance Indicators.
    Abstract. A large number of scientific publications and press releases demonstrate that organizations are successfully adopting the Scrum software development method in almost all areas. Nevertheless, the traditional Scrum method is not sufficient for managing work in Network Organizations, where Third Party Service providers may know nothing about Scrum. This paper describes the findings of a field study that explores Scrum in Network Organizations. We extended the Scrum core roles and proposed changes in Scrum artifacts that help in adapting the Scrum method to work in Network Organizations, where change and high competition are the cornerstone of the entire process.
  • Extending the Descartes Specification Language Towards Process Modeling
    Joseph Urban, Vinitha Hannah Subburaj, Lavanya Ramamoorthy, pages 337–340. Keywords: software specification, formal methods, software process model.
    Abstract. With current complex real-time software problems, the need for reliable software specification becomes crucial. This paper overviews the use of formal methods to specify requirements and the advantage of using an executable formal specification language processor to develop a process model for the development of a software system. The paper presents how a software process can be described using the Descartes specification language, an executable specification language, and the language extensions made to Descartes to make it suitable for describing a software process.
  • Influence of search engines on customer decision process
    Marek Zgódka, pages 341–344. Keywords: customer decision process, information search, search engine.
    Abstract. This article summarizes the customer decision process, focusing on information search. It explains the role and use of the communication channels employed during information search, particularly the internet, and describes the role of search engines in this phase. It provides a better understanding of the ways in which search engines support the customer decision process, such as reducing information search costs, enabling higher involvement in the search process and increasing the ability to search for information. It also identifies some possible disadvantages, like information irrelevancy or the invisible web. The paper aims to identify the influence of search engines on the information search phase of the customer decision process.

1st International Workshop on Interoperable Healthcare Systems—Challenges, Technologies, and Trends

  • The Intersection of Clinical Decision Support and Electronic Health Record: A Literature Review
    Hajar Kashfi, pages 347–353. Keywords: Clinical decision support system, Electronic health record, clinical standards, openEHR.
    Abstract. Aim: It is observed that clinical decision support (CDS) and electronic health records (EHR) should be integrated so that their contribution to improving the quality of health care is enhanced. In this paper, we present results from a review of the related literature. The aim of this review was to find out to what extent CDS developers have actually considered EHR integration in developing CDS. We have also investigated how various clinical standards are taken into account by CDS developers. Methods: The ScienceDirect database was searched for related studies. The search yielded a final collection of 25 studies. Relevance criteria included (i) discussing the development of CDS or an EHR with CDS services and (ii) taking the integration of CDS into EHRs into account. Results: It was observed that there are few CDS development projects where EHR integration is taken into account. Also, the number of studies where various clinical standards are taken into consideration in developing CDS is surprisingly low, especially for openEHR, the EHR standard we aimed for. The reasons for the low adoption of openEHR are issues such as complex and huge specifications, shortcomings in educational aspects, low empirical focus and low support for developers. Conclusion: There is a need for further investigation to discover the reasons why the rate of integration of EHRs and CDS is not at an optimum level, and particularly to discover why CDS developers are not keen to adopt various clinical standards.
  • EMeH: Extensible Mobile Platform for Healthcare
    Jacek Kobusiński, Maciej Małecki, Krzysztof Stefaniak, Tomasz Biały, pages 355–361. Keywords: mobile, healthcare, hospital information system.
    Abstract. The rapid development of mobile technology and the growing number of users open new possibilities for using mobile devices in healthcare. Despite this, the resources and computing power of these devices are still far below those of desktop computers, so these limitations must be taken into account when designing applications for such devices. On the other hand, the increasing functionality of mobile devices makes them a real alternative to the traditional PC in certain areas. Nowadays, mobility has become one of the most demanded features among information system users: they need to be connected and to have access to data at all times, at any location. This is also true in the hospital environment, where medical staff must collect, update and retrieve various types of data. The Extensible Mobile Platform for Healthcare is a solution that makes it possible to create efficient distributed applications based on the SOA paradigm. It offers a universal environment that allows the integration of PDAs, smartphones and tablets, as well as specialized barcode scanners and RFID readers, with an existing hospital information system. Its layered and flexible design, reflecting real usage scenarios, efficient and economic resource usage, and consistent, dynamically generated user interface make the Extensible Mobile Platform for Healthcare an interesting extension to a traditional hospital information system.
  • Semantic Interoperability for Infectious Disease Reporting System
    Murugavell Pandiyan, Osama Elhassan, Zakaria Maamar, Pallikonda Rajasekar, pages 363–367. Keywords: Semantic Interoperability, Service-oriented architecture, HL7, Ontology, OWL.
    Abstract. This research work describes the Semantic Interoperability for Infectious Disease Reporting System. The importance of infectious disease informatics and the challenges of integrating heterogeneous systems motivated us to architect such a semantically interoperable solution. We have built ontology rules on top of an interoperable web service. Moreover, we have built a data mining repository for providing statistical reports and analysis.

International Workshop on Ubiquitous Home Healthcare

  • The role of a mobile device in a home monitoring healthcare system
    Marcin Bajorek, Jędrzej Nowak, pages 371–374. Keywords: mobile device, healthcare system, patient monitoring.
    Abstract. In the present study, a home monitoring healthcare system for elderly and chronic patients is proposed. The system was developed for three types of users: the patient, the doctor and the guardian. It is adapted for the continuous measurement of biomedical signals, depending on the patient's disease. The system analyzes the collected information and, in case of detecting dangerous events, informs the physician and the guardian. A key role in the system is played by a mobile device, which allows the exchange and visualization of data for the users. The role of a tablet with software supporting physician visits at a patient's home is described in detail. We developed a special protocol for exchanging information between system devices via Bluetooth, and special security features to protect the data exchange were introduced. The software part of the system was built using modern technologies, such as JavaFX for the central unit and Android for the mobile devices.
  • MuSA: a multisensor wearable device for AAL
    251 Wearable Sensors,Ambient Assisted Living,Elderly Care Valentina Bianchi, Ferdinando Grossi, Ilaria De Munari, Paolo Ciampolini, pages 375 – 380. Abstract. In this paper, a novel multi-sensor wearable device, called MuSA, is introduced. MuSA is designed to integrate into the CARDEA ambient-assisted-living framework: on the one hand, MuSA provides CARDEA with useful ambient-intelligence features, such as localization and identification; on the other hand, it may borrow many infrastructural and communication components from the environmental control system, resulting in a less expensive implementation. MuSA exploits on-board sensors and signal processing units for fall detection and for heartbeat and breathing rate detection. At this level too, sharing part of the circuitry enables power and cost savings. The ubiquitous computing paradigm is followed, with all of the signal processing and decision processes carried out at the wearable node: this makes communication toward the supervision levels much less demanding and independent of the actual physical features of the sensors themselves. Tests have been carried out, confirming that the low-cost approach still allows for adequate quality of responses. Field tests are starting, to evaluate psychological and ergonomic aspects as well.
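    To make the on-node fall-detection idea concrete, a minimal sketch (not the authors' on-board algorithm) of a threshold rule on the acceleration magnitude; all thresholds and the function name detect_fall are illustrative assumptions:

      # Minimal fall-detection sketch on tri-axial accelerometer samples.
      # Thresholds are illustrative assumptions, not MuSA's actual parameters.
      import numpy as np

      def detect_fall(acc, fs=50.0, impact_g=2.5, still_g=0.3, still_window_s=1.0):
          """acc: (N, 3) array of accelerations in units of g, sampled at fs Hz."""
          mag = np.linalg.norm(acc, axis=1)          # total acceleration magnitude
          impacts = np.flatnonzero(mag > impact_g)   # candidate impact samples
          w = int(still_window_s * fs)
          for i in impacts:
              post = mag[i + w : i + 2 * w]          # window shortly after impact
              # Flag a fall if a strong impact is followed by near-stillness
              # (magnitude close to the static 1 g level).
              if post.size == w and np.all(np.abs(post - 1.0) < still_g):
                  return True, i / fs                # fall detected at time i/fs
          return False, None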
  • Intelligent bathroom
    210 Bath monitoring,intelligent sensors,intelligent home,bioimpedance Adam Bujnowski, Adam Wtorek, Arkadiusz Paliński, pages 381 – 386. Abstract. A monitoring system for detecting events in the bathroom is described in the paper. It consists of detectors for humidity, air and water temperature, spilled water, and bathtub state. The bathtub state detector (BSD) allows the water level and temperature to be controlled and detects basic vital parameters of the person taking a bath. An essential part of the system is a probe allowing detection of a human presence in the bath and of his/her activity. The system can distinguish between three cases: 1. a bathtub filled only with water, 2. a bathtub filled with water and occupied by a person, and 3. an empty bath. In case 2, the system evaluates the person's activity and his/her ECG.
  • Low-coherence method of hematocrit measurement
    87 low-coherence interferometer,optical measurement,hematocrit measurement Małgorzata Jędrzejewska-Szczerska, Marcin Gnyba, Michał Kruczkowski, pages 387 – 391. Abstract. During the last thirty years, low-coherence measurement methods have gained popularity because of their unique advantages. Low-coherence interferometry, low-coherence reflectometry and low-coherence optical tomography offer resolution and dynamic range of measurement on a par with classical optical techniques. Moreover, they enable measurement of the absolute value of optical path differences, which is still an unsolved problem in high-coherence interferometry. Furthermore, the use of spectral signal processing makes the method immune to any change of the optical system transmission. In this article, a low-coherence method of hematocrit measurement is presented. The elaborated measurement method has many advantages: a relatively simple configuration, potentially low cost and high resolution. Investigation of this method confirms its ability to determine the hematocrit value with appropriate measurement accuracy. Furthermore, the results of experimental work have shown that fiber-optic low-coherence interferometry can in the future become an effective basis for a method of in-vivo hematocrit measurement.
  • Multimodal platform for continuous monitoring of elderly and disabled at home
    96 elderly monitoring,Integration of sensors,Ubiquitous services Mariusz Kaczmarek, Jacek Ruminski, Adam Bujnowski, pages 393 – 400. Abstract. Health monitoring at home can be an important element of a care and support environment for older people. The diversity of diseases and the different needs of users require a universal design of the home platform. We present our work on a sensor-based multimodal platform that is trained to recognize the activities of an elderly person in their home. Two specific problems were investigated: the configuration and functionality of a central workstation as a module for data acquisition and analysis, and the monitoring of the user's home environment.
  • Design of a wearable sensor network for home monitoring of human behavior
    137 telemedicine,home-care,cardiology,body surface network Eliasz Kańtoch, Joanna Jaworek, Piotr Augustyniak, pages 401 – 403. Abstract. This paper presents the design and development of a wearable ubiquitous healthcare monitoring system using an integrated electrocardiogram (ECG), an accelerometer sensor and a mobile device in a Bluetooth-based body surface network (BSN). Physiological signals are transmitted to a local healthcare provider using wireless technology. The system was developed to promote mobility and flexibility for patients as well as for medical personnel, which will further improve both the quality of health care and the lifestyle of the patient.
  • Measuring Pulse Rate with a Webcam—a Non-contact Method for Evaluating Cardiac Activity
    188 pulse rate,PCA,remote sensing,noncontact Magdalena Lewandowska, Jacek Ruminski, Tomasz Kocejko, Jędrzej Nowak, pages 405 – 410. Abstract. In this paper, a simple and robust non-contact method of measuring the pulse rate is presented. The elaborated algorithm allows efficient pulse rate registration directly from face images captured with a webcam. The desired signal is obtained by proper channel selection and principal component analysis. The proposed technique may prove valuable for monitoring a person at home once adequate enhancements are introduced.
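    A minimal sketch of the general channel-selection-plus-PCA idea the abstract describes, estimating the dominant frequency of the first principal component of per-frame mean RGB values; the frame rate, band limits and function name are illustrative assumptions, not the authors' settings:

      # Sketch of pulse-rate estimation from per-frame mean RGB values of a
      # face region; all parameters are illustrative.
      import numpy as np

      def pulse_rate_bpm(rgb_means, fps=30.0, band=(0.75, 3.0)):
          """rgb_means: (N, 3) per-frame mean R, G, B of the face region."""
          x = rgb_means - rgb_means.mean(axis=0)       # detrend each channel
          # PCA via SVD: rows of vt are principal directions in RGB space.
          _, _, vt = np.linalg.svd(x, full_matrices=False)
          comp = x @ vt[0]                              # first principal component
          spec = np.abs(np.fft.rfft(comp)) ** 2
          freqs = np.fft.rfftfreq(len(comp), d=1.0 / fps)
          ok = (freqs >= band[0]) & (freqs <= band[1])  # plausible heart-rate band
          return 60.0 * freqs[ok][np.argmax(spec[ok])]  # dominant frequency in bpm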
  • Pulse pressure velocity measurement—a wearable sensor
    183 Pressure pulse velocity,impedance technique,body sensors Mateusz Moderhak, Mariusz Moderhak, Jerzy Wtorek, Bart Truyen, pages 411 – 416. Abstract. Pulse pressure velocity (PPV) measurements may be a source of useful information on the state of the arteries. The 2007 European Society of Hypertension guidelines recommend measuring arterial stiffness in patients with arterial hypertension; it may be assessed by measuring PPV. Mechanical changes in the cardiovascular tree induced by blood ejection have been measured at the thorax and on the wrist using the impedance technique together with a one-channel electrocardiographic signal. Performing impedance measurements on the wrist is a very demanding task because of the very low conductivity changes and the relatively high value of the basal impedance, especially when the dimensions of the measuring probe have to be adequate for a whole-day wearable sensor. However, it has been shown that such measurements are possible and, moreover, that they may deliver very useful information on pulse pressure velocity when compared to the classical approach based on the PPV delay relative to the ECG signal. It has also been shown that a considerable discrepancy may arise between results obtained using the classical approach and the proposed one.
  • Analysis of Correlation between Heart Rate and Blood Pressure
    99 heart rate,blood pressure,correlation coefficient Artur Polinski, Jacek Kot, Anna Meresta, pages 417 – 420. Abstract. The paper presents a correlation analysis between heart rate (HR) and blood pressure (BP). The real data were obtained from three female subjects and one male subject. The systolic and diastolic blood pressure was measured invasively in the radial artery. The correlation coefficient indicates only linear dependence, so the inverse of HR was also taken into account. Since the measurements can be corrupted by noise, moving-average filtering and trend analysis were applied to all data. The results of the correlation analysis of the filtered data were similar to those obtained for the raw data. The correlation coefficient between HR and BP (systolic and diastolic) observed over the whole available data seems random. However, the short-term correlation is relatively large (about 0.5), yet rather unpredictable, since even the sign of the correlation coefficient changes.
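    The short-term analysis can be illustrated with a windowed Pearson correlation; this sketch (window length is an illustrative assumption) shows how per-window coefficients, including their sign, can be tracked:

      # Sketch of short-term (windowed) correlation between HR and BP series.
      import numpy as np

      def windowed_corr(hr, bp, win=60):
          """Pearson correlation of hr and bp over consecutive windows of `win` samples."""
          r = []
          for start in range(0, len(hr) - win + 1, win):
              h, b = hr[start:start + win], bp[start:start + win]
              r.append(np.corrcoef(h, b)[0, 1])
          return np.array(r)

      # The per-window coefficients may flip sign between windows, matching
      # the instability reported in the abstract.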

Computer Aspects of Numerical Algorithms

  • The incomplete factorization preconditioners applied to the GMRES(m) method for solving Markov chains
    64 Preconditioning,GMRES,Incomplete factorization,Markov chains Beata Bylina, Jarosław Bylina, pages 423 – 430. Abstract. The authors consider the impact of the structure of the matrix on the convergence behavior of the GMRES(m) projection method and the preconditioned GMRES(m) method for solving large sparse linear equation systems arising from Markov chain modeling. Studying experimental results, the authors investigate the number of steps and the rate of convergence of the GMRES(m) method and of the incomplete factorization preconditioner for the GMRES(m) method. The motivation is to better understand the convergence characteristics of Krylov subspace methods and the relationship between the Markov model, the nonzero structure of the coefficient matrix associated with this model, the structure of the incomplete preconditioner, and the convergence of the preconditioned GMRES(m) method.
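    For readers who want to experiment with the technique, a minimal SciPy sketch of ILU-preconditioned restarted GMRES on a generic sparse system; the test matrix is a random diagonally dominant placeholder, not a Markov-chain matrix:

      # Incomplete-LU preconditioned GMRES(m) with SciPy (illustrative only).
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n = 1000
      A = sp.random(n, n, density=0.01, format="csc") + sp.identity(n, format="csc") * n
      b = np.ones(n)

      ilu = spla.spilu(A)                                # incomplete LU factorization
      M = spla.LinearOperator((n, n), matvec=ilu.solve)  # preconditioner M ~ A^-1

      x, info = spla.gmres(A, b, M=M, restart=30)        # GMRES(m) with m = 30
      print("converged" if info == 0 else f"info={info}")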
  • Cache-Aware Matrix Multiplication on Multicore Systems for IPM-based LP Solvers
    223 matrix multiplication,cache optimization,interior point methods,multicore systems Mujahed Eleyat, Lasse Natvig, Jørn Amundsen, pages 431 – 438. Abstract. We profile GLPK, an open source linear programming solver, and show empirically that the form of matrix multiplication used in interior point methods takes a significant portion of the total execution time when solving some of the Netlib and other LP data sets. Then, we discuss the drawbacks of the matrix multiplication algorithm used in GLPK in terms of cache utilization and use blocking to develop two cache-aware implementations. We apply OpenMP to develop parallel implementations with load balancing. The best implementation achieved a median speedup of 21.9 when executed on a 12-core AMD Opteron.
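    The blocking idea can be sketched independently of GLPK: each tile of the operands is reused while it is cache-resident. A minimal illustration (the tile size is an arbitrary assumption):

      # Cache-aware blocked matrix multiplication sketch: operands are
      # processed in bs x bs tiles so each tile stays cache-resident.
      import numpy as np

      def blocked_matmul(A, B, bs=64):
          n, k = A.shape
          k2, m = B.shape
          assert k == k2
          C = np.zeros((n, m))
          for i in range(0, n, bs):
              for j in range(0, m, bs):
                  for p in range(0, k, bs):
                      # Multiply one pair of tiles, accumulate into C's tile.
                      C[i:i+bs, j:j+bs] += A[i:i+bs, p:p+bs] @ B[p:p+bs, j:j+bs]
          return C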
  • Object Oriented Model of Generalized Matrix Multiplication
    226 Large scale computing,generalized matrix multiplication,object oriented numerical computing,algebraic path problem Maria Ganzha, Stanislav Sedukhin, Marcin Paprzycki, pages 439 – 442. Abstract. The paper presents an object model of generalized matrix multiplication that can be used in the efficient implementation of a large class of matrix problems, including standard numerical algorithms as well as algebraic path problems.
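    A minimal sketch of the generalization the abstract refers to: substituting the (min, +) semiring for (+, *) turns repeated matrix squaring into an all-pairs shortest path solver, one instance of the algebraic path problem (the example weights are illustrative):

      # Generalized matrix "multiplication" over the (min, +) semiring.
      import numpy as np

      def minplus_matmul(A, B):
          # C[i, j] = min_k (A[i, k] + B[k, j])
          return np.min(A[:, :, None] + B[None, :, :], axis=1)

      INF = np.inf
      W = np.array([[0, 3, INF],
                    [INF, 0, 1],
                    [2, INF, 0]], dtype=float)  # edge weights, INF = no edge
      D = W
      for _ in range(2):          # repeated squaring; 2 squarings suffice for n = 3
          D = minplus_matmul(D, D)
      print(D)                    # all-pairs shortest path distances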
  • Parallel alternating directions algorithm for 3D Stokes equation
    127 Navier-Stokes,time splitting,ADI,incompressible flows,pressure Poisson equation,parallel algorithm Ivan Lirkov, Marcin Paprzycki, Maria Ganzha, Paweł Gepner, pages 443 – 450. Abstract. We consider the 3D time-dependent Stokes equation on a finite time interval and on a uniform rectangular mesh, written in terms of velocity and pressure. For this problem, a parallel algorithm based on a new direction-splitting approach is proposed. Here, the pressure equation is derived from a perturbed form of the continuity equation, in which the incompressibility constraint is penalized in a negative norm induced by the direction splitting. The scheme used in the algorithm is composed of: (a) pressure prediction, (b) velocity update, (c) penalty step, and (d) pressure correction. In order to achieve good parallel performance, the solution of the Poisson problem for the pressure correction is replaced by solving a sequence of one-dimensional second-order elliptic boundary value problems in each spatial direction. The efficiency and scalability of the proposed approach are tested on two distinct parallel computers and the experimental results are analyzed.
  • GPGPU calculations of gas thermodynamic quantities
    206 gas state equation,AGA8,gas thermodynamic quantities,CFD,GPGPU,CUDA,FERMI,NVIDIA Tesla Igor Mračka, Peter Somora, Tibor Žáčik, pages 451 – 458. Abstract. NVIDIA Tesla GPU computational processors, based on the new Fermi generation of the CUDA architecture, are intended to perform massively parallel calculations in various areas of scientific and technical research, including fluid dynamics modeling and, in particular, the simulation of real gas flow. In this paper we show that a significant acceleration of simulation calculations can be achieved, even without parallelizing the solution of the differential equations involved, by parallel pre-calculation of thermodynamic quantities using GPGPU.
  • The influence of a matrix condition number on iterative methods' convergence
    65 linear systems,condition number,iterative methods Anna Pyzara, Beata Bylina, Jarosław Bylina, pages 459 – 464. Abstract. This paper investigates the condition number of the matrices that appear when solving linear systems of equations. We consider iterative methods for solving such systems, namely the Jacobi and Gauss-Seidel methods, and examine the influence of the condition number on their convergence. We experimentally study the relation between the condition number and the size of the matrix, and we analyze the number of iterations. We examine random matrices, Hilbert matrices, and strictly (or irreducibly) diagonally dominant matrices.
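    A minimal sketch of the kind of experiment the abstract describes: run Jacobi iteration on a diagonally dominant system and report the condition number next to the iteration count (size and tolerance are illustrative assumptions):

      # Jacobi iteration alongside the condition number of the system matrix.
      import numpy as np

      def jacobi(A, b, tol=1e-8, maxiter=10_000):
          D = np.diag(A)                     # diagonal entries
          R = A - np.diagflat(D)             # off-diagonal part
          x = np.zeros_like(b)
          for it in range(maxiter):
              x_new = (b - R @ x) / D
              if np.linalg.norm(x_new - x, np.inf) < tol:
                  return x_new, it + 1
              x = x_new
          return x, maxiter

      n = 100
      A = np.random.rand(n, n) + n * np.eye(n)   # strictly diagonally dominant
      b = np.random.rand(n)
      x, iters = jacobi(A, b)
      print(f"cond(A) = {np.linalg.cond(A):.1f}, iterations = {iters}")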
  • Solving Linear Recurrences on Hybrid GPU Accelerated Manycore Systems
    148 linear recursive filters,divide and conquer,multiple GPU systems Przemyslaw Stpiczynski, pages 465 – 470. Abstract. The aim of this paper is to show that linear recurrence systems with constant coefficients can be efficiently solved on hybrid GPU-accelerated manycore systems with modern Fermi GPU cards. The main idea is to use the recently developed divide-and-conquer algorithm, which can be expressed in terms of Level 2 and Level 3 BLAS operations. The results of experiments performed on a hybrid system with an Intel Core i7 and an NVIDIA Tesla C2050 are also presented and discussed.
  • A multipoint shooting feasible-SQP method for optimal control of state-constrained parabolic DAE systems
    149 multipoint shooting,feasible-SQP method,heat transfer problem Krystyn Styczeń, Wojciech Rafajłowicz, pages 471 – 476. Abstract. The optimal control problem for parabolic differential-algebraic equation (PDAE) systems with spatially sensitive state constraints and technological constraints is considered. A multipoint shooting approach is proposed to attack such problems. It is well suited to dealing with unstable and ill-conditioned PDAE systems. The approach consists in partitioning the time-space domain into shorter layers, which allows us to fully parallelize the computations and to employ reliable PDAE solvers. A new modified method of this kind is developed. It converts the multipoint shooting problem, which has mixed equality and inequality constraints, into a purely inequality-constrained problem. The results of the consecutive layer shots are exploited to determine a feasible shooting solution of the converted problem. The knowledge of such a solution is crucial for the use of highly efficient feasible-SQP methods, which avoid the incompatibility of the constraints of the QP subproblems (in contrast to infeasible-path SQP methods). Applications of the proposed method to the optimization of heat transfer processes as well as chemical production processes performed in tubular reactors are discussed.
  • A modified multipoint shooting feasible-SQP method for optimal control of DAE systems
    147 DAE systems,optimal control,multipoint shooting,feasible-SQP method,regularized solution Krystyn Styczeń, Pawel Drąg, pages 477 – 484. Abstract. The optimal control problem for state-constrained differential-algebraic (DAE) systems is considered. Such problems can be attacked by the multiple shooting approach, which is well suited to unstable and ill-conditioned dynamic systems. According to this approach, the control interval is partitioned into shorter intervals, allowing the parallelization of computations with reliable use of DAE solvers. A new modified method of this kind is proposed, which converts the partitioned problem with mixed equality and inequality constraints into a purely inequality-constrained problem. An algorithm for obtaining a feasible initial solution of the converted problem is described. A feasible-SQP algorithm based on an active-set strategy is applied to the converted problem. It avoids the inconsistency of the constraints of the QP subproblems (in contrast to infeasible-path SQP methods) and delivers a locally optimal solution of the basic problem preserving all its constraints (including the equality ones), which is of high practical importance. Some further developments concerning the regularization of suboptimal solutions for large-scale DAE optimal control problems and multilevel versions of the proposed method are also discussed. The theoretical considerations are illustrated by a numerical example of the optimization of a complex DAE chemical engineering system.
  • On the implementation of stream ciphers based on a new family of algebraic graphs
    160 private key,stream cipher,discrete logarithm problem,algebraic graphs Vasyl Ustimenko, Stanislaw Kotorowicz, Urszula Romanczuk, pages 485 – 490. Abstract. Families of edge-transitive algebraic graphs defined over finite commutative rings have been used for the development of stream ciphers, public keys and key exchange protocols. We present the results of the first implementation of a private key algorithm based on a family of algebraic graphs that are not edge-transitive. The absence of an edge-transitive group of symmetries means that the algorithm cannot be described in group-theoretical terms; we hope that this complicates cryptanalysis of the algorithm. We discuss the connections between the security of the algorithm and the discrete logarithm problem (in the case of a periodic password). The plainspace of the algorithm is $K^n$, where $K$ is a chosen commutative ring. Graph-theoretical encryption corresponds to a walk on the bipartite graph whose partition sets are isomorphic to $K^n$. We conjugate the chosen graph-based encryption map, which is a composition of several elementary cubical polynomial automorphisms of a free module $K^n$, with a special invertible affine transformation of $K^n$. Finally, we compute numerically the corresponding private map $g$ of $K^n$ onto $K^n$. We evaluate the order of $g$ and compare the results with the cases of other graph-based stream ciphers.
  • Implementation of Movie-based Matrix Algorithms on OpenMP Platform
    146 Visual Programming,Movie-based programming,Matrix Computing,Parallel Programming,OpenMP Platform Dmitry Vazhenin, Alexander Vazhenin, pages 491 – 494. Abstract. Convenience and programmer productivity are the main points of visual programming systems and languages. On the other hand, parallel programming is mainly focused on reaching high performance by optimizing the executable code. Movie-based programming is based not only on the introduction of special symbols and images with semantic support, but also on series of images that can present the dynamic features of algorithms, so that the system can automatically generate rather effective executable sequential C code. This paper describes a technique for the OpenMP parallelization of movie-based algorithms in order to keep program performance at a suitable level. The results of numerical experiments are also presented, showing the applicability of the proposed technique, including implementation, code validity checking and performance testing.

3rd International Symposium on Services Science

  • Learning to Innovate in Distributed Mobile Application Development: Learning Episodes from Tehran and London
    246 Distributed Mobile Application Development,Learning to Innovate,Knowledge Broker,Project-Enhanced Learning Episodes,Distributed Scrums,Tehran, Iran Neek Alyani, Sara Shirzad, pages 497 – 504. Abstract. This paper reports on the activities of an entrepreneurial small software firm operating in telecoms value-added services, based in Tehran, Iran, with project partners in London, UK. Mobile and smart phone applications are altering our professional and social interactions through innovative business models, glocal content and eco-systems, fusing the multifaceted aspects of mobile software development. In the context of rapidly changing catching-up economies, the development of mobile applications by entrepreneurial NTBFs, initially imitating as a way to innovate, requires distributed up-skilling, rapid problem-solving and pragmatic learning. Specifically, we focus on knowledge brokerage and sourcing activities in distributed Scrums. Drawing on a longitudinal analysis of projects [2004-2010], an iterative ‘learning to innovate’ model, entitled DEAL (Design, Execute, Adjust, Learn), within 'project-enhanced learning episodes' is constructed and outlined, utilizing knowledge brokers and boundary sources in enterprise challenges. We conclude by reflecting on distributed learning and skills in practice.
  • Configuring services regarding service environment and productivity indicators
    30 service productivity,customer specific configuration,constraints on service selection Michael Becker, Stephan Klingner, Martin Bottcher, pages 505 – 512. Abstract. In the course of the extensive changes in the service sector, methods and tools for modelling services, managing service portfolios and optimising service offers are required. This paper proposes an extension of a basic metamodel, described in various previous publications, that makes it possible to describe non-functional properties, global variables and the definition of configuration constraints.
  • Service quality description—a business perspective
    32 service quality,business/service alignment,service design,construction domain Marija Bjekovic, Sylvain Kubicki, pages 513 – 520. Abstract. Business fields characterized by collective activities are numerous and require well-adapted software-based services to improve the efficiency of business collaborations. The design of services supporting the activities in such domains is usually ad hoc and relies on the know-how of the various actors involved. Based on our experience of designing innovative services for Architecture, Engineering and Construction projects, we proposed a service design methodology that involves business actors, service and technical experts and is intrinsically collective. This article focuses on integrating non-functional (i.e. service quality) aspects of services into such an approach. The alignment of services with business practices should be tackled not only from a functional but also from a non-functional perspective, so that business-level service quality requirements are clearly understood and taken care of, and so that business practitioners get a clear view of all the characteristics of the designed service. While the concepts referring to technical service quality are well known and widely used by service experts, which concepts define service quality in business terms in a specific business context remains an issue to be addressed jointly by domain practitioners and service experts. In this article we propose an initial service quality model from the business perspective, aimed at qualifying services for construction projects.
  • Towards an Interdisciplinary View on Service Science—The Case of the Financial Services Industry
    151 SOA,Design principles,Banking,Business-IT Alignment,Service Science,Inter-disciplinary,Financial services Michael Fischbach, Thomas Puschmann, Rainer Alt, pages 521 – 527. Abstract. In the last decade, service science has received considerable attention in the research community. Most research regards services either from a business or from a technical perspective. This paper argues that existing approaches still lack detailed models for applying the inter-disciplinary nature of service science, as well as an application of these concepts in practice. The paper describes a first attempt to apply the characteristics of service-oriented architectures from the information systems discipline to the business domain. It depicts autonomy and modularity, interoperability and interface orientation as major design principles that promise potential when transferred to the business domain. The proposed inter-disciplinary approach was applied in the case of Zürcher Kantonalbank in Switzerland, which realized a company-wide service management concept according to the presented design principles.
  • Services Composition Model for Home-Automation peer-to-peer Pervasive Computing
    184 Services Composition,Service-Oriented Architecture,peer-to-peer,Pervasive Computing,Home-Automation Juan A. Holgado-Terriza, Sandra Rodríguez-Valenzuela, pages 529 – 536. Abstract. Collaborative mechanisms between services are a crucial aspect of the recent development of pervasive computing systems based on the service-oriented architecture paradigm. Current trends in the development of services computing take into account new high-level interaction models founded on service composition. Such services combine their functionalities with the objective of creating smart spaces in which services with different purposes can collaborate to offer new and more complex functionalities to the user transparently. This leads to the creation of collaborative spaces with value-added services derived from the composition of existing ones. However, there are many aspects to consider when deploying this type of system in pervasive spaces, where the extensive use of embedded devices with limited mobility, computing resources and memory is a major handicap. This paper describes a model of service composition based on a directed acyclic graph, used in a services middleware for home automation, in which we work with loosely coupled service-oriented systems over the peer-to-peer technology JXTA. The presented composition model guarantees the acyclicity of the composition map between services and favours the building of collaborative lightweight services using peers as proactive entities, which can be executed on embedded devices. These peers are capable of establishing dynamic intercommunication, synchronizing with one another and forming coalitions to cooperate for a common purpose.
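    The acyclicity guarantee can be illustrated with a simple check that rejects any composition edge that would close a cycle; the service names are hypothetical and this is not the middleware's actual code:

      # Keep a service-composition graph acyclic: refuse an edge src -> dst
      # if dst can already reach src.
      def creates_cycle(edges, src, dst):
          """Would adding src -> dst close a cycle? DFS from dst looking for src."""
          graph = {}
          for a, b in edges:
              graph.setdefault(a, []).append(b)
          stack, seen = [dst], set()
          while stack:
              node = stack.pop()
              if node == src:
                  return True
              if node not in seen:
                  seen.add(node)
                  stack.extend(graph.get(node, []))
          return False

      edges = [("presence", "lighting"), ("lighting", "dimmer")]
      print(creates_cycle(edges, "presence", "dimmer"))  # False: safe to compose
      print(creates_cycle(edges, "dimmer", "presence"))  # True: would form a cycle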
  • Violation of Service Availability Targets in Service Level Agreements
    122 Service Level Agreement,Availability,Performance Analysis,Risk Maurizio Naldi, Loretta Mastroeni, pages 537 – 540. Abstract. Targets on availability are generally included in any Service Level Agreement (SLA). When those targets are not met, the service provider has to compensate the customer. The obligation to compensate may represent a significant risk for the service provider if the SLA is repeatedly violated. In this paper we evaluate the probability that an SLA commitment on service availability is violated. For a two-state model, in which the service alternates between availability and unavailability periods, we show that this probability is mostly a function of the first two moments of the service restoration times, and that it decreases as the variance of the restoration time grows. Lengthening the time interval over which the service availability is evaluated reduces the risk for the service provider only if the compensation grows considerably less than the length of that time interval.
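    A Monte Carlo sketch of the two-state model: exponential up-times alternate with lognormal restoration times, and the fraction of observation windows whose measured availability falls below the target estimates the violation probability (all distribution parameters are illustrative assumptions, not the paper's analytical model):

      # Estimate P(measured availability < SLA target) over a window of T hours.
      import numpy as np

      rng = np.random.default_rng(0)

      def violation_prob(T=30 * 24.0, target=0.999, mtbf=500.0,
                         mu=-1.0, sigma=1.0, runs=10_000):
          violations = 0
          for _ in range(runs):
              t = up = 0.0
              while t < T:
                  u = rng.exponential(mtbf)          # time to next failure
                  r = rng.lognormal(mu, sigma)       # restoration time
                  up += min(u, T - t)                # up-time within the window
                  t += u + r
              if up / T < target:
                  violations += 1
          return violations / runs

      print(violation_prob())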
  • Orchestration of Service Design and Service Transition
    29 Service Design,Service Transition,Service Management,ITIL,ISO 20000,Service Orchestration Bernd Pfitzinger, Thomas Jestadt, pages 541 – 544. Abstract. Standardized service management processes and organizations allow changes to service offerings to be implemented as part of an integrated, ISO 20000 certified service management system. Two different models for the process-based orchestration of changes to services are presented, addressing the Service Design and Service Transition phases of ITIL V3. The models are evaluated in a real-life scenario and discussed in the context of a medium-sized company.
  • Service Innovation Capability: Proposing a New Framework
    108 service innovation,new service development,dynamic capabilities,service innovation capability Jens Pöppelbuß, Ralf Plattfaut, Kevin Ortbach, Andrea Malsbender, Matthias Voigt, Björn Niehaves, Jörg Becker, pages 545 – 551. Abstract. Service organizations face the challenge of offering their customers continuously improved or completely new services and, hence, require service innovations to sustain themselves in the market. We interpret the design and implementation of new or enhanced service offerings as a dynamic capability, because the service organization is required to sense impulses for innovation, seize meaningful ways to change, and finally transform its operational capabilities to the desired state. Accordingly, we propose a new framework that structures service innovation capability into the areas of sensing, seizing, and transformation. We further identify and describe the key activities in each of these three areas based on an analysis of the existing literature. With this conceptual paper, we contribute to a better understanding of service innovation capability by proposing a novel framework grounded in dynamic capability theory. This framework is beneficial to both practice and academia. It offers an overview of service innovation capability areas and activities against which service organizations can critically reflect on their service innovation initiatives. For academia, it suggests promising directions for future research.
  • A Framework for Comparing Cloud-Environments
    31 Cloud-Computing,Cloud-Environments,Meta-Services Rainer Schmidt, pages 553 – 556. Abstract. Cloud services are more and more embedded in so-called cloud environments. Cloud environments provide resources for the cloud services and offer management interactions to configure services and resources to individual requirements. Enterprises selecting a cloud environment therefore have to consider not only the functionality of the cloud services, but also the management interactions offered by the cloud environment. To this end, a framework for the comparison of cloud environments is introduced and applied to two environments.

Workshop on Agent Based Computing: from Model to Implementation—VIII

  • A methodology for developing component-based agent systems focusing on component quality
    248 Formal methods,Component-based systems,Software quality George Eleftherakis, Petros Kefalas, Evangelos Kehris, pages 561 – 568. Abstract. The formal development of component-based distributed agent systems with inherently high complexity is not a trivial task, especially if the formal method used is not accompanied by an appropriate methodology. X-machines are a formal method that resembles finite state machines but has two important extensions, namely an internal memory structure and functions. In this paper, we present a disciplined methodology for developing component-based systems using communicating X-machine components. In practice, the development of a communicating system model can be based on a number of well-defined distinct steps, i.e. the development of types of X-machine models, of components as instances of those types, and of the communication between components, together with testing and model checking each of these components individually. A set of appropriate tools is employed at each of the steps. The proposed methodology therefore utilises a priori techniques to avoid flaws in the early stages of development, together with a posteriori techniques to discover any remaining flaws in later stages. This way it makes the best use of the development effort to achieve the highest confidence in the quality of the developed components. We use this methodology for modelling naturally distributed systems, such as multi-agent systems and factory simulations. We borrow an example from the latter in order to demonstrate the methodology and explain in detail how each activity is carried out. We briefly present the theory behind communicating X-machine components and then describe the practical issues in detail, using the same example throughout.
  • Monitoring Building Indoors through Clustered Embedded Agents
    193 Multi-agent systems,Wireless Sensor Networks,Building Networks,MAPS Giancarlo Fortino, Antonio Guerrieri, pages 569 – 576. Abstract. Future buildings will be smart, supporting personalized comfort and building energy efficiency as well as safety, emergency, and context-aware information exchange scenarios. In this work we propose a decentralized and embedded architecture based on agents and wireless sensor and actuator networks (WSANs) for enabling efficient and effective management of buildings. The main purpose of the agent-based architecture is to efficiently support distributed and coordinated sensing and actuation operations. The building management architecture is implemented in MAPS (Mobile Agent Platform for Sun SPOTs), an agent-based framework for programming WSN applications based on the Sun SPOT sensor platform. The proposed architecture is demonstrated in a simple yet effective operating scenario related to monitoring workstation usage in computer laboratories. The high modularity of the proposed architecture allows for easy adaptation of higher-level application-specific agents, which can therefore exploit the architecture to implement intelligent building management policies.
  • Multiagent Distributed Grid Scheduler
    79 Grid,distributed jobs scheduling,Grid architecture for parallel computations,multiagent systems Victor Korneev, Dmitry Semenov, Andrey Kiselev, Boris Shabanov, Pavel Telegin, pages 577 – 580. Abstract. An approach to resource scheduling based on a multiagent model with a distributed queue is discussed. Algorithms for the functioning of agents performing distributed Grid scheduling are presented.
  • Tuning Computer Gaming Agents using Q-Learning
    117 Computer Game Bots,Q-learning,agents,game AI,reinforcement learning Purvag Patel, Norman Carver, Shahram Rahimi, pages 581 – 588. Abstract. The aim of the intelligent techniques, termed game AI, used in computer video games is to provide interesting and challenging game play to the player. Being highly sophisticated, these games present game developers with requirements and challenges similar to those faced by the academic AI community. Game companies claim to use sophisticated game AI to model artificial characters such as computer game bots, intelligent realistic AI agents. However, these bots work via simple routines pre-programmed to suit the game map, game rules, game type, and other parameters unique to each game. Mostly, illusively intelligent behaviors are programmed using simple conditional statements and are hard-coded into the bots' logic. Moreover, a game programmer has to spend considerable time configuring crisp inputs for these conditional statements. We therefore see a need for machine learning techniques that dynamically improve bots' behavior and save precious programmer man-hours. We selected Q-learning, a reinforcement learning technique, to evolve dynamic intelligent bots, as it is a simple, efficient, and online learning algorithm. Machine learning techniques such as reinforcement learning are known to be intractable if they use a detailed model of the world, and they also require tuning of various parameters to give satisfactory performance. Therefore, this paper examines Q-learning for evolving a few basic behaviors, viz. learning to fight and planting the bomb, for computer game bots. Furthermore, we experimented with how bots can use knowledge learned from abstract models to evolve their behavior in a more detailed model of the world.
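    For concreteness, a minimal tabular Q-learning loop of the kind applied to bot behaviors; the action names and learning parameters are illustrative assumptions, not the paper's configuration:

      # Standard tabular Q-learning with epsilon-greedy action selection.
      import random

      ACTIONS = ["attack", "retreat", "plant_bomb"]   # hypothetical bot actions
      alpha, gamma, epsilon = 0.1, 0.9, 0.1
      Q = {}  # (state, action) -> estimated value

      def choose(state):
          if random.random() < epsilon:               # explore
              return random.choice(ACTIONS)
          return max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))  # exploit

      def update(state, action, reward, next_state):
          best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
          old = Q.get((state, action), 0.0)
          # Q-learning rule: Q <- Q + alpha * (r + gamma * max Q' - Q)
          Q[(state, action)] = old + alpha * (reward + gamma * best_next - old)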
  • Developing intelligent bots for the Diplomacy game
    205 Diplomacy,Bot,Multi agent system Sylwia Polberg, Marcin Paprzycki, Maria Ganzha, pages 589 – 596. Abstract. This paper describes the design of an architecture for a bot capable of playing the Diplomacy game, to be used within the dip framework—a testbed for multi-agent negotiations—created at the Spanish Artificial Intelligence Research Institute (IIIA). The proposed SillyNegoBot is an extension of the SillyBot and is designed to be used in the negotiations taking place during the Diplomacy game.
  • Computing Equilibria for Constraint-based Negotiation Games with Interdependent Issues
    161 automated negotiation,interdependent issues,bargaining,multi-agent systems Mihnea Scafes, Costin Badica, pages 597 – 603. Abstract. Negotiation with interdependent issues and nonlinear, non-monotonic utility functions is difficult because it is hard to explore the contract space efficiently. This paper presents a new result in automated negotiation with interdependent issues, complete information and time constraints. We consider that agents express their preferences using constraints defined as one interval per issue, and we represent their constraint sets as intersection graphs. We model negotiation as a bargaining game and show that the equilibrium solution is one of the maximal cliques of the constraint graph. Consequently, we find that the problem of computing the equilibrium solution has polynomial-time complexity when the number of issues is fixed.
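    The equilibrium computation can be illustrated by building the interval intersection graph and enumerating its maximal cliques with networkx; the agents and intervals below are hypothetical:

      # Interval intersection graph and its maximal cliques.
      import networkx as nx

      # One interval per agent for a single issue, e.g. acceptable price ranges.
      intervals = {"a1": (0, 5), "a2": (3, 8), "a3": (4, 10), "a4": (9, 12)}

      G = nx.Graph()
      G.add_nodes_from(intervals)
      names = list(intervals)
      for i, u in enumerate(names):
          for v in names[i + 1:]:
              (lo1, hi1), (lo2, hi2) = intervals[u], intervals[v]
              if max(lo1, lo2) <= min(hi1, hi2):   # intervals overlap
                  G.add_edge(u, v)

      print(list(nx.find_cliques(G)))  # maximal cliques = maximal compatible sets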
  • Agent-Oriented Knowledge Elicitation for Modeling the Winning of “Hearts and Minds”
    195 agent-oriented modeling,agent-based simulation,socio-technical system,conflict resolution Inna Shvartsman, Kuldar Taveter, pages 605 – 608. Abstract. Agent-oriented modeling is a top-down approach for modeling and simulating the behaviors of complex socio-technical systems. This research addresses the application of agent-oriented modeling to eliciting and representing knowledge for social simulations. We provide an overview of agent-oriented modeling and outline a knowledge elicitation and representation process appropriate for developing social simulation systems, including ones capable of simulating the winning of “hearts and minds” in an occupied territory. The case study describes eliciting and representing knowledge for simulating conflict resolution in the context of winning “hearts and minds”. The models created by means of agent-oriented modeling can be implemented on several simulation platforms, such as NetLogo, Jason, and JADE.

5th International Workshop on Multi-Agent Systems and Simulation

  • Multi Agent Simulation for Decision Making in Warehouse Management
    157 MAS,simulation,logistics Massimo Cossentino, Carmelo Lodato, Salvatore Lopes, Patrizia Ribino, pages 611 – 618. Abstract. The paper presents an agent-based simulation as a tool for decision making in the management of automatic warehouses. The proposed multi-agent system is going to be used in a real environment within a project developed with a company working in logistics. In more detail, we have developed a simulation framework in order to study the problems, constraints and performance issues of truck unloading operations. We aim to optimize the number of Automated Guided Vehicles (AGVs) used for unloading containers arriving at the warehouse. This is a critical issue, since an AGV is a costly resource and an increase in their number does not necessarily correspond to improved unloading speed. The experiments performed with our simulated environment also allow us to evaluate the impact of other elements on performance.
  • A Multi-Agent Architecture for Simulating and Managing Microgrids
    123 Agent,Multi-Agent Systems,Simulation,Electric Power Production,Electric Power Transportation Massimo Cossentino, Carmelo Lodato, Salvatore Lopes, Marcello Pucci, Gianpaolo Vitale, Maurizio Cirrincione, pages 619 – 622. Abstract. With the increasing demand for electric power, new theories have been studied by the scientific community. One of the most promising consists in splitting the electric grid into microgrids, each composed of renewable and non-renewable sources and various loads. These microgrids aim to be as autonomous as possible in producing the energy they need. Once produced, energy must be transferred to the loads. This paper proposes a MAS used to simulate the control of the transportation grid. The system is able to react to feeder overloading and failures by redirecting the energy flow and protecting itself.
  • Agent.GUI: A Multi-agent Based Simulation Framework
    71 agent-based simulations,framework,end user applications,agent-environment interaction,JADE Christian Derksen, Cherif Branki, Rainer Unland, pages 623 – 630. Abstract. Multi-agent based simulations (MABS) of real-world scenarios are attracting growing interest. Complex real-world scenarios require deep knowledge and expertise, which can only be provided by specialists in the application area. However, such experts cannot be expected to understand agent-based technology and simulation. Consequently, tools are required that deliver a high-level, easy-to-use interface. In this article we propose a new simulation framework based on the JADE framework. Besides extensions to deal with the time aspect, agent/environment interaction, visualization and load balancing, we also address the usability of the tool for specialists from different domains. For this, our framework, called Agent.GUI, provides an easy-to-use, customizable graphical user interface. Overall, Agent.GUI is a powerful tool for the development of multi-agent based simulations.
  • Minority Game: the Battle of Adaptation, Intelligence, Cooperation and Power
    130 Agent-based simulation,Agent Computational Economics,minority game,zero-sum game,El Farol Bar Akihiro Eguchi, Hung Nguyen, pages 631 – 634. Abstract. The minority game is a simple simulation of a zero-sum game with a structure similar to that of real-world markets, such as a currency exchange market. We discuss a way to implement the minority game and provide a simulation environment with agents that can use various types of strategies to make decisions, such as genetic algorithms, simple statistics, and cooperative strategies. The goal of this simulation study is to find the most effective strategy for winning the zero-sum game.
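    A minimal minority-game round, with each agent acting from a random lookup table over the recent outcome history; population size, memory length and round count are illustrative assumptions:

      # Minimal minority game: N (odd) agents pick side 0 or 1 each round,
      # and the side chosen by the minority wins.
      import itertools
      import random

      N, MEM, ROUNDS = 101, 3, 50
      history = tuple(random.randint(0, 1) for _ in range(MEM))
      # Each agent: one random strategy table mapping history -> action.
      agents = [{h: random.randint(0, 1)
                 for h in itertools.product((0, 1), repeat=MEM)}
                for _ in range(N)]

      wins = [0] * N
      for _ in range(ROUNDS):
          choices = [a[history] for a in agents]
          minority = 0 if sum(choices) > N / 2 else 1    # less-chosen side wins
          for i, c in enumerate(choices):
              if c == minority:
                  wins[i] += 1
          history = history[1:] + (minority,)            # outcome enters history

      print("best agent won", max(wins), "of", ROUNDS, "rounds")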
  • Towards a Generic Testing Framework for Agent-Based Simulation Models
    107 Agent-based Modeling and Simulation,Testing,MAS simulation toolkits and frameworks Onder Gurcan, Oguz Dikenelli, Carole Bernon, pages 635 – 642. Abstract. Agent-based modeling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. Moreover, there is no comprehensive tool set for the verification and validation of agent-based simulation models that demonstrates whether inaccuracies exist and/or reveals the existing errors in the model. We therefore designed and developed a generic testing framework for agent-based simulation models to conduct validation and verification of models. This paper presents our testing framework in detail and demonstrates its effectiveness by showing its applicability in a realistic agent-based simulation case study.
  • Modeling Agent Behavior Through Online Evolutionary and Reinforcement Learning
    242 Multiagent Systems,Multiagent Simulation,Agent Learning,Behavior Modeling Robert Junges, Franziska Klügl, pages 643 – 650. Abstract. The process of creating and validating an agent-based simulation model requires the modeler to undergo a number of prototyping, testing, analyzing and re-designing rounds. The aim is to specify and calibrate the low-level agent behavior that truly produces the intended macro-level phenomena. We assume that this development can be supported by agent learning techniques, especially by generating inspiration about behaviors as starting points for the modeler. In this contribution we address this learning-driven modeling task and compare two methods that produce decision trees: reinforcement learning with a post-processing step for generalization, and genetic programming.
  • Visualizing Agent-Based Simulation Dynamics in a CAVE—Issues and Architectures
    239 agent-based simulation,visualization tools,validation Athanasia Louloudi, Franziska Klügl, pages 651 – 658. Abstract. By displaying an agent-based simulation in an immersive virtual environment called a CAVE (Cave Automatic Virtual Environment), a human expert is enabled to evaluate the dynamics from the same point of view as in real life—from a within perspective instead of a bird's eye view. As this form of face validation is useful for many multiagent simulations, it should be possible to set up such a system with as little effort as possible. In this context, we systematically analyse the critical issues that the realization of such a system raises. Addressing these problems, we finally discuss design aspects of basic framework architectures.
  • SimConnector: An Approach to Testing Disaster-Alerting Systems Using Agent Based Simulation Models
    155 Disaster Alerting System,Agent-based Modeling,Decision Support System,Cognitive Computation Muaz Niazi, Qasim Siddique, Amir Hussain, pages 659 – 665. Abstract. The design, development and testing of intelligent disaster detection and alerting systems pose a set of non-trivial problems. Not only are such systems difficult to design, as they need to accurately predict real-world outcomes using distributed sensing of various parameters, they also need to generate an optimal number of timely alerts when an actual disaster strikes. In this paper, we propose the SimConnector Emulator, a novel approach to testing real-world systems using agent-based simulations as a means of validation. As a proof of concept, we have developed a forest fire disaster detection and alerting system, which uses intelligent decision support based on an internationally recognized fire rating index, namely the Fire Weather Index (FWI). Results of extensive testing show the effectiveness of the SimConnector approach in the development and testing of real-time applications in general, and of disaster detection and alerting systems in particular.
  • A Chemical Inspired Simulation Framework for Pervasive Services Ecosystems
    98 Chemical-inspired simulation,Pervasive systems,Agent-based model Danilo Pianini, Sara Montagna, Mirko Viroli, pages 667 – 674. Abstract. This paper is grounded in the SAPERE project (Self-Aware PERvasive Service Ecosystems), which aims at proposing a multi-agent framework for pervasive computing based on the idea of making each agent (service, device, human) manifest its existence in the ecosystem by a Live Semantic Annotation (LSA), and of coordinating agent activities by a small and fixed set of so-called eco-laws, which are a sort of chemical-like reactions evolving the distributed population of LSAs. System dynamics in SAPERE is complex because of openness and the self-* requirements imposed by the pervasive computing setting: a simulation framework is hence needed for what-if analysis prior to deployment. In this paper we present a prototype simulator which—due to the role of chemical-like dynamics—is based on a variation of an existing SSA (Stochastic Simulation Algorithm), suitably tailored to the specific features of SAPERE, including the dynamicity of the network topology and the pattern-based application of eco-laws. The simulator is tested on a crowd steering scenario in which the navigation of groups is guided, through public or private displays, towards their preferred destinations while emergently circumventing crowded regions.
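    The core of the underlying SSA can be sketched in a few lines: draw the time to the next event from the total propensity and pick which event fires in proportion to the individual propensities (the two-species example is illustrative, not a SAPERE eco-law set):

      # Minimal Gillespie-style stochastic simulation step.
      import numpy as np

      rng = np.random.default_rng(1)

      def ssa(x, rates, stoich, t_end):
          """x: species counts; rates: propensity function; stoich: state changes."""
          t, trace = 0.0, [(0.0, x.copy())]
          while t < t_end:
              a = rates(x)                       # propensity of each reaction
              a0 = a.sum()
              if a0 == 0:
                  break
              t += rng.exponential(1.0 / a0)     # time to next event
              j = rng.choice(len(a), p=a / a0)   # which event fires
              x = x + stoich[j]
              trace.append((t, x.copy()))
          return trace

      # Example: A -> B (rate k1*A) and B -> A (rate k2*B).
      k1, k2 = 0.3, 0.1
      trace = ssa(np.array([100, 0]),
                  lambda x: np.array([k1 * x[0], k2 * x[1]]),
                  np.array([[-1, 1], [1, -1]]), t_end=20.0)
      print(trace[-1])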
  • BioMASS: a Biological Multi-Agent Simulation System
    116 Individual-based modeling,Ecological complexity,Multi-agent systems,Multi-agent simulation,Complex systems Candelaria Sansores, Flavio Reyes, Hector Gomez, Juan Pavon, Luis Calderon, pages 675 – 682. Abstract. This article presents an agent-based model for the simulation of biological systems. The approach consists mainly of providing individual-based models for each of the functional groups that make up an ecosystem. Functional groups (a term commonly used by ecologists) may represent a group of individuals (from the same or from different species) that share relevant attributes. This provides the flexibility to configure different kinds of populations by parametrization, without the need for programming, which is useful for biologists. Additionally, a simulation tool implemented as a multi-agent system facilitates the analysis and understanding of ecological complexity. Multi-agent systems are proposed to address the heterogeneity and autonomy demanded by the interdisciplinary individual-based modeling methodology. The objective of the system is to explore the intricate relationships among populations and individuals in an ecosystem approach. The main difference from other tools is the ability to incorporate individual decisions based on metabolism and environmental conditions.

International Symposium on Multimedia Applications and Processing

  • Robust Digital Watermarking System for Still Images
    20 Digital image watermarking,Blind decoding,Logarithmically-Polar Transform Sergey Anfinogenov, Valery Korzhik, Guillermo Morales-Luna, pages 685 – 689. Abstract. The fast and wide-scale spreading of image data on the Internet creates great opportunities for illegal access by different kinds of intruders. Digital image watermarking can be used successfully to solve the problem of intellectual property protection. We describe a new method of digital watermarking based on embedding local maxima into the Fourier transform domain of the image. Simulation results are presented which confirm that the proposed method is resistant to cyclic shifts, row and column removal, cropping, addition of noise, rotation and JPEG compression.
  • Estimating Topographic Heights with the StickGrip Haptic Device
    80 visual intensity discrimination,haptic feedback,topographic height visualization Tatiana V. Evreinova, Grigori Evreinov, Roope Raisamo, pages 691 – 697. Abstract. This paper presents an experimental study investigating the impact of haptic feedback on the quantitative evaluation of topographic heights depicted by height tints. In particular, the accuracy of detecting the heights was evaluated visually and instrumentally using the new StickGrip haptic device. The participants were able to discriminate the required heights specified in the scale bar palette and then to detect these values within an assigned map region. It was demonstrated that the complementary haptic feedback increased the accuracy of visual estimation of the topographic heights by about 32%.
  • Image Indexing by Spatial Relationships between Salient Objects
    17 image retrieval,spatial relationships,salient objects,object oriented database Eugen Ganea, Marius Brezovan, pages 699 – 704. Abstract. In this paper, we present our technique for extracting and using the spatial relationships between two or more salient objects. Using an object-oriented hypergraph data structure, the spatial relationships are determined and stored in an object-oriented database. This work aims to unify the phases of processing, indexing and retrieval of images. The proposed model can be applied to other types of data (video) and to semantic relations hidden in an image. The structure of the database used for image storage allows the construction of the index class hierarchy in order to improve the results of image retrieval. Our method requires more experiments on datasets from different areas and on images containing more salient objects.
  • From icons perception to mobile interaction
    36 icons recognition,human factors,interface design,mobile interaction Chrysoula Gatsou, Anastasios Politis, Dimitrios Zevgolis, pages 705 – 710. Abstract. This study deals with the vital issue of whether a mobile phone interface icon effectively expresses the function related to it. We also examine how far an icon represents the meaning of the function for which it has been designed, chosen and installed by the mobile phone manufacturer and designer. The effectiveness of icons used in mobile phone interfaces deserves examination: icons are an integral part of most mobile interfaces, for they are the bridge enabling interaction. Yet there has been little investigation of the influence of graphical icons on the perception of ordinary mobile phone users. Among the chief findings are that (1) graphical representation affects the recognition rate of icons and influences user perception, and (2) there are significant differences in icon recognition performance among different age groups.
  • Automatic Speech Recognition for Polish in a Computer Game Interface
    236 speech recognition,acoustic model,language model,Polish language,computer game Artur Janicki, Dariusz Wawer, pages 711 – 716. Abstract. The paper describes the process of designing a task-oriented continuous speech recognition system for Polish, based on CMU Sphinx4, to be used in the voice interface of a computer game called Rally Navigator. The concept of the game is presented, and the stages of creating the acoustic model and the language model are described in detail, taking into account the specificity of the Polish language. Results of initial experiments show that as little as 15 minutes of audio material is enough to produce a highly effective single-speaker command-and-control ASR system for the computer game, providing a sentence recognition accuracy of 97.6%. Results of adapting the system to a new speaker are presented. It is also shown that a statistical trigram-based language model with negative trigrams yields the best recognition results.
  • Classification of Learners Using Linear Regression
    16 e-learning,linear regression,learner classification Cristian Mihaescu, pages 717 – 721. Abstract. Proper classification of learners is one of the key aspects of e-learning environments. This paper uses linear regression to model the quantity of accumulated knowledge as a function of variables representing the performed activity. The model is built from the activity of students whose level of accumulated knowledge is known. The classification of learners is performed at the concept level. The outcome is computed as a percentage representing how much of the concept is covered by the learner's knowledge.
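    A minimal sketch of the modeling step: fit a linear model of accumulated knowledge from activity variables and classify a new learner by the predicted coverage; the features, data and 50% threshold are illustrative assumptions:

      # Least-squares linear regression of knowledge level on activity features.
      import numpy as np

      # Rows: students; columns: activity features (tests taken, hours spent).
      X = np.array([[12, 5.0], [30, 11.0], [22, 8.0], [5, 2.0]], dtype=float)
      y = np.array([40.0, 85.0, 65.0, 20.0])      # known knowledge level (%)

      A = np.hstack([X, np.ones((len(X), 1))])    # add intercept column
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      def knowledge(activity):
          return np.append(activity, 1.0) @ coef  # predicted concept coverage (%)

      level = knowledge([18, 6.5])
      print("beginner" if level < 50 else "advanced", f"({level:.0f}%)")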
  • Data Centered Collaboration in a Mobile Environment
    138 distributed video processing,remote data visualization,mobile collaboration,videoconferencing systems Maciej Panka, Piotr Bala, pages 723 – 728. Abstract. In this paper we present a system we have developed for mobile audio-video collaboration centered around distributed datasets. In our approach, all the data are processed remotely on dedicated servers, where they are successively rendered off-screen and compressed using a video codec. The signals captured from the users' cameras are transferred to the server in real time, where they are combined with the data frames into single video streams. Depending on the device's capabilities and the current network bandwidth, every session participant receives an individually customized stream, which alternately presents the remote data and the camera view of the currently chosen presenter. At the end of the paper we also present the results of a performance test of the system, obtained during the collaborative visualization of a remote, multidimensional dataset using different kinds of modern mobile devices, including tablets and cell phones.
  • Computerized Three-Dimensional Craniofacial Reconstruction from Skulls Based on Landmarks
    100 3D Craniofacial Reconstruction,3D Modeling,Mesh,Landmark Leticia Carnero Pascual, Carmen Lastres Redondo, Belen Rios Sanchez, David Garrido Garrido, Asuncion Santamaria Galdon, pages 729 – 735. Abstract. Human identification from a skull is a critical process in legal and forensic medicine, especially when no other means are available. Traditional clay-based methods attempt to generate the human face in order to identify the corresponding person. However, these reconstructions lack objectivity and consistency, since they depend on the practitioner. Current computerized techniques are based on facial models, which introduce undesired facial features when the final reconstruction is built. This paper presents an objective 3D craniofacial reconstruction technique, implemented in a graphic application, that does not use any facial template. The only information required by the software tool is the 3D image of the target skull and three parameters: the age, gender and Body Mass Index (BMI) of the individual. Complexity is minimized, since the application database consists only of the anthropological information provided by soft tissue depth values at a set of points on the skull.
  • DCFMS: A Chunk-Based Distributed File System for Supporting Multimedia Communication
    181 distributed file system,multimedia file transfer,multimedia logical partitioning prediction Cosmin Marian Poteras, Constantin Petrisor, Mihai Mocanu, Cristian Marian Mihaescu, pages 737 – 741. Show abstract Abstract. It is well known that the main drawback of distributed applications that require high performance is related to the data transfer speed between system nodes. High-speed networks are never enough; the application has to provide special techniques and algorithms for optimizing data availability. This is increasingly needed for some categories of distributed applications, such as computational steering applications, which have to permanently allow users to interactively monitor and control the progress of their applications. Following our previous research, which focused on the development of a set of frameworks and platforms for distributed simulation and computational steering, we introduce in this paper a new model for distributed file systems, supporting data steering, that is able to provide optimization for data acquisition, data output and load balancing while reducing development effort and improving the scalability and flexibility of the system. Data partitioning is performed at a logical level, allowing multimedia applications to define custom data chunks such as frames of a video, phrases of text, regions of an image, etc.
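    A sketch of what a logical, application-defined chunk might look like, under our own assumptions about the data model (the field names and the phrase-splitting partitioner are illustrative, not the DCFMS API):

        from dataclasses import dataclass

        @dataclass
        class Chunk:
            """A logical chunk: an application-defined unit (e.g. a video frame,
            a text phrase, an image region) rather than a fixed-size byte block."""
            file_id: str
            kind: str        # e.g. "frame", "phrase", "region"
            index: int       # position within the file's logical sequence
            payload: bytes

        def partition_text(file_id, text):
            """Hypothetical partitioner: split a text file into phrase chunks."""
            return [Chunk(file_id, "phrase", i, p.strip().encode())
                    for i, p in enumerate(text.split(".")) if p.strip()]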
  • Automatic classification of gestures: a context-dependent approach
    110 Human computer interaction,human gestures,automatic recognition of gestures,automatic classification of gestures Mario Refice, Michelina Savino, Michele Caccia, Michele Adduci, pages 743 – 750. Show abstract Abstract. Gestures represent an important channel of human communication, and they are “co-expressive” with speech. For this reason, automatic gesture classification can be a valuable help in a number of human-machine interaction tasks, for example as a disambiguation aid in automatic speech recognition. Based on the hand gesture categorization proposed by D. McNeill in his reference works on gesture analysis, a new approach is presented here which classifies gestures using both their kinematic characteristics and their morphology, stored as parameters of templates pre-classified during the training phase of the procedure. In the experiment presented in this paper, an average of about 90% of correctly classified gesture types is obtained by using as templates only about 3% of the total number of gestures produced by the subjects.
  • Concurrency Control for a Multimedia Database System
    121 multimedia database system,multiuser,concurrency control Cosmin Stoica Spahiu, pages 751 – 754. Show abstract Abstract. The paper presents the concurrency control methods used to provide simultaneous access to data in a relational database management system. This is an original system that has integrated methods for extracting color and texture characteristics from images and executing content-based visual queries. To accomplish this, the system defines an original new data type called IMAGE that is used to store the images along with the extracted characteristics and other important information. The problems that must be handled are processing multiple requests and accessing the same set of data simultaneously. If multiple client requests concurrently access the same data (writing to or modifying it), then the information in the database must be protected with a synchronization algorithm to ensure that it does not get corrupted.
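    The synchronization requirement in the last sentence is the classic mutual-exclusion pattern; a minimal illustration in Python (the system's actual implementation is not described at this level) is:

        import threading

        class ImageRecord:
            """Shared record holding an image's extracted features."""
            def __init__(self):
                self._lock = threading.Lock()
                self.features = {}

            def update_features(self, name, value):
                # All writers serialize on the same lock, so concurrent requests
                # cannot interleave partial updates and corrupt the record.
                with self._lock:
                    self.features[name] = value

            def read_features(self):
                with self._lock:      # consistent snapshot for readers
                    return dict(self.features)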
  • Automated annotation system for natural images
    41 Image annotation,image segmentation,ontology,relevance models Liana Stanescu, pages 755 – 762. Show abstract Abstract. Automated annotation of digital images remains a highly challenging task. This process can be used for indexing, retrieving, and understanding large collections of image data. This paper presents an image annotation system used for annotating natural images. The proposed system uses an efficient annotation model called the Cross Media Relevance Model for the annotation process. Image regions are described using a vocabulary of blobs generated from image features using the K-means clustering algorithm. Using the SAIAPR TC-12 dataset of annotated images, the joint probability of generating a word given the blobs in an image is estimated. The annotation process of each new image starts with a segmentation phase. An original and efficient segmentation algorithm based on a hexagonal structure is applied to obtain the list of regions. Each meaningful word assigned to the annotated image is retrieved from an ontology derived in an original manner from the hierarchical vocabulary associated with SAIAPR TC-12 and from the spatial relationships between regions.
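    The blob-vocabulary step can be sketched with scikit-learn's k-means. The feature dimensionality, cluster count, and the simplified P(word | blobs) estimate below are assumptions for illustration; CMRM proper estimates the joint probability by summing over training images.

        import numpy as np
        from sklearn.cluster import KMeans

        # One feature vector per segmented region, pooled over the training images
        # (placeholder random features; real ones would be color/texture descriptors).
        region_features = np.random.rand(500, 6)
        kmeans = KMeans(n_clusters=50, n_init=10, random_state=0).fit(region_features)

        def image_blobs(features):
            """Map an image's region features to its discrete blob identifiers."""
            return set(kmeans.predict(features))

        def p_word_given_blobs(word_blob_counts, blob_totals, word, blobs):
            """Simplified estimate of P(word | blobs) from co-occurrence counts.
            CMRM proper sums over training images; this average is only a sketch."""
            return sum(word_blob_counts.get((word, b), 0) / max(blob_totals.get(b, 1), 1)
                       for b in blobs) / max(len(blobs), 1)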
  • Fuzzy UML and Petri Nets Modeling Investigations on the Pollution Impact on the Air Quality in the Vicinity of the Black Sea Constanta Romanian Resort
    43 Unified Modeling Language (UML),Fuzzy Petri Nets,Environment Protection,Statistics Elena-Roxana Tudoroiu, Adina Astilean, Tiberiu Letia, Gabriela Neacsu, Zoltan Maroszy, Nicolae Tudoroiu, pages 763 – 766. Show abstract Abstract. The purpose of this research is to investigate the use of an intelligent neural-fuzzy modeling strategy, based on Unified Modeling Language (UML) diagrams and Petri net models, for the impact of pollution sources on air quality along the Romanian coast of the Black Sea, especially in the vicinity of Constanta. This is made possible by monitoring the physical and chemical parameters of air quality, such as temperature, wind speed, carbon dioxide (CO2), methane (CH4), nitrogen oxide, ozone, and water vapour concentrations, provided by several “in-situ” measurement stations spread over the critical points of the Constanta area. Moreover, we try to disseminate the collected information and to investigate adequate actions to prevent the continuous degradation of the environment. The values of the monitored air quality parameters vary over a fairly large range with the position of the sampling sites; consequently, a direct correlation between these indicators will be useful. Air pollution sources cause the “greenhouse effect,” with a high impact on flora and fauna, progressively degrading the Black Sea ecosystems. In these circumstances it is vital to install more efficient filters in the industrial area, to clean the residual water, and to build enough waste collection sites to prevent the direct discharge of pollutant residues into surface water. In closing, our research presents the benefit of UML diagrams combined with Petri net models, developed on a large database concerning the air pollution degree in the Constanta Romanian Black Sea resort, to predict future results.
  • Pass-Image Authentication Method Tolerant to Video-Recording Attacks
    134 authentication,observing attack,random attack,video-recording attack Hirakawa Yutaka, Hiroyuki Take, Kazuo Ohzeki, pages 767 – 773. Show abstract Abstract. User authentication is widely used in automatic teller machines (ATMs) and Internet services. Recently, ATM passwords have been increasingly stolen using small charge-coupled device cameras. This article discusses a user authentication method in which graphical passwords are used instead of alphabetic ones, in order to be tolerant to observation attacks. Several techniques for password authentication have been discussed in various studies. However, there has not been sufficient research on authentication methods that use pass-images instead of pass-texts. This article proposes a user authentication method that remains tolerant to attacks even when a user's pass-image selection operation is video recorded twice. In addition, usage guidelines recommending eight pass-images are proposed, and the method's security is evaluated.

Risks Awareness and Management through Smart Solutions

  • Enhancing DNS Security using Dynamic Firewalling with Network Agents
    48 DNS,Security,Intrusion Detection System,Monitoring. Joao Afonso, Pedro Veiga, pages 777 – 782. Show abstract Abstract. There is no doubt that one of the most critical components of the Internet is the DNS—Domain Name System. In this paper, we propose a solution to strengthen the security of DNS servers, namely those associated with Top Level Domains (TLD), by using a system that identifies patterns of potentially harmful traffic and isolates them. The proposed solution has been developed and tested at FCCN, the TLD manager for the .PT domain. The system consists of network sensors that monitor the network in real time and can dynamically detect, prevent, or limit the scope of attempted intrusions or other types of attacks on the DNS service, thus improving its global availability.
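    A toy version of the detect-and-isolate loop, under our own assumptions: the per-window threshold and the iptables rule are illustrative, and the actual FCCN sensors and policies are not described at this level in the abstract.

        import subprocess
        from collections import Counter

        WINDOW_QUERY_LIMIT = 500   # hypothetical per-source threshold per window

        def flagged_sources(query_log):
            """query_log: iterable of (source_ip, qname) pairs seen by a sensor
            in the current time window. Returns sources exceeding the threshold."""
            per_source = Counter(ip for ip, _ in query_log)
            return [ip for ip, n in per_source.items() if n > WINDOW_QUERY_LIMIT]

        def block(ip):
            # Drop further DNS traffic from the offending source (Linux iptables).
            subprocess.run(["iptables", "-A", "INPUT", "-s", ip,
                            "-p", "udp", "--dport", "53", "-j", "DROP"], check=True)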
  • Enhanced CakES representing Safety Analysis results of Embedded Systems
    50 Visualization,Safety Analysis,Fault Tree Analysis,Minimal Cutsets,Basic Events,Embedded Systems Yasmin I. Al-Zokari, Daniel Schneider, Dirk Zeckzer, Liliana Guzman, Yarden Livnat, Hans Hagen, pages 783 – 790. Show abstract Abstract. Nowadays, embedded systems are widely used. It is extremely difficult to analyze safety issues in embedded systems, to relate the safety analysis results to the actual parts, and to identify these parts in the system. Further, it is very challenging to compare the system's safety development and the different safety metrics in order to find their most critical combinations. Due to these fundamental problems, a large amount of time and effort is spent. Until now, there has been a lack of visualization metaphors supporting the efficient analysis of safety issues in embedded systems. Therefore, we present “Enhanced CakES,” a system that combines and links the existing knowledge of the safety analysis and engineering domains and improves the communication between engineers of these domains. The engineers can directly explore the most safety-critical parts while retaining an overview of all critical aspects in the actual model. A formal evaluation was performed and proved its superiority.
  • Integrated management of risk information
    199 Risk Management,Enterprise Architecture,Metadata Registry,Information System José Barateiro, José Borbinha, pages 791 – 798. Show abstract Abstract. Today's competitive environment requires effective risk management activities to create prevention and control mechanisms addressing the risks attached to specific activities and valuable assets. One of the main challenges in this area concerns the analysis and modeling of risks, aggravated by the fact that current efforts tend to operate in silos with narrowly focused, functionally driven, and disjointed activities. This leads to a fragmented view of risks, where each activity uses its own language, customs and metrics. The lack of interconnection and of a holistic view of risks limits an organization-wide perception of risks, where interdependent risks are not anticipated, controlled or managed. In order to address the Risk Management interoperability and standardization issues, this paper proposes an alignment between Risk Management, Governance and Enterprise Architecture activities, providing systematic support to map and trace identified risks to enterprise artifacts modeled within the Enterprise Architecture, supporting the overall strategy and governance of any organization. We propose an architecture where risks are defined through an XML-based domain-specific language and integrated with a Metadata Registry to handle risk concerns in the overall organization environment.

3rd Workshop on Advances in Programming Languages

  • Implementation of a Domain-Specific Language EasyTime using LISA Compiler Generator
    47 domain-specific language,parser,code generator,time measuring,RFID technology Iztok Jr. Fister, Marjan Mernik, Iztok Fister, Dejan Hrnčič, pages 801 – 808. Show abstract Abstract. Manual time measurement in mass sports competitions can hardly be imagined nowadays, because many modern disciplines, such as IronMan, take a long time and therefore demand additional reliability. Moreover, automatic timing devices based on RFID technology have become cheaper. However, these devices cannot operate stand-alone, because they need a computer measuring system capable of processing the incoming events, encoding the results, assigning them to the correct competitor, sorting the results according to the achieved times, and providing a printout of the results. In this article, the domain-specific language EasyTime is presented, which enables controlling an agent that writes the events into a database. In particular, we focus on the implementation of EasyTime with the LISA tool, which enables the automatic construction of compilers from language specifications using attribute grammars. By using EasyTime, we can also decrease the number of measuring devices. Furthermore, EasyTime is universal and can be applied to many different sports competitions.
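    To give a flavor of what such a DSL looks like, here is a hypothetical, much simplified EasyTime-like rule set with an interpreter; the syntax below is invented for illustration and is not EasyTime's actual notation.

        import re

        # Each rule binds a measuring place (an RFID reading point) to an action
        # on the results database.
        PROGRAM = """
        mp[1] -> start(run)
        mp[2] -> stop(run)
        """

        RULE = re.compile(r"mp\[(\d+)\]\s*->\s*(start|stop)\((\w+)\)")

        def compile_rules(src):
            """Turn DSL text into a dispatch table: measuring place -> action."""
            return {int(m.group(1)): (m.group(2), m.group(3))
                    for m in RULE.finditer(src)}

        def on_event(rules, place, competitor, clock, db):
            action, attr = rules[place]
            key = (competitor, attr)
            if action == "start":
                db[key] = -clock                  # store negated start time
            else:
                db[key] = clock + db.get(key, 0)  # final time = stop - start

        rules, db = compile_rules(PROGRAM), {}
        on_event(rules, 1, 17, 100.0, db)
        on_event(rules, 2, 17, 5400.0, db)
        print(db)   # {(17, 'run'): 5300.0}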
  • Using Aspect-Oriented State Machines for Resolving Feature Interactions
    105 Automata theory and applications,Domain-specific languages,Formal semantics and syntax,Modeling languages,Program analysis and verification,Programming paradigms (aspect-oriented),Specification languages Tom Dinkelaker, Mohammed Erradi, pages 809 – 816. Show abstract Abstract. Combining features may lead to conflicting situations called feature interactions. What we call a feature is a self-contained subset of the behavior of the system; adding a feature consists of extending the system's behavior. Intuitively, in the design of telephone systems this may consist, for instance, in adding functionalities, as is the case with features like Call Waiting and Three Way Calling. The feature interaction problem occurs when the addition of a new feature to a system disrupts the existing services and features. Feature interaction is a kind of inconsistent conflict between multiple communication services and is considered an obstacle to developing reliable telephone systems. In this work we present an implementation of an existing approach for detecting and resolving feature interactions. This implementation uses Aspect-Oriented Programming (AOP); indeed, aspects can resolve interactions by intercepting the events which cause the conflicts. A Domain-Specific Language (DSL) was also developed to handle the finite state machine concept, which was the formalism used to specify the telecommunication features.
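    The resolution idea (an aspect intercepting the event that triggers the interaction before the base state machine handles it) can be sketched as follows; the telephone states, events, and the priority rule are hypothetical stand-ins for the paper's features.

        class StateMachine:
            """Tiny FSM: (state, event) -> next state."""
            def __init__(self, initial, transitions):
                self.state, self.transitions = initial, transitions

            def fire(self, event):
                self.state = self.transitions.get((self.state, event), self.state)

        def weave_around(fsm, advice):
            """Aspect-style 'around' advice: wrap fire() so a conflicting
            event can be rewritten (or suppressed) before the FSM sees it."""
            base = fsm.fire
            def woven(event):
                resolved = advice(fsm.state, event)
                if resolved is not None:
                    base(resolved)
            fsm.fire = woven

        def resolve(state, event):
            # Both Call Waiting and Three Way Calling claim 'flash'; while a
            # call is on hold, priority goes to Call Waiting by rewriting the
            # event, which resolves the interaction.
            return "resume" if (state, event) == ("hold", "flash") else event

        phone = StateMachine("talking", {("talking", "flash"): "hold",
                                         ("hold", "resume"): "talking",
                                         ("hold", "flash"): "conference"})
        weave_around(phone, resolve)
        phone.fire("flash"); phone.fire("flash")
        print(phone.state)   # 'talking': Call Waiting won, not Three Way Calling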
  • Domain-Specific Modeling in Document Engineering
    58 Domain Specific Modeling,Document Engineering,Document rendering,Incremental specification,User Driven Modeling,Application generating Verislav Djukic, Ivan Luković, Aleksandar Popovic, pages 817 – 824. Show abstract Abstract. Specification languages play a central role in supporting document engineering. We describe in this paper how domain-specific languages, along with domain-specific frameworks and generators, can support formal specification and document rendering in directory publishing. With flexible metamodel-based tools we have developed four languages for the modeling of: (i) small advertisements, (ii) appropriate documents, (iii) workflow control and (iv) templates. We describe a domain-specific framework with specific libraries, an interpreter for the languages, as well as code, document and application generators. In a typical document-centric system, the presented approach enables the specification of both static and dynamic characteristics of the system at a high abstraction level with domain-specific concepts. The concepts of incremental document specification and incremental document rendering have been introduced in order to address the problem of very frequent specification refinements. The expressive power of the created languages is demonstrated with representative examples of document engineering covering document content specification, workflow control and application generation. All of the aforementioned languages are integrated into a single meta-model, under the name DVDocLang, which is, due to its simplicity, highly applicable to user-driven conceptual modeling.
  • A MOF-based Meta-Model of IIS*Case PIM Concepts
    120 Meta-Object Facility,Domain Specific Modeling,Information System Modeling Milan Čeliković, Ivan Luković, Slavica Aleksić, Vladimir Ivančević, pages 825 – 832. Show abstract Abstract. In this paper, we present the platform independent model (PIM) concepts of IIS*Case, a tool for information system (IS) modeling and design. IIS*Case is a model-driven software tool that provides generation of executable application prototypes. The concepts are described by a Meta Object Facility (MOF) specification, one of the commonly used approaches for describing meta-models. One of the main reasons for specifying IIS*Case PIM concepts through a meta-model is to provide software documentation in a formal way, as well as a domain analysis aimed at creating a domain-specific language to support IS design. Using the meta-model of PIM concepts, we can generate test cases that may assist in software tool verification.
  • Memory Safety and Race Freedom in Concurrent Programming Languages with Linear Capabilities
    77 Concurrent programming,Linear type systems,Memory safety,Race freedom Niki Vazou, Michalis Papakyriakou, Nikolaos Papaspyrou, pages 833 – 840. Show abstract Abstract. In this paper we show how to statically detect memory violations and data races in a concurrent language, using a substructural type system based on linear capabilities. However, in contrast to many similar type-based approaches, our capabilities are not only linear, providing full access to a memory location but being unshareable; they can also be read-only, thread-exclusive, or unrestricted, all providing restricted access to memory but extended shareability in the program source. Our language features two new operators, let! and lock, which convert between the various types of capabilities.
  • Decomposition of SBQL Queries for Optimal Result Caching
    81 query optimization,cached queries,object-oriented databases,query and programming languages,stack-based approach Piotr Cybula, Kazimierz Subieta, pages 841 – 848. Show abstract Abstract. We present a new approach to the optimization of query languages using cached results of previously evaluated queries. It is based on the stack-based approach (SBA), which assumes a description of semantics in the form of an abstract implementation of query/programming language constructs. The pragmatic universality of the object-oriented query language SBQL and its precise, formal operational semantics make it possible to investigate various crucial issues related to this kind of optimization. There are two main issues concerning this topic: the first is a strategy for fast retrieval and high reuse of cached queries; the second is the development of fast methods to recognize and maintain the consistency of query results after database updates. This paper is focused on the first issue. We introduce data structures and algorithms for optimal, fast and transparent utilization of the result cache, involving methods of query normalization that preserve the original query semantics and the decomposition of complex queries into smaller ones. We present experimental results of the optimization that demonstrate the effectiveness of our technique.
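    The first issue, fast retrieval and high reuse of cached results, reduces to looking results up under a normalized form of the query. A toy cache illustrating the shape of the mechanism (real SBQL normalization is semantic and far richer than the textual canonicalization used here):

        import re

        class QueryCache:
            """Sketch of a result cache keyed by normalized query text."""
            def __init__(self, evaluate):
                self.evaluate, self.store = evaluate, {}

            @staticmethod
            def normalize(query):
                # Canonicalize whitespace and case so trivially different
                # spellings of the same query share one cache entry.
                return re.sub(r"\s+", " ", query.strip().lower())

            def run(self, query):
                key = self.normalize(query)
                if key not in self.store:            # miss: evaluate and cache
                    self.store[key] = self.evaluate(query)
                return self.store[key]

            def invalidate(self):
                """Called after database updates; a real system would maintain
                consistency selectively instead of flushing everything."""
                self.store.clear()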
  • Automated Conversion of ST Control Programs to Why for Verification Purposes
    44 IEC 61131-3 standard,ST language extension,design by contract,Why generator,Coq prover Jan Sadolewski, pages 849 – 854. Show abstract Abstract. The paper presents a prototype tool, ST2Why, which converts programs in the ST language of the IEC 61131-3 standard, annotated in a Behavioral Interface Specification Language, into Why code. The specification annotations are stored as special comments, which are close to the implementation and readable by the programmer. Further transformation with the Why tool into verification lemmas confirms compliance between specification and implementation. Proving the lemmas is performed in Coq, but other provers can be used as well.
  • Implementing Attribute Grammars Using Conventional Compiler Construction Tools
    93 Attribute Grammars,Parser generators,Language Processor Development Method,Grammarware Daniel Rodriguez Cerezo, Antonio Sarasa Cabezuelo, Jose Luis Sierra Rodriguez, pages 855 – 862. Show abstract Abstract. This article describes a straightforward and structure-preserving coding pattern for encoding arbitrary non-circular attribute grammars as syntax-directed translation schemes for bottom-up parser generation tools. According to this pattern, a bottom-up oriented translation scheme is systematically derived from the original attribute grammar. The semantic actions attached to each syntax rule are written in terms of a small repertory of primitive attribution operations. By providing alternative implementations of these attribution operations, it is possible to plug in different semantic evaluation strategies in a seamless way (e.g., a demand-driven strategy, or a data-driven one). The pattern makes possible the direct implementation of attribute grammar-based specifications using widely used translation scheme-driven tools for the development of bottom-up language translators (e.g., YACC, BISON, CUP, etc.). As a consequence, this initial coding can be subsequently refined to yield efficient final implementations. Since these implementations still preserve the ability to be extended with new features described at the attribute grammar level, the advantages from the point of view of development and maintenance become apparent.
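    The coding pattern can be imitated in a few lines: each semantic action computes the left-hand side's synthesized attribute from its children's attributes when a rule is reduced, just as a YACC/CUP action would. The grammar and stack discipline below are a deliberately tiny stand-in; attribute-less terminals such as '+' are omitted from the stack.

        # Each 'semantic action' mirrors a grammar rule: on reduce it computes the
        # synthesized attribute of the left-hand side from its children's attributes.
        ACTIONS = {
            "E -> E + T": lambda e, t: e + t,
            "E -> T":     lambda t: t,
            "T -> num":   lambda n: n,
        }

        def reduce_rule(stack, rule, arity):
            """Pop the children's attributes, push the synthesized result."""
            args = [stack.pop() for _ in range(arity)][::-1]
            stack.append(ACTIONS[rule](*args))

        stack = [2]                                  # shifted 'num' with value 2
        reduce_rule(stack, "T -> num", 1)
        reduce_rule(stack, "E -> T", 1)
        stack.append(3)                              # shifted 'num' with value 3
        reduce_rule(stack, "T -> num", 1)
        reduce_rule(stack, "E -> E + T", 2)
        print(stack)                                 # [5]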
  • The embedded left LR parser
    177 LR parsing,left parse,embedded parsing Bostjan Slivnik, pages 863 – 870. Show abstract Abstract. A parser called the embedded left LR(k) parser is defined. It is capable of (a) producing the prefix of the left parse of the input string and (b) stopping not on the end-of-file marker but on any string from the set of lookahead strings fixed at parser generation time. It is aimed at the automatic construction of LL parsers that use embedded LR parsers to resolve LL(k) conflicts. The conditions regarding the termination of the embedded left LR(k) parser when used within LL (and similar) parsers are defined and examined in depth. As the embedded LR parser produces the prefix of the left parse, the LL parser augmented with embedded LR parsers still produces the left parse, and the compiler writer does not need to bother with different parsing strategies during the compiler implementation.
  • Nonlinear Tree Pattern Pushdown Automata
    196 Nonlinear tree pattern matching,indexing trees,pushdown automata Jan Travnicek, Jan Janoušek, Borivoj Melichar, pages 871 – 878. Show abstract Abstract. A new kind of acyclic pushdown automaton for an ordered tree is presented. The nonlinear tree pattern pushdown automaton represents a complete index of the tree for nonlinear tree patterns and accepts all nonlinear tree patterns which match the tree. Given a tree with n nodes, the number of such nonlinear tree patterns is O((2+v)^n), where v is the number of variables in the patterns. We discuss the time and space complexities of the nondeterministic nonlinear tree pattern pushdown automaton and a way of implementing it. The presented pushdown automaton is input-driven and therefore can be determinised.
  • A Type and Effect System for Implementing Functional Arrays with Destructive Updates
    135 Functional programming,Purely functional data structures,Type and effect system,Compiler optimization Georgios Korfiatis, Michalis Papakyriakou, Nikolaos Papaspyrou, pages 879 – 886. Show abstract Abstract. It can be argued that some of the benefits of purely functional languages are counteracted by the lack of efficient and natural-to-use data structures for these languages. Imperative programming is based on manipulating data structures destructively, e.g., updating arrays in-place; however, doing so in a purely functional language violates the language's very nature. In this paper, we present a type and effect system for an eager purely functional language that tracks array usage, i.e., read and write operations, and enables the efficient implementation of purely functional arrays with destructive update.
  • Checking the Conformance of Grammar Refinements with Respect to Initial Context-Free Grammars
    94 Context-free grammars,Grammar Refinement,Equivalence Checking,Grammarware Bryan Temprado Battad, Antonio Sarasa Cabezuelo, Jose Luis Sierra Rodriguez, pages 887 – 890. Show abstract Abstract. This paper deals with the refinement of context-free grammars. In the sense of this paper, to refine an initial context-free grammar means to devise an equivalent grammar that preserves the main syntactic structures of the initial one while making explicit other structural characteristics (e.g., associativity and priority of the operators in an expression language). Thus, writing grammar refinements becomes a usual activity in any grammarware scenario. It naturally leads to a central concern: checking the conformance of a grammar refinement with respect to an initial grammar, in the sense of checking the equivalence of the two grammars involved. Although, generally speaking, checking the equivalence of two context-free grammars is an undecidable problem, in the grammar refinement scenario it is possible to exploit the relationships between the initial grammar and the grammar refinement to run a heuristic conformance test. These relationships must be made explicit by associating core non-terminal symbols in the initial grammar with core non-terminal symbols in the grammar refinement. Once this is done, it is possible to base the heuristic test on searching for regular expressions, involving both terminal and core non-terminal symbols, that describe each core non-terminal symbol, and on checking the equivalence of carefully chosen pairs of such regular expressions. While test failures are inconclusive, since they do not guarantee non-equivalence, test successes actually are conclusive: they guarantee equivalence, and therefore the conformance of the grammar refinement with respect to the initial grammar. The paper describes the method and illustrates it with some examples.
  • Identification of Patterns through Haskell Programs Analysis
    72 language abstraction,program patterns recognition,recurring patterns,automated analysis of programs,software language engineering Jan Kollar, Sergej Chodarev, Emilia Pietrikova, Lubomir Wassermann, pages 891 – 894. Show abstract Abstract. Usage of appropriate high-level abstractions is very important for the development of reliable and maintainable programs. Abstractions can be more effective if applied at the level of language syntax. To achieve this goal, syntax-based analysis of programs is needed. This paper presents the Haskell Syntax Analyzer tool, which can be used for the analysis of Haskell programs from the syntactic perspective. It allows retrieving the derivation trees of Haskell programs, visualizing them, and performing statistical analysis on them. We also propose an approach for recognizing recurring patterns in programs that can be used as a basis for the automated introduction of abstractions into the language.
  • Computer Language Notation Specification through Program Examples
    168 computer language,language pattern,example driven language specification Miroslav Sabo, Jaroslav Porubän, Dominik Lakatoš, Michaela Kreutzová, pages 895 – 898. Show abstract Abstract. It often happens that computer-generated documents originally intended for a human recipient need to be processed in an automated manner. The problem occurs if an analyzer does not exist and must therefore be created ad hoc. To avoid the repetitive manual implementation of parsers for different formats of processed documents, we propose a method for specifying computer language notation by providing program examples. The main goal is to facilitate the process of computer language development by automating the specification of notation for recurring well-known language constructs often observed in various computer languages. Hence, we introduce the concept of a language pattern, which captures the knowledge of the language engineer and enables its automated application in the process of notation recognition. As a result, by using the proposed method, even a user less experienced in the field of computer language construction is able to create a language parser.
  • Tree Indexing by Pushdown Automata and Repeats of Subtrees
    220 repeats in trees,indexing trees,pushdown automata Tomas Flouri, Jan Janoušek, Borivoj Melichar, Costas Iliopoulos, Solon Pissis, pages 899 – 902. Show abstract Abstract. We consider the problem of finding all subtree repeats in a given unranked ordered tree. We show a new elegant and simple method, which is based on the construction of a tree indexing structure called the subtree pushdown automaton. We propose a solution for computing all repeating subtrees from the deterministic subtree pushdown automaton constructed over the subject tree. The method we present is directly analogous to the relationship between string deterministic suffix automata and repeats of substrings in a given string.
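    The underlying idea, indexing subtrees by a linear notation so that equal subtrees collide, can be illustrated without the pushdown machinery: a canonical serialization plus a hash map finds all repeated subtrees. This is an analogy, not the paper's automaton construction.

        def subtree_repeats(tree):
            """Find all repeated subtrees of an ordered tree given as nested
            tuples (label, child, child, ...). The canonical serialization plays
            the role of the tree's linear (prefix) notation."""
            seen = {}
            def walk(node):
                label, *children = node
                key = "(" + label + "".join(walk(c) for c in children) + ")"
                seen.setdefault(key, []).append(node)
                return key
            walk(tree)
            return {k: v for k, v in seen.items() if len(v) > 1}

        t = ("a", ("b", ("c",)), ("b", ("c",)), ("c",))
        for notation, occurrences in subtree_repeats(t).items():
            print(notation, len(occurrences))
        # (b(c)) occurs twice, (c) occurs three times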
  • Subtree Oracle Pushdown Automata for Ranked and Unranked Ordered Trees
    215 Indexing trees,factor oracle,pushdown automata Martin Plicka, Jan Janoušek, Borivoj Melichar, pages 903 – 906. Show abstract Abstract. Oracle modification of subtree pushdown automata for unranked and ranked ordered trees is presented. Subtree pushdown automata [J. Janoušek, “String suffix automata and subtree pushdown automata,” in Proceedings of the Prague Stringology Conference 2009,] represent a complete index of a tree for subtrees and accept all subtrees of the tree. Subtree oracle pushdown automata, as inspired by string factor oracle automaton [C. Allauzen, M. Crochemore, and M. Raffinot, “Factor oracle: A new structure for pattern matching,” in SOFSEM, ser. Lecture Notes in Computer Science, J. Pavelka, G. Tel, and M. Bartosek, Eds., vol. 1725. Springer, 1999, pp. 295–310.], have the number of states equal to n+1, where n is the length of a corresponding linear notation of the tree. This makes the space complexity very low. The presented pushdown automata are input-driven and therefore they can be determinised. By analogy with the string factor oracle automaton the subtree oracle automata can also accept some subtrees which are not present in the given subject tree. However, the number of such false positive matches is smaller than in the case of the string factor oracle automaton because of a specific use of the pushdown store.
  • Semi-Automatic Component Upgrade with RefactoringNG
    86 API evolution,Java,refactoring,software evolution,software maintenance,NetBeans Zdeněk Troníček, pages 907 – 910. Show abstract Abstract. Software components evolve, and this evolution often leads to changes in their interfaces. Upgrading to a new version of a component then involves changes in client code that are nowadays usually done manually. We deal with the problem of automatically updating client code when the client upgrades to a new version of a component. We describe a new flexible refactoring tool for the Java programming language that performs refactorings described by refactoring rules. Each refactoring rule consists of two abstract syntax trees: pattern and rewrite. The tool searches for the pattern tree in the client-source-code abstract syntax trees and replaces each occurrence with the rewrite tree. The client-source-code abstract syntax trees are built and fully attributed by the Java compiler. Thus, the tool has complete syntactic and semantic information. Semantic analysis and flexibility in refactoring definitions make the tool superior to most competitors.
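    Python's ast module gives a compact way to demonstrate the pattern/rewrite idea on which such tools are built (the tool itself targets Java via the Java compiler's attributed ASTs; the names oldApi and newApi below are hypothetical):

        import ast

        class PatternRewrite(ast.NodeTransformer):
            """Rewrite rule in the spirit of pattern/rewrite trees: every call
            of the form oldApi(x) becomes newApi(x, 0)."""
            def visit_Call(self, node):
                self.generic_visit(node)
                if isinstance(node.func, ast.Name) and node.func.id == "oldApi":
                    node.func.id = "newApi"
                    node.args.append(ast.Constant(value=0))
                return node

        src = "result = oldApi(data)"
        tree = ast.fix_missing_locations(PatternRewrite().visit(ast.parse(src)))
        print(ast.unparse(tree))   # result = newApi(data, 0)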
  • Extension of Iterator Traits in the C++ Standard Template Library
    209 C++ STL,compile-time warnings,traits Norbert Pataki, Zoltán Porkoláb, pages 911 – 914. Show abstract Abstract. The C++ Standard Template Library is the flagship example of libraries based on the generic programming paradigm. The usage of this library is intended to minimize classical C/C++ errors, but it does not warrant bug-free programs. Furthermore, many new kinds of errors may arise from inaccurate use of the generic programming paradigm, like dereferencing invalid iterators or misunderstanding remove-like algorithms. In this paper we present some typical scenarios that pose risks from the viewpoint of program safety. We emit warnings when these constructs are used, without any modification of the compiler. We argue for an extension of the STL's iterator traits in order to emit these warnings. We also present a general approach to emitting “customized” warnings. We support so-called believe-me marks to disable warnings.

3rd Workshop on Software Services: Semantic-based Software Services

  • Search-Based Testing, the Underlying Engine of Future Internet Testing
    111 evolutionary testing,search-based testing,research topics,FITTEST Arthur Baars, Kiran Lakhotia, Tanja E. J. Vos, Joachim Wegener, pages 917 – 923. Show abstract Abstract. The European project FITTEST has the goal of developing and evaluating an integrated environment for continuous automated testing of Future Internet applications. To deal with the highly dynamic nature of the Future Internet, search-based testing is used in FITTEST as the underlying engine of Future Internet testing. However, with 8 partners in 3 years, the FITTEST project cannot solve all open challenges in search-based testing. This paper presents an overview of search-based testing and discusses some of the open challenges that are out of the scope of the FITTEST project but need to be addressed to take full advantage of search-based techniques for the Future Internet.
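    The engine behind search-based testing is easy to demonstrate: encode a coverage goal as a fitness function (here the classic branch distance for a hypothetical branch guarded by x == 4242) and let a local search minimize it. A minimal hill climber:

        import random

        def branch_distance(x):
            """How far input x is from taking the target branch 'if x == 4242'."""
            return abs(x - 4242)

        def hill_climb(seed, steps=10_000):
            """Local search over test inputs, minimizing branch distance."""
            best = seed
            for _ in range(steps):
                candidate = best + random.choice([-10, -1, 1, 10])
                if branch_distance(candidate) < branch_distance(best):
                    best = candidate
                if branch_distance(best) == 0:
                    break
            return best

        print(hill_climb(seed=0))   # converges to 4242, covering the branch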
  • Testing and Remote Maintenance of Real Future Internet Scenarios, Towards FITTEST and FastFix Advanced Software Engineering
    92 testing,maintenance,practical experience Alessandra Bagnato, Anna Esparcia Alcazar, Tanja E. J. Vos, Beatriz Marin, José Oliver Murillo, Salvador I. Folgado, Auxiliadora Carlos Alberola, pages 925 – 932. Show abstract Abstract. In recent years, software testing and maintenance services have become key factors in customers' perception of software quality. Nowadays, customers are more demanding about these services, while the contribution of maintenance and testing services to products' total cost of ownership should be reduced. Reducing these costs is even more crucial for SMEs. To do this, new methods and techniques aligned with the needs of companies are required. This paper presents the preliminary results of an interactive workshop held by researchers and three companies. In the workshop, researchers presented the advanced software engineering methods proposed by the FastFix and FITTEST European projects. After that, their potential use in three application scenarios at Infoport Valencia, BULL Spain, and INDRA was discussed and some lessons were learned.
  • A Neural Model for Ontology Matching
    197 ontology matching,unsupervised neural network,text mining Emil Stefan Chifu, Ioan Alfred Letia, pages 933 – 940. Show abstract Abstract. Ontology matching is a key issue in the Semantic Web. The paper describes an unsupervised neural model for matching pairs of ontologies. The result of matching two ontologies is a class alignment, where each concept in one ontology is put into correspondence with a semantically related concept in the other one. The framework is based on a model of hierarchical self-organizing maps. Every concept of the two matched ontologies is encoded in a bag-of-words style, by counting the words that occur in its OWL concept definition. We evaluated this ontology matching model with the OAEI benchmark data set for the bibliography domain. For our experiments we chose pairs of ontologies from the dataset as candidates for matching.
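    As a baseline for the bag-of-words encoding step, the sketch below aligns concepts by cosine similarity over their definition words. Note that this substitutes plain cosine matching for the paper's hierarchical self-organizing maps; the concept names and definitions are invented.

        from collections import Counter
        import math

        def bag(text):
            return Counter(text.lower().split())

        def cosine(a, b):
            num = sum(a[w] * b[w] for w in set(a) & set(b))
            den = (math.sqrt(sum(v * v for v in a.values()))
                   * math.sqrt(sum(v * v for v in b.values())))
            return num / den if den else 0.0

        def align(onto1, onto2):
            """Greedy alignment: each concept in onto1 maps to its most similar
            concept in onto2. Inputs: dicts of concept name -> definition text
            (e.g. words harvested from the OWL concept definition)."""
            return {c: max(onto2, key=lambda d: cosine(bag(t), bag(onto2[d])))
                    for c, t in onto1.items()}

        print(align({"Article": "a paper published in a journal"},
                    {"JournalPaper": "paper in a journal",
                     "Book": "a bound monograph"}))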
  • An Adaptive Virtual Machine Replication Algorithm for Highly-Available Services
    180 high-availability,virtualization,replication,asynchronous,adaptive Adrian Coleșa, Mihai Bica, pages 941 – 948. Show abstract Abstract. This paper presents an adaptive algorithm for the replication of a primary virtual machine (VM) hosting a service that must be provided with high availability. Running the service in a VM and replicating the entire VM is a general strategy, totally transparent to the service itself and its clients. The replication takes place in phases, which are run asynchronously for efficiency reasons. The replication algorithm adapts to the running context, consisting of the behavior of the service and the available bandwidth between the primary and backup nodes. The length of each replication phase is determined dynamically, in order to reduce as much as possible the latencies experienced by the clients of the service, especially in the case of degraded connectivity between the primary and backup nodes. We implemented our replication algorithm as an extension of the Xen hypervisor's VM migration operation. It proved better than its non-adaptive variants.
  • Service Modelling for Internet of Things
    113 Internet of Things,Semantic service modelling,Ontology,OWL-S Suparna De, Payam Barnaghi, Martin Bauer, Stefan Meissner, pages 949 – 955. Show abstract Abstract. The Internet of Things concept envisions a multitude of heterogeneous objects and interactions with the physical environment. The functionality provided by these objects can be termed as “real-world services” as it provides a near real-time state of the physical world. A structured, machine-processible approach to provision such real-world services is needed to make heterogeneous physical objects accessible on a large scale and to integrate them with the cyber world. This paper presents a semantic modeling approach for different components in an IoT framework. It is also discussed how the model can be integrated into the IoT framework by using automated association mechanisms with physical entities and how the data can be discovered using semantic search and reasoning.
  • Self-Healing Approach in the FastFix Project
    189 self-healing,maintenance,FastFix Benoit Gaudin, Mike Hinchey, pages 957 – 964. Show abstract Abstract. The EU FP7 FastFix project tackles issues related to remote software maintenance. To achieve this, the project considers approaches relying on context elicitation, event correlation, fault replication and self-healing. Self-healing helps systems return to a normal state after the occurrence of a fault or a vulnerability exploit has been detected. The approach is intuitively appealing as a way to automate the different maintenance processes (corrective, adaptive and perfective) and forms an interesting area of research that has inspired many research initiatives. In this paper, we propose a framework for automating corrective maintenance that is based on software control principles. Our approach automates the engineering of self-healing systems, as it does not require the system to be designed in a specific way. Instead, it can be applied to legacy systems and automatically equips them with observation and control points. Moreover, the proposed approach relies on a sound control theory developed for discrete event systems. Finally, this paper contributes to the field by introducing challenges for the effective application of this approach to relevant industrial systems.
  • Autonomic Execution of Computational Workflows
    35 autonomic computing,service oriented architectures,scientific workflows Tomasz Haupt, Nitin Sukhija, Igor Zhuk, pages 965 – 972. Show abstract Abstract. This paper describes the application of an autonomic paradigm to manage the complexity of software systems such as computational workflows. To demonstrate our approach, the workflow and the services comprising it are treated as managed resources controlled by hierarchically organized autonomic managers. By applying service-oriented software engineering principles, in particular enterprise integration patterns, we have developed a scalable, agile, self-healing environment for execution of dynamic, data-driven workflows which are capable of assuring scientific fidelity despite unavoidable faults and without human intervention.
  • An Analysis of mOSAIC ontology for Cloud Resources annotation
    154 Cloud,Ontology,Semantics,OWL Francesco Moscato, Rocco Aversa, Beniamino Martino, Teodor-Florin Fortis, Victor Munteanu, pages 973 – 980. Show abstract Abstract. The ease of managing and configuring resources and the low cost of setting up and maintaining Cloud services have made Cloud Computing widespread. Several commercial vendors now offer solutions based on Cloud architectures. More and more providers offer new services every month, following their customers' needs. However, it is very hard to find a single provider that offers all the services needed by end users. Furthermore, different vendors propose different architectures for their Cloud systems, and usually these are not compatible. Very few efforts have been made to propose a unified standard for Cloud Computing. This is a problem, since different Cloud systems and vendors have different ways to describe and invoke their services, to specify requirements and to communicate. Hence, a way to provide common access to Cloud services and to discover and use required services in Cloud federations is appealing. The mOSAIC project addresses these problems by defining a common ontology, and it aims at developing an open-source platform that enables applications to negotiate Cloud services as requested by users. The main difficulty in defining the mOSAIC ontology lies in the heterogeneity of terms used by Cloud vendors, and in the number of standards which refer to Cloud systems with different terminology. In this work the mOSAIC Cloud Ontology is described. It has been built by analysing Cloud standards and proposals, and has then been refined by introducing individuals from real Cloud systems.
  • Multi-Agent Architecture for Solving Nonlinear Equations Systems in Semantic Services Environment
    214 multi-agent architecture,semantic services,ness,nonlinear equations systems Victor Ion Munteanu, Cristina Mindruta, Viorel Negru, Calin Sandru, pages 981 – 984. Show abstract Abstract. Service-oriented architectures allow access to already implemented methods for solving complex mathematical problems. Semantic descriptions of services can provide support for intelligent systems so that they are able to select the right method for a given problem. Agents can take advantage of semantics, and they provide flexibility when solving a problem. In this context, a multi-agent architecture for solving nonlinear equation systems and a semantic services ontology are proposed.
  • Cloud-based Assistive Technology Services
    208 Cloud computing,Service provisioning,Assistive technology,Accessibility Ane Murua, Igor González, Elena Gómez-Martínez, pages 985 – 989. Show abstract Abstract. Cloud computing will play a large part in the ICT domain over the next 10 years or more. Many long-term aspects are still in an experimental stage, where the long-term impact on provisioning and usage is as yet unknown. While first attempts in this field focused on service provisioning for enterprises, the Cloud is nowadays reaching individuals. Our proposal is to go a step further and, based on the proven benefits of the Cloud, improve Internet and technology access for those who are always forgotten when any technological advance takes place. This paper presents Cloud-based Assistive Technology Service delivery to individuals who face technology accessibility barriers due to ageing or disabilities. An example of how an Assistive Service is delivered to an individual in an easy and seamless way is given as a demonstration of what the future should look like.
  • Semantic P2P Search engine
    237 Semantic Web,Semantic Peer-to-Peer,Multi-agent systems Ilya Rudomilov, Ivan Jelinek, pages 991 – 995. Show abstract Abstract. This paper discusses the possibility of using the Peer-to-Peer (P2P) scenario for information retrieval (IR) systems, for higher performance and better reliability than the classical client-server approach. Our research emphasis has been placed on designing an intelligent semantic Peer-to-Peer search engine as a multi-agent system (MAS). The main idea of the proposed project is to use a semantic model for the P2P overlay network, where peers are specified as semantic meta-models in the standardized OWL language from the World Wide Web Consortium. Using a semantic model improves the quality of communication between intelligent peers in this P2P network. Undoubtedly, the proposed semantic P2P network has all the advantages of normal P2P networks and, in the first place, helps address the bottleneck effect (a typical problem of client-server applications) by using a set of peers for storing and processing data.
  • Hybrid Immune-inspired Method for Selecting the Optimal or a Near-Optimal Service Composition
    74 semantic Web service composition,selecting the optimal service composition solution,immune-inspired selection,reinforcement learning,genetic operators Ioan Salomie, Monica Vlad, Viorica Rozina Chifu, Cristina Bianca Pop, pages 997 – 1003. Show abstract Abstract. The increasing interest in developing optimization techniques that provide the optimal or a near-optimal solution of a problem in an efficient way has led researchers to turn their attention towards biology. It has been noticed that biology offers many clues regarding the design of such optimization techniques, since biological systems exhibit self-optimization and self-organization capabilities in a decentralized way, without a central coordinator. In this context we propose a bio-inspired hybrid method that selects the optimal or a near-optimal solution in semantic Web service composition. The proposed method combines principles from immune-inspired, evolutionary, and neural computing to optimize the selection process in terms of execution time and explored search space. We model the search space as an Enhanced Planning Graph structure which encodes all the possible composition solutions for a given user request. To establish whether a solution is optimal, the QoS attributes of the services involved in the composition, as well as the semantic similarity between them, are considered as evaluation criteria. To evaluate the proposed selection method we have implemented an experimental prototype and carried out experiments on a set of scenarios from the holiday planning domain.
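    The immune-inspired core (clone the fittest candidate compositions, then mutate the clones) can be sketched generically as below. The solution encoding, one candidate-service index per workflow task, the toy QoS objective, and all parameters are our assumptions; the paper's hybrid additionally mixes in evolutionary and neural elements not shown here.

        import random

        def mutate(solution, rate, n_candidates=3):
            """Flip each gene (chosen service index) with probability `rate`."""
            return tuple(random.randrange(n_candidates) if random.random() < rate
                         else g for g in solution)

        def clonal_selection(population, fitness, generations=50, clones=5, rate=0.2):
            """Keep the best antibodies, clone them, mutate the clones."""
            pop = list(population)
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                elite = pop[: max(1, len(pop) // 5)]
                pop = elite + [mutate(e, rate) for e in elite for _ in range(clones)]
            return max(pop, key=fitness)

        # Hypothetical: 4 workflow tasks, 3 candidate services per task; the toy
        # objective peaks when every task picks service 1.
        qos = lambda s: -sum((g - 1) ** 2 for g in s)
        start = [tuple(random.randrange(3) for _ in range(4)) for _ in range(20)]
        print(clonal_selection(start, qos))   # tends toward (1, 1, 1, 1)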
  • Dynamic Consolidation Methodology for Optimizing the Energy Consumption in Large Virtualized Service Centers
    63 service centers,dynamic server consolidation,reinforcement learning,virtualization,hierarchical clusters Cioara Tudor, Ionut Anghel, Ioan Salomie, Daniel Moldovan, Georgiana Copil, Pierluigi Plebani, pages 1005 – 1011. Show abstract Abstract. In this paper we approach the high energy consumption problem of large virtualized service centers by proposing a dynamic server consolidation methodology for optimizing the usage of the service center's IT computing resources. The consolidation methodology is based on logically structuring the service center servers into hierarchical clusters, with consolidation decisions taken in each cluster using a reinforcement learning based algorithm. The methodology defines two ways of propagating consolidation decisions across the hierarchy: bottom-up propagation for the dynamic power management actions and top-down propagation for the consolidation actions. The consolidation decision time complexity analysis shows that using the methodology in large service centers improves the decision time by a factor proportional to the ratio between the service center's total number of servers and the number of servers in the logical clusters.
TeXnical Editor: Aleksander Denisiuk
E-mail:
Phone/fax: +48-55-2393802