Polish Information Processing Society

Federated Conference on Computer Science and Information Systems

September 9–12, 2012. Wrocław, Poland

Proceedings

ISBN 978-83-60810-51-4
IEEE Catalog Number CFP1285N-USB

Complete FedCSIS Proceedings (PDF, 99.620 MB)

Preface

7th International Symposium Advances in Artificial Intelligence and Applications

  • Case Studies on the Clinical Applications using Case-Based Reasoning
    Mobyen Uddin Ahmed, Shahina Begum, Peter Funk, pages 3–10. Abstract. Case-Based Reasoning (CBR) is a promising Artificial Intelligence (AI) method applied to problem-solving tasks. The approach is widely used to develop Clinical Decision Support Systems (CDSS). A CDSS for diagnosis and treatment often plays a vital role and brings essential benefits for clinicians. Such a CDSS could function as an expert for a less experienced clinician, or provide a second opinion to support an experienced clinician's decision making. This paper presents case studies of three clinical decision support systems as an overview of CBR research and development. Two medical domains are covered by the case studies: 1) CDSS for stress diagnosis, 2) CDSS for stress treatment, and 3) CDSS for post-operative pain treatment. The observations show the current developments, future directions, and pros and cons of the CBR approach. Moreover, the paper shares the experience of developing three CDSSs in the medical domain in the form of case studies.
  • Dependency Tree Matching with Extended Tree Edit Distance with Subtrees for Textual Entailment
    Maytham Alabbas, Allan Ramsay, pages 11–18. Keywords: tree edit distance, dependency tree matching, entailment. Abstract. Many natural language processing (NLP) applications require the computation of similarities between pairs of syntactic or semantic trees. Tree Edit Distance (TED) is, in this context, considered one of the most effective techniques. However, its main drawback is that it deals with single-node operations only. We therefore extended TED to deal with subtree transformation operations as well as single nodes. This makes the extended TED more effective and flexible than the standard one, especially for applications that pay attention to relations among nodes (e.g. in linguistic trees, deleting a modifier subtree should be cheaper than the sum of deleting its components individually). The preliminary results of the extended TED were encouraging compared with the standard one when tested on different examples of dependency trees.
  • Combining black-box taggers and parsers for Modern Standard Arabic
    Maytham Alabbas, Allan Ramsay, pages 19–26. Keywords: Arabic processing, parsing, tagging. Abstract. A number of trainable dependency parsers have been presented in the literature. These parsers require tagged input: this may potentially cause a problem, because taggers are not in general 100% accurate, and any errors in tagging are likely to lead to errors in the output of the parsers. The current paper investigates the relationship between tagging errors and parsing errors. The investigation is carried out on Arabic text, using specific taggers and parsers, but the lessons that can be learned are applicable to other languages and other tools of the same kind.
  • CCA in search of dimensionality of data from a normal and damaged gearbox
    Anna Bartkowiak, Radoslaw Zimroz, pages 27–34. Keywords: power spectra, dimension reduction, visualization of multivariate data, non-linear mapping, distance matrix, DxDy-plot. Abstract. Our aim is to explore some gearbox vibration data characterized by 15 power spectra amplitudes. The data were gathered for two gearboxes: one in a good and the other in a bad state.
    In turn, each of the sets was gathered when the machine was operated under small/no load and under full load. This gives four data sets to compare. We are concerned with two topics:
    1. Is it possible to compare in a simple way the structure of the four obtained subsets?
    2. Could the number of variables be reduced to carry out further analysis of the data? To answer both questions, we use a visual tool, the CCA (Curvilinear Component Analysis) method, proposed by Demartines and Herault. Using this tool, we are able to answer both questions positively.
  • Mental State Monitoring System for the Professional Drivers Based on Heart Rate Variability Analysis
    Shahina Begum, Mobyen Uddin Ahmed, Peter Funk, Reno Filla, pages 35–42. Abstract. The consequences of tiredness, drowsiness, stress and lack of concentration, caused by a variety of factors such as illness, sleep deprivation, drugs and alcohol, are a serious problem in traffic and when operating industrial equipment. A system that recognizes the state of the driver and, e.g., suggests breaks when the stress level is too high or the driver is too tired would enable large savings and reduce accidents. The aim of the project is therefore to develop an intelligent system that can monitor drivers' stress based on psychological and behavioural conditions using Heart Rate Variability (HRV). Here, we propose a solution using Case-Based Reasoning (CBR) to diagnose an individual driver's level of stress. The system also considers feedback from the drivers on how well the test was performed. The validation of the approach is based on close collaboration with experts, and measurements from 18 drivers from Volvo Construction Equipment (Volvo CE) are used as reference.
  • Variable Precision Fuzzy Rough Set Based on Relative Cardinality
    Tuan-Fang Fan, Churn-Jung Liau, Duen-Ren Liu, pages 43–47. Keywords: fuzzy set, rough set, variable precision rough set, fuzzy cardinality. Abstract. The fuzzy rough set approach (FRSA) is a theoretical framework for the data analysis of possibilistic information systems. While a set of comprehensive rules can be induced from a possibilistic information system by using FRSA, the generation of several intuitively justified rules is sometimes blocked by objects that only partially satisfy the antecedents of the rules. In this paper, we use the variable precision models of FRSA to cope with the problem. The models admit rules that are not satisfied by all objects; it is only required that the proportion of objects satisfying a rule be above a threshold called a precision level. In the presented models, the proportion of objects is represented as a relative cardinality of a fuzzy set with respect to another fuzzy set. We investigate three types of models based on different definitions of fuzzy cardinality, including $\Sigma$-counts, possibilistic cardinalities, and probabilistic cardinalities; the precision levels corresponding to the three types of models are respectively scalars, fuzzy numbers, and random variables. (A small illustrative sketch of the relative $\Sigma$-count appears after this list.)
  • An approximation method for Type Reduction of an Interval Type-2 Fuzzy Set based on α-cuts
    Juan Carlos Figueroa-García, pages 49–54. Keywords: Interval Type-2 fuzzy sets, fuzzy optimization. Abstract. This paper presents a proposal for type reduction of an Interval Type-2 fuzzy set constructed from α-cuts of its primary membership functions. The available type-reduction methods for Interval Type-2 fuzzy sets are based on a homogeneous subdivision of the universe of discourse, so we propose an approximation algorithm for type reduction of an Interval Type-2 fuzzy set through its primary α-cuts. Some definitions of the α-cut of a Type-2 fuzzy set are provided and used for computing the centroid of an Interval Type-2 fuzzy set through a mapping of its membership function instead of its universe of discourse.
  • Eye Color Classification for Makeup Improvement
    Camelia Florea, Mihaela Moldovan, Mihaela Gordan, Aurel Vlaicu, Radu Orghidan, pages 55–62. Keywords: eye color, eye classification, support vector machine. Abstract. The development of computer-aided solutions able to suggest the right facial makeup is a recent trend in image analysis applications, from which both amateurs and professionals could benefit significantly. The global harmony of a person is highly valuable when choosing makeup colors to make a person look lovely. The global harmony is evaluated taking into account the color of the hair, skin and eyes, and among these features, the eyes seem to be one of the most salient features that capture an individual's attention. This paper proposes a simple yet effective eye color classification scheme, compliant with the categories used by cosmetic software, which are often different from the classification systems used in medicine or biometrics. The color descriptors are histograms of the iris color distribution in the HSV color space, classified by multi-class Support Vector Machines, and the high accuracies achieved recommend the scheme for digital cosmetic assistant solutions.
  • Towards Automatically Categorizing Mathematical Knowledge
    Adam Grabowski, Christoph Schwarzweller, pages 63–68. Keywords: mathematical knowledge repositories, categorization of knowledge, math assistants. Abstract. Clearly, there is no definitive standard for categorizing the information contained in mathematical papers. Even though the AMS Mathematics Subject Classification has been important to mathematicians for years, nowadays we can observe the growing popularity of other schemes, e.g. arXiv categories. On the other hand, in the era of digital information storage one can expect the classification process to be more or less automatic. Furthermore, generic categorization can be done inside the search engine. At a different level, the distinction between such classical tagging items as lemma, proposition, theorem, etc. had the aim of showing the importance of proven facts. Here automation is much harder, or, to be more precise, the results obtained can be far from the original tagging given by the author. In the paper we point out some problems and thoughts concerning the categorization of mathematical knowledge, illustrating some of them with examples taken from the Mizar Mathematical Library, a large machine-checked repository of mathematical facts.
  • Automatic Noise Recognition Based on Neural Network Using LPC and MFCC Feature Parameters
    Reza Haghmaram, Ali Aroudi, Mohammad Hossein Ghezel Aiagh, Hadi Veisi, pages 69–73. Keywords: noise classification, neural network, linear predictive coefficient, Mel-frequency cepstral coefficient. Abstract. This paper studies the automatic noise recognition problem using RBF and MLP neural network classifiers with linear predictive and Mel-frequency cepstral coefficients (LPC and MFCC). We first briefly review the architecture of each network as an automatic noise recognition (ANR) approach, then compare them to each other and investigate the factors and criteria that influence final recognition performance. The proposed networks are evaluated on 15 stationary and non-stationary types of noise with a frame length of 20 ms in terms of correct classification rate. The results demonstrate that the MLP network using LPCs is a precise ANR with an accuracy rate of 99.9%, while the RBF network with MFCC coefficients follows with 99.0% accuracy.
  • The importance of handling multivariate attributes in the identification of heart valve diseases using heart signals
    Ahmed Hamdy, Aboul Ella Hassanien, pages 75–79. Keywords: multivariate attributes, heart valve diseases, heart signals, feature extraction. Abstract. Automated detection of heart valve disease from heart sounds is in great demand because it is inexpensive and non-invasive. Extensive research has recently been conducted on applying different classification and feature selection techniques. Heart sound data sets represent real-life data that contain continuous attributes and a large number of features, and can hardly be classified by most classification techniques. Data mining techniques, including feature evaluation and classification techniques, that ignore the important characteristics of heart sound data sets may not be applicable in this case. In this context, the present paper initially surveys the research that has been conducted on exploiting heart sound signals for automated detection of heart conditions. Then a comparative study is carried out to determine the most effective data mining techniques capable of detecting heart valve disease with high accuracy. The results show that the techniques capable of handling multivariate data sets of a continuous nature achieve the highest classification accuracy.
  • Revising Structured Knowledge Bases
    Michał Korpusik, Witold Łukaszewicz, Ewa Madalińska-Bugaj, pages 81–87. Keywords: knowledge representation, belief revision. Abstract. In this paper we present a new approach to belief revision. In contrast to traditional formalizations of this problem, where all pieces of information included in a knowledge base have identical status, we explicitly distinguish between observations, i.e., facts which an agent observes or is being told, and facts representing general, sometimes defeasible, knowledge about the considered world. This distinction allows us to deal with scenarios that cannot be properly modelled using existing belief revision operators.
  • Experiments on distance measures for combining one-class classifiers
    Bartosz Krawczyk, Michal Wozniak, pages 89–92. Keywords: one-class classification, multiple classifier systems, distance measures, one-class support vector machine, combined classifiers. Abstract. The paper investigates the influence of different types of distance measures on the performance of a multiple classifier system consisting of one-class classifiers. This specific type of machine learning approach uses examples from only a single class to derive a decision boundary; hence it is often referred to as learning in the absence of counterexamples. Combining several one-class classifiers is a promising research direction, as it often results in a more precise classification than using just a single model. Most one-class classifiers base their decision on the distance from an object to the decision boundary, canonically expressed in the Euclidean measure. When combining such predictors it is necessary to map the distance into a probability, so the measure used has a crucial impact on the classifier fusion. This paper proposes alternative distance measures for one-class classification, which are evaluated through experimental investigations. (A small sketch of mapping boundary distances to fused supports appears after this list.)
  • Artificial Reasoning in Relative Dilemmatic Logic
    Juliusz Kulikowski, pages 93–100. Keywords: logical reasoning, relative logic, dilemmatic logic, Hasse diagrams. Abstract. In this paper the principles of relative dilemmatic logic, a modification of conventional relative logic, are formulated, and methods of logical reasoning based on them are presented and illustrated by examples. It is shown that dilemmatic logic makes it possible not only to evaluate the relative logical values of statements without using any numerical parameters, but also to eliminate from the logical inference process premises and inductions whose relative value is lower than that of their negations. A graphical representation of logical inference processes by bi-partite and tri-partite graphs is proposed, and the role of graph theory methods in solving logical inference tasks based on relative dilemmatic logic is indicated.
  • Improving Efficiency in Constraint Logic Programming Through Constraint Modeling with Rules and Hypergraphs
    Antoni Ligęza, pages 101–107. Keywords: Constraint Logic Programming, Constraint Satisfaction Problem, constraint propagation. Abstract. Constraint Satisfaction Problems typically exhibit strong combinatorial explosion. In this paper we analyze a possibility of improving efficiency in Constraint Logic Programming. A hypergraph model of constraints is presented, and an outline of a strategy planning approach focused on the efficient use of propagation rules is put forward. Two example cryptoarithmetic problems are explored in order to explain the proposed approach.
  • Knowledge patterns in RDF graph language for English sentences
    Rostislav Miarka, Martin Zacek, pages 109–115. Keywords: ontology, knowledge patterns, RDF graph language. Abstract. Each language has its word order, which determines the way of ordering words in a sentence. This paper presents knowledge patterns for sentences in the English language. Knowledge patterns are general patterns of knowledge which can be used in any knowledge base or ontology. When using them, the general symbols from the pattern are renamed to specific symbols from the modeled domain. Knowledge patterns are represented in the RDF graph language. The paper contains examples of the use of knowledge patterns.
  • Estimation of Eye Condition using Waveform Shapes of Pupil Light Responses to Chromatic Stimuli
    Minoru Nakayama, Wioletta Nowak, Hitoshi Ishikawa, Ken Asakawa, Yoshiaki Ichibe, pages 117–122. Keywords: pupil light reflex, age-related macular degeneration, waveform shape, Fourier descriptors, random forests. Abstract. This paper examines the possibility of detecting two conditions which cause vision to deteriorate, Age-Related Macular Degeneration (AMD) and the effects of aging on eyes, using the features of PLR waveforms. These features were extracted using Fourier descriptors of PLR waveform shapes, weighted amplitudes of the waveforms, and a balanced combination of these two in the form of a weighted value. The Random Forest method was used for classification analysis to detect three types of PLR: in healthy eyes, in AMD-affected eyes, and in age-affected eyes. The optimized weight values were evaluated using the classification error rate. The results show that the error rates for healthy PLRs and AMD PLRs were low, but the error rates for PLRs of age-affected eyes stayed at a high level. Additionally, dissimilarities between the PLRs for blue light and red light at low intensities contributed to the performance of the classification technique.
  • Camera calibration with two or three vanishing points
    Radu Orghidan, Joaquim Salvi, Mihaela Gordan, Bogdan Orza, pages 123–130. Keywords: vanishing points, camera calibration, 3D reconstruction, depth measurement. Abstract. The perspective projection models the way a 3D scene is transformed into a 2D image, usually through a camera or an eye. In a projective transformation, parallel lines intersect in a point called the vanishing point. This paper presents in detail two calibration methods that exploit the properties of vanishing points. The aim of the paper is to offer a practical tool for choosing the appropriate calibration method depending on the application and on the initial conditions. The methods, using two and three vanishing points respectively, are presented in detail and compared. First, the two models are analyzed using synthetic data. Then, each method is tested in a real configuration, and the results show the quality of the calibration.
  • Comparison of a memetic algorithm and a tabu search algorithm for the Traveling Salesman Problem
    Eneko Osaba, Fernando Díaz, pages 131–136. Keywords: evolutionary computing, memetic algorithm, tabu search, TSP. Abstract. The traveling salesman problem, or TSP, is one of the most famous and well studied problems in combinatorial optimization. There are many studies aiming to find an optimal solution for this problem. These studies have not been fully successful, since the TSP is an NP-hard problem, meaning that no known method ensures an optimal solution for all instances in reasonable time. In this paper we present two techniques to solve this problem, a tabu search based algorithm and a memetic algorithm. The results of both techniques are shown and compared to decide which of the two alternatives gets better results. Apart from this, several studies are performed to determine certain aspects of the algorithms, such as the crossover function for the memetic algorithm or the size of the tabu list.
  • The Fuzzy Genetic System for Multiobjective Optimization
    Krzysztof Pytel, pages 137–140. Keywords: fuzzy logic, genetic algorithm, hybrid system. Abstract. The article presents the idea of a hybrid system for multiobjective optimization. The system consists of a genetic algorithm and a fuzzy logic driver. The genetic algorithm carries out the process of multiobjective optimization. The fuzzy logic driver uses data aggregated by the genetic algorithm and controls the process of evolution by modifying the probability of selecting individuals for the parental pool. Controlling the evolution process makes it possible to choose the preferred area containing Pareto-optimal solutions. In experiments we investigated the ability of the proposed system to search for solutions in a given area of the search space. We compared the results of the elementary algorithm and the proposed system. The experiments showed that the proposed system is able to steer the process of evolution toward Pareto-optimal solutions in the given area of the search space.
  • Ontology Based Integration and Decision Support in the Insigma Route Planning Subsystem
    Piotr Szwed, Piotr Kadłuczka, Wojciech Chmiel, Andrzej Glowacz, Joanna Sliwa, pages 141–148. Keywords: ontology, dynamic route planning, personalization. Abstract. The route planning subsystem is an important component of the Intelligent System for Global Monitoring Detection and Identification of Threats (INSIGMA). Its goal is to calculate an optimal route taking into consideration contextual information and the values of dynamically updated parameters describing the current traffic.
    In developing the system we have taken an approach consisting in providing a set of simpler route planning algorithms that can be used in various situations, instead of a single all-purpose procedure.
    A key issue encountered during system development was the correct choice and configuration of the algorithm to be used. The selection depends on such factors as user profile and preferences, dynamically collected traffic data, and historical records. In the developed system the knowledge about these factors, their relations and rules is gathered in ontologies. The paper presents the system architecture and an execution scenario in which the decision on the selection and configuration of one of the several implemented route planning algorithms is based on semantic information and built-in rules.
  • A Real-Time License Plate Localization Method Based on Vertical Edge Analysis
    Peter Tarabek, pages 149–154. Keywords: license plate localization, license plate recognition, intelligent transportation systems, edge detection, pattern recognition, image processing, artificial intelligence. Abstract. License plate localization is the most important part of a license plate recognition system. The ability to correctly detect the license plate under different conditions directly affects the overall recognition accuracy. In this paper a real-time license plate localization method is proposed. First, vertical edges are detected in the image and binarized. Then, license plate candidates are extracted by a two-stage detection process. In this process a sliding-window technique is used to mark all windows which satisfy edge density conditions. The edge density conditions are computed on an integral edge image, allowing us to significantly increase the processing speed of the method (see the integral-image sketch after this list). To better distinguish between license plates and complex backgrounds, edge analysis is performed to remove specific edges. Finally, false candidates are filtered out based on geometrical and textural properties. The proposed method can detect multiple license plates of different sizes in a complex background. The experimental results confirm its robustness and ability to localize license plates in real time. On a database of 501 images our method correctly localizes 97.4% of license plates.
  • Adding rules into database systems
    Zdenka Telnarová, pages 155–159. Keywords: deductive rules, active rules, data derivation, integrity constraints. Abstract. There are several types of rules playing an important role in database systems. As opposed to simple database systems, where the only reasoning service is query answering, more advanced database systems offer a number of advanced reasoning services, such as deductive query answering based on deductive rules and active input processing based on active rules. In this article we discuss the concepts of active and deductive rules and describe a simple model that uses both of them.
  • Laser trail shape identification technique for robot navigation based on genetic programming
    Takeshi Tsujimura, Hiroki Fukushima, Toshihiro Minato, Kiyotaka Izumi, pages 161–166. Keywords: genetic programming, shape identification, robot navigation, laser pointer, pattern recognition, image processing, optical flow. Abstract. This paper proposes a meta-heuristic image processing application for mobile robot navigation. It classifies figures that are drawn on a wall by hand with a laser pointer. An image processing technique extracts the optical flow of the laser beam trail, which represents vectors along the edges of shapes. Genetic programming learns the geometric characteristics of laser trail shapes and creates a classification algorithm. Three typical figures, a circle, a triangle, and a square, are evaluated and identified with high accuracy. We have investigated the effects of genetic programming parameters on the performance of shape identification. As a result, the proposed system makes it possible to command robots through the easy and intuitive action of drawing a figure with nothing but a laser pointer.
  • Is Visual Similarity Sufficient for Semantic Object Recognition?
    Andrzej Śluzek, Mariusz Paradowski, pages 167–173. Keywords: keypoint correspondence, object similarity, image matching, semi-local constraints. Abstract. The paper discusses experiments on same-class object detection for typical exemplary classes of man-made objects using keypoint matching techniques (two algorithms are used: building clusters of consistently similar and distributed keypoints, and matching individual points represented by a novel description incorporating the semi-local geometry of images). It is shown that while the detection of identically looking objects in random images can be performed reliably, the same is not possible for semantically defined classes of objects (even if we expect a certain level of visual and configurational uniformity within the class). The experiments conducted on the PASCAL2007 dataset provide results at the level of random selection. However, a small percentage of results indicate that for some classes semantics may be significantly correlated with visual and configurational consistency.
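
Referring to the variable precision fuzzy rough set paper above, the following is a minimal sketch, not the authors' code, of the relative $\Sigma$-count cardinality of one fuzzy set with respect to another; the toy membership grades and the min t-norm for the intersection are our own assumptions.

    def sigma_count(memberships):
        # Sigma-count cardinality: the sum of membership grades.
        return sum(memberships)

    def relative_sigma_count(a, b):
        # Relative cardinality of fuzzy set A with respect to B,
        # i.e. sigma-count(A intersect B) / sigma-count(B);
        # a, b are lists of membership grades over the same universe,
        # and min is used as the t-norm for the intersection.
        intersection = [min(x, y) for x, y in zip(a, b)]
        denom = sigma_count(b)
        return sigma_count(intersection) / denom if denom > 0 else 0.0

    # Toy example: B holds degrees to which objects satisfy a rule's
    # antecedent, A the degrees to which they also satisfy its consequent;
    # the rule is accepted when the ratio exceeds a chosen precision level.
    antecedent = [0.9, 0.7, 0.4, 0.0]
    consequent = [0.8, 0.7, 0.1, 0.0]
    print(relative_sigma_count(consequent, antecedent))  # 0.8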
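
For the experiments on combining one-class classifiers above, here is a minimal sketch, under our own assumptions about the scaling constant and acceptance threshold, of mapping a classifier's distance-to-boundary to a probability-like support and fusing several such supports with the mean rule.

    import math

    def distance_to_support(distance, scale=1.0):
        # Heuristic map of a boundary distance to a (0, 1] support;
        # the exponential form and the scale are illustrative choices.
        return math.exp(-distance / scale)

    def fuse_mean(distances, scale=1.0):
        # Mean-rule fusion of the supports of several one-class classifiers.
        supports = [distance_to_support(d, scale) for d in distances]
        return sum(supports) / len(supports)

    # Three one-class classifiers report distances of a test object to their
    # boundaries; accept it as a target-class member above a 0.5 threshold.
    distances = [0.2, 0.5, 1.3]
    print(fuse_mean(distances) > 0.5)  # True for these toy values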
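
The integral-image sketch promised in the license plate entry above: once cumulative sums of a binary edge map are precomputed, the edge density of any sliding window is obtained in O(1) from four corner lookups. The window coordinates and the toy edge map are illustrative, not the paper's data.

    import numpy as np

    def integral_image(binary_edges):
        # 2-D cumulative sum padded with a zero row/column for easy lookup.
        ii = np.zeros((binary_edges.shape[0] + 1, binary_edges.shape[1] + 1))
        ii[1:, 1:] = np.cumsum(np.cumsum(binary_edges, axis=0), axis=1)
        return ii

    def window_edge_density(ii, top, left, height, width):
        # Sum of edge pixels inside the window via four corner values,
        # divided by the window area.
        total = (ii[top + height, left + width] - ii[top, left + width]
                 - ii[top + height, left] + ii[top, left])
        return total / (height * width)

    edges = (np.random.rand(240, 320) > 0.9).astype(np.uint8)  # toy edge map
    ii = integral_image(edges)
    print(window_edge_density(ii, top=100, left=120, height=20, width=80))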

International Workshop on Artificial Intelligence in Medical Applications

  • Detection of Heart Disease using Binary Particle Swarm Optimization
    Mona Nagy Elbedwehy, Hossam Zawbaa, Neveen Ghali, Aboul Ella Hassanien, pages 177–182. Keywords: binary particle swarm optimization, support vector machine, heart signals. Abstract. This article introduces a diagnosis system for heart valve disease using binary particle swarm optimization and a support vector machine, in conjunction with the K-nearest neighbor classifier and leave-one-out cross-validation. The system was applied to a representative heart dataset of 198 heart sound signals, which come both from healthy medical cases and from cases suffering from the four most common heart valve diseases: aortic stenosis (AS), aortic regurgitation (AR), mitral stenosis (MS) and mitral regurgitation (MR). The introduced approach starts with an algorithm based on binary particle swarm optimization to select the most heavily weighted features (see the BPSO sketch after this list). A support vector machine then classifies the heart signals into two outcomes, healthy or having a heart valve disease, and the diseased cases are further classified into the four outcomes: aortic stenosis (AS), aortic regurgitation (AR), mitral stenosis (MS) and mitral regurgitation (MR). The experimental results obtained show that the overall accuracy offered by the employed approach is high compared with other techniques.
  • Graph Partitioning based Automatic Segmentation Approach for CT Scan Liver Images
    Walaa Elmasry, Nashwa El-Bendary, Aboul Ella Hassanien, pages 183–186. Keywords: graph partitioning, normalized cut, image segmentation. Abstract. Manual segmentation of liver computerized tomography (CT) images is very time-consuming, so it is desirable to develop a computer-based approach for the analysis of liver CT images that can precisely segment the liver without any human intervention. This paper presents a normalized cuts graph partitioning approach for liver segmentation from CT images. To evaluate the performance of the presented approach, we present tests on different liver CT images. The experimental results obtained show that the overall accuracy offered by the employed normalized cuts technique is high compared to the well-known K-means segmentation approach.
  • Biomarker Clustering of Colorectal Cancer Data to Complement Clinical Classification
    Chris Roadknight, Uwe Aickelin, Alex Ladas, Daniele Soria, John Scholefield, Lindy Durrant, pages 187–191. Keywords: colon cancer, clustering. Abstract. In this paper, we describe a dataset relating to the cellular and physical conditions of patients who are operated upon to remove colorectal tumours. This data provides a unique insight into immunological status at the point of tumour removal, tumour classification and post-operative survival. Attempts are made to cluster this dataset and important subsets of it in an effort to characterize the data and validate existing standards for tumour classification. It is apparent from optimal clustering that existing tumour classification is largely unrelated to immunological factors within a patient and that there may be scope for re-evaluating treatment options and survival estimates based on a combination of tumour physiology and patient histochemistry.
  • Wireless System for Remote Monitoring of Oxygen Saturation and Heart Rate
    Cristian Rotariu, Vasile Manta, pages 193–196. Keywords: remote monitoring, chronic patients, wireless sensor networks, oxygen saturation, heart rate. Abstract. This paper describes the realization of a wireless oxygen saturation and heart rate system for patient monitoring in a limited area. The proposed system allows automatic remote monitoring, in hospitals, at home, at work, and in real time, of persons with chronic illness, of elderly people, and of those at high medical risk. The system can be used for long-term continuous patient monitoring, as medical assistance for a chronic condition, as part of a diagnostic procedure, or during recovery from an acute event. The blood oxygen saturation level (SpO2) and heart rate (HR) are continuously measured using commercially available pulse oximeters and then transferred to a central monitoring station via a wireless sensor network (WSN). The central monitoring station runs a patient monitor application that receives the SpO2 and HR values from the WSN, processes them, and activates alarms when the results exceed preset limits. A user-friendly graphical user interface was developed for the patient monitor application to display the received measurements from all monitored patients. A prototype of the system has been developed, implemented and tested.
  • Hierarchical Heterogeneous Ant Colony Optimization
    Miroslav Rusin, pages 197–203. Keywords: automated meal plan, ant colony optimization, hierarchy, heterogeneity. Abstract. Ant Colony Optimization (ACO) is used to solve problems with multiple objectives. Various extensions of the traditional approach have been implemented to improve algorithm performance or solution quality. In this paper we propose to apply a novel ACO-based method that employs heterogeneity and hierarchy to the area of automated meal planning. The hierarchy consists of two levels: at the first there are ants working in a fairly traditional way (workers); at the second there is an ant manager. Each worker has its own plan and searches its own unique environment. The second-level ant monitors a group of workers. Experimental results show that this approach is capable of tackling the task with reasonable time and quality.
  • LARDISS—a Tool for Computer Aided Diagnosis of Laryngopathies
    Jarosław Szkoła, Krzysztof Pancerz, Jan Warchoł, Grażyna Olchowik, Maria Klatka, Regina Wojecka-Gieroba, Agata Wróbel, pages 205–211. Keywords: laryngopathy, classification, computer-aided diagnosis. Abstract. In the paper, we present a new computer tool supporting the non-invasive diagnosis of selected larynx diseases. The tool is created for the Java platform. Computer-aided diagnosis of laryngopathies in the presented tool is based on the analysis of a patient's speech signal in the time and frequency domains. A number of classification methods proposed for the diagnosis of laryngopathies are listed and described in this paper.
  • Fuzzy Decision Trees in Medical Decision Making Support System
    Elena Zaitseva, Vitaly Levashenko, pages 213–219. Keywords: fuzzy logic, fuzzy decision trees, cumulative information estimates. Abstract. A decision making support system based on fuzzy logic is considered in this paper for oncology disease diagnosis. The decision making procedure corresponds to the recognition (classification) of a new case by analyzing a set of instances (already solved cases) for which the classes are known. The ontology (solved cases) is defined as fuzzy classification rules that are formed by different fuzzy decision trees. Three types of fuzzy decision trees (non-ordered, ordered and stable) are considered in the paper. The induction of these fuzzy decision trees is based on cumulative information estimates. The proposed approach is applied to a well-known benchmark medical problem with real clinical data for breast cancer diagnosis.
  • Improved Feature Selection for Hematopoietic Cell Transplantation Outcome Prediction using Rank Aggregation
    Chandrima Sarkar, Sarah Cooley, Jaideep Srivastava, pages 221–226. Keywords: rank aggregation, hematopoietic stem cell transplantation, acute myeloid leukemia, genetic data. Abstract. This paper presents a methodology for developing an improved feature selection technique that will help in the accurate prediction of outcomes after hematopoietic stem cell transplantation (HSCT) for patients with acute myelogenous leukaemia (AML). Allogeneic HSCT using related or unrelated donors is the standard treatment for many patients with blood-related malignancies who are unlikely to be cured by chemotherapy alone, but survival is limited by treatment-related mortality and relapse. Various genetic factors, such as tissue type or human leukocyte antigen (HLA) type and immune cell receptors, including the killer-cell immunoglobulin-like receptor (KIR) family, can affect the success or failure of HSCT. In this paper we aim to develop a novel, aggregated-ranking based feature selection technique using HLA and KIR genotype data, which can efficiently assist in donor selection before BMT and confer a significant survival benefit to patients. In our approach we use a rank aggregation based feature selection technique for selecting suitable donor genotype characteristics (see the rank-aggregation sketch after this list). The results obtained are evaluated with classifiers for prediction accuracy. On average, our algorithm improves the prediction accuracy of the results by 3–4% compared to generic analysis without feature selection or with single feature selection algorithms. Most importantly, the selected features completely agree with those obtained using traditional statistical approaches, proving the efficiency and robustness of our technique, which has great potential in the medical domain.
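
The BPSO sketch promised in the first entry of this list: a minimal binary particle swarm optimizer for feature selection, written under our own assumptions. Positions are bit vectors over features, velocities are squashed through a sigmoid into flip probabilities, and the fitness function is a placeholder where a cross-validated SVM or k-NN score would be plugged in; all parameter values are illustrative.

    import math
    import random

    def sigmoid(v):
        return 1.0 / (1.0 + math.exp(-v))

    def bpso(n_features, fitness, n_particles=20, iters=50,
             w=0.7, c1=1.5, c2=1.5):
        # Each particle is a boolean mask over the features.
        pos = [[random.random() < 0.5 for _ in range(n_features)]
               for _ in range(n_particles)]
        vel = [[0.0] * n_features for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_fit = [fitness(p) for p in pos]
        gbest = pbest[max(range(n_particles), key=pbest_fit.__getitem__)][:]
        for _ in range(iters):
            for i in range(n_particles):
                for j in range(n_features):
                    vel[i][j] = (w * vel[i][j]
                                 + c1 * random.random() * (pbest[i][j] - pos[i][j])
                                 + c2 * random.random() * (gbest[j] - pos[i][j]))
                    # Binary PSO update: bit j is set with probability
                    # sigmoid(velocity).
                    pos[i][j] = random.random() < sigmoid(vel[i][j])
                f = fitness(pos[i])
                if f > pbest_fit[i]:
                    pbest[i], pbest_fit[i] = pos[i][:], f
            gbest = pbest[max(range(n_particles), key=pbest_fit.__getitem__)][:]
        return gbest

    # Toy fitness: reward the first three features, penalize mask size;
    # in the paper's setting this would be a classifier's accuracy.
    fitness = lambda mask: sum(mask[:3]) - 0.1 * sum(mask)
    print(bpso(10, fitness))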
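
And the rank-aggregation sketch promised in the HSCT entry: Borda-count aggregation is one simple way to combine the rankings produced by several feature selection criteria into a single consensus ranking. The criteria names and genotype features below are purely illustrative, not taken from the paper.

    def borda_aggregate(rankings):
        # rankings: lists ordering the same features best-first.
        # A feature in position p of an n-item ranking scores n - p;
        # features are returned sorted by total score, best first.
        n = len(rankings[0])
        scores = {}
        for ranking in rankings:
            for position, feature in enumerate(ranking):
                scores[feature] = scores.get(feature, 0) + (n - position)
        return sorted(scores, key=scores.get, reverse=True)

    # Three hypothetical criteria rank four genotype features differently:
    by_chi2      = ["HLA-A", "KIR2DL1", "HLA-B", "KIR3DS1"]
    by_info_gain = ["KIR2DL1", "HLA-A", "KIR3DS1", "HLA-B"]
    by_relief    = ["HLA-A", "KIR3DS1", "KIR2DL1", "HLA-B"]
    print(borda_aggregate([by_chi2, by_info_gain, by_relief]))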

2nd International Workshop on Advances in Semantic Information Retrieval

  • Towards WordBricks—a Virtual Language Lab for Computer-Assisted Language Learning
    Maxim Mozgovoy, pages 229–232. Keywords: natural language processing, intelligent computer-assisted language learning, dependency grammar. Abstract. This paper describes the WordBricks project, an intelligent computer-assisted language learning environment recently initiated at our institution. WordBricks is intended to serve as a “virtual language lab” that supports open experiments with natural language constructions. Being based on dependency grammars, this instrument illustrates the use of modern natural language processing technologies in language learning.
  • Information Retrieval across Information Visualization
    Veslava Osinska, Piotr Bala, Michal Gawarkiewicz, pages 233–239. Keywords: information visualization, classification mapping, concept map, information retrieval. Abstract. This article presents the analytical and retrieval potential of visualization maps. The obtained maps were tested as an information retrieval (IR) interface. A collection of documents derived from the ACM Digital Library was mapped onto a sphere surface. The proposed approach uses the nonlinear similarity of documents, comparing their ascribed thematic categories and thereby developing semantic connections between them. For domain analysis, the newest IT trend, Cloud Computing, was monitored across the period 2007–2009. The visualization reflects the evolution, dynamics and relational fields of cloud technology as well as its paradigmatic properties.
  • Improving Wikipedia Miner Word Sense Disambiguation Algorithm
    Aleksander Pohl, pages 241–248. Keywords: word sense disambiguation, semantic relatedness, Wikipedia, Jaccard coefficient. Abstract. This document describes improvements to the Wikipedia Miner word sense disambiguation algorithm. The original algorithm performs very well in detecting key terms in documents and disambiguating them against Wikipedia articles. By replacing the original Normalized Google Distance inspired measure with a Jaccard coefficient inspired measure (see the sketch after this list) and taking additional features into account, the disambiguation algorithm was improved by 8 percentage points (F1-measure), without impeding its performance or introducing any additional preprocessing overhead.
    This document also presents some statistical data extracted from the Polish Wikipedia by Wikipedia Miner. An automatic evaluation of the performance of the disambiguation algorithm for Polish shows that it is almost as good as for English, even though the Polish Wikipedia has only a quarter of the articles of the English Wikipedia.
  • A Study of Measures for Document Relatedness Evaluation
    Evgeny Pyshkin, Vitaly Klyuev, pages 249–256. Keywords: informational search, document relatedness, semantic similarity, semantic measures, information retrieval. Abstract. Measures and approaches to estimate document relatedness are classified and described.
  • Class-based Approach in Semantic P2P Information Retrieval
    Ilya Rudomilov, Ivan Jelínek, pages 257–261. Keywords: information retrieval, peer-to-peer, P2P, class-based search, semantic clustering. Abstract. The peer-to-peer (P2P) approach in information retrieval systems has drawn significant attention recently. P2P networks provide obvious advantages such as scalability and reliability; therefore, researchers are looking for ways to adapt these techniques to information retrieval over nodes with heterogeneous documents. The greatest attention is paid to different semantic-based searches, such as Gnutella Efficient Search (GES) proposed by Zhu et al., which derives from the Vector Space Model. This paper proposes a conceptual design for unstructured P2P information retrieval (IR) with heterogeneous documents on independent nodes.
  • A Feature Model Configuration for Multimedia Applications by an OWL-based Approach
    Giuseppe Santoro, Carmelo Pino, Concetto Spampinato, pages 263–268. Keywords: feature model, semantic information retrieval, OWL. Abstract. Feature models are used to describe the common and variable properties of families of related software systems, referred to as Software Product Lines (SPL). Every program in an SPL is identified by a unique and legal combination of features called a feature configuration. There is no formal semantics for describing a feature model and no standard tool for building and validating a feature configuration. In this paper we present an OWL-based approach for building and editing feature models, together with an OWL-based inferential engine for creating a feature configuration and checking its consistency. The framework has been developed as a Java web service and can be applied to model a wide range of applications, from media processing to business and financial modeling.
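
The Jaccard sketch promised in the Wikipedia Miner entry above: a minimal illustration, under our own assumptions, of a Jaccard-coefficient relatedness measure between two Wikipedia articles computed from the sets of articles linking to them; the link sets below are made up.

    def jaccard_relatedness(inlinks_a, inlinks_b):
        # |A intersect B| / |A union B| over the sets of linking articles.
        a, b = set(inlinks_a), set(inlinks_b)
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    # Made-up inlink sets for two senses of the term "jaguar":
    inlinks_animal = {"Felidae", "Panthera", "South America", "Predator"}
    inlinks_car    = {"Coventry", "Luxury car", "Tata Motors", "Predator"}
    print(jaccard_relatedness(inlinks_animal, inlinks_car))  # 1/7, about 0.14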

International Workshop on Rough Sets Applications

  • Tests for Decision Tables with Many-Valued Decisions—Comparative Study
    Mohammad Azad, Igor Chikalov, Mikhail Moshkov, Beata Zielosko, pages 271–277. Keywords: decision table with many-valued decisions, test, reduct, greedy algorithm. Abstract. The paper is devoted to the study of a greedy algorithm for the construction of approximate tests (super-reducts). This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions: for a given row, we should find a decision from the set attached to this row. After constructing a test, we use an algorithm which tries to remove attributes from the test and obtain a reduct. We also present some experiments for the greedy algorithm, which constructs a test based on the generalized decision approach. (A greedy-test sketch appears after this list.)
  • Predicting the presence of serious coronary artery disease based on 24 hour Holter ECG monitoring
    Jan G. Bazan, Stanisława Bazan-Socha, Sylwia Buregwa-Czuma, Przemysław Wiktor Pardel, Barbara Sokołowska, pages 279–286. Keywords: prediction, coronary artery stenosis, coronary heart disease, CHD, ECG Holter, revascularization, classification, temporal patterns, temporal concepts, complex objects. Abstract. The purpose of this study was to evaluate the usefulness of classification methods in recognizing cardiovascular pathology. Based on clinical and electrocardiographic (ECG) Holter data, we propose a method for predicting coronary stenosis demanding revascularization in patients diagnosed with stable coronary heart disease. An approach to solving this problem has been found in the context of rough set theory and methods. Rough set theory, introduced by Zdzisław Pawlak during the early 1980s, provides the foundation for the construction of classifiers. From the rough set perspective, the classifiers presented in the paper are based on a decision tree calculated using the local discretization method. We present a new modification of the tree building method which emphasizes the discernibility of objects belonging to decision classes indicated by human experts. The presented method may be used to assess the need for coronary revascularization. The paper includes the results of experiments performed on medical data obtained from the Second Department of Internal Medicine, Collegium Medicum, Jagiellonian University, Kraków, Poland.
  • Weighted lambda precision models in rough set data analysis
    Ivo Düntsch, Günther Gediga, pages 287–294. Keywords: rough sets, PRE learning, Goodman-Kruskal lambda. Abstract. We present a parameter-free and monotonic alternative to the parametric variable precision model of rough set data analysis, based on the well-known PRE index lambda of Goodman and Kruskal. Using weighted (parametric) lambda models, we show how expert knowledge can be integrated without losing the monotonic property of the index. Based on a weighted lambda index, we present a polynomial algorithm to determine an approximately optimal set of predicting attributes. Finally, we exhibit a connection to Bayesian analysis.
  • Utilization of Attribute Clustering Methods for Scalable Computation of Reducts from High-Dimensional Data
    Andrzej Janusz, Dominik Ślęzak, pages 295–302. Keywords: attribute selection, attribute reduction, attribute clustering, high-dimensional data, microarray data, scalable reduct computation methods. Abstract. We investigate methods for attribute clustering and their possible applications to the task of computing decision reducts from information systems. We focus on high-dimensional data sets, for which the problem of selecting attributes that constitute a reduct can be extremely computationally intensive. We apply an attribute clustering method to facilitate the construction of reducts from microarray data. Our experiments confirm that by properly grouping similar, in some sense replaceable, attributes it is possible to significantly decrease computation time, as well as to increase the quality of the resulting reducts (i.e. decrease their average size).
  • SQL-Based Heuristics for Selected KDD Tasks over Large Data Sets
    Marcin Kowalski, Sebastian Stawicki, pages 303–310. Keywords: attribute extraction, attribute reduction, decision trees, SQL-based data mining, triple stores. Abstract. We investigate how to use scripts with automatically generated, fast-performing analytic SQL statements to speed up the KDD-related tasks of attribute extraction, attribute reduction, and decision tree induction. We base our framework on a triple-store data model in order to seamlessly scale the required queries with respect to the number of (original or extracted) attributes involved in the given task's specification. We note that all the above-mentioned tasks can be heuristically handled using the same class of aggregation queries, where the most promising new attributes, attribute subsets, or iteratively chosen tree splits are found by analyzing the diversity of aggregations computed for (subsets of) attributes, grouped by decision. We also outline our plans for the creation of a large-scale framework for evaluating the proposed heuristics against real-world data.
  • Metric Based Attribute Reduction in Decision Tables
    Long Giang Nguyen, pages 311–316. Keywords: rough set theory, decision table, condition attribute, reduct, relational database, relation. Abstract. In any information system, each set of attributes can be associated with a partition of the universe into indiscernibility classes. This paper introduces a new distance function between two sets of attributes based on the Jaccard distance between the corresponding partitions (see the partition-distance sketch after this list). We propose a new method for attribute reduction in decision tables using the proposed metric. The paper shows theoretically and experimentally that this metric-based method is more effective than other methods based on conditional Shannon entropy.
  • On elimination of redundant attributes from decision table
    Long Giang Nguyen, Hung Son Nguyen, pages 317–322. Keywords: relational database, rough set theory, decision table, condition attribute, reduct. Abstract. Most reduct calculation methods in rough set theory are related to the minimal reduct calculation problem, which is NP-hard. This paper investigates the problem of searching for the set of useful attributes, i.e. those that occur in at least one reduct. By complement, this problem is equivalent to searching for the set of redundant attributes, i.e. the attributes that never occur in any reduct of the given decision table. We show that the considered problem is equivalent to a Sperner system problem for relational database systems and prove that it can be solved in polynomial time.
  • Dominance-Based Rough Set Approach for Decision Systems over Ontological Graphs
    Krzysztof Pancerz, pages 323–330. Keywords: decision system, decision rule, ontological graph, DRSA. Abstract. In the paper, we continue research in the field of decision systems over ontological graphs. Some relations defined over attribute values in such systems are partial order relations. Therefore, we try to incorporate the Dominance-Based Rough Set Approach (DRSA) into decision systems over ontological graphs. With this approach, we can define decision rules, similar to those defined in the original DRSA, which give us a new view of the data included in such systems.
  • Decision Bireducts and Approximate Decision Reducts: Comparison of Two Approaches to Attribute Subset Ensemble Construction
    Sebastian Stawicki, Sebastian Widz, pages 331–338. Keywords: attribute subset selection, approximate reducts, bireducts, classifier ensembles, randomized search. Abstract. We discuss the notion of a decision bireduct [D. Ślęzak and A. Janusz, “Ensembles of Bireducts: Towards Robust Classification and Simple Representation,” in Proc. of FGIT 2011, ser. LNCS, vol. 7105, 2011, pp. 64–77], which is an extension of the notion of a decision reduct developed within the theory of rough sets. We show relationships between decision bireducts and some formulations of approximate decision reducts summarized in [D. Ślęzak and S. Widz, “Rough-Set-Inspired Feature Subset Selection, Classifier Construction, and Rule Aggregation,” in Proc. of RSKT 2011, ser. LNCS, vol. 6954, Springer, 2011, pp. 81–88]. We investigate the advantages of decision bireducts and approximate decision reducts within a rough-set-inspired framework for deriving attribute subset ensembles from data, wherein each attribute subset yields a single classifier, basically by generating its corresponding if-then decision rules from the training data. We also show how to use the above-mentioned relationships to build even more efficient rough-set-based ensembles in the future.
  • Sequential Optimization of γ-Decision Rules
    Beata Zielosko, pages 339–346. Keywords: decision rules, dynamic programming, sequential optimization. Abstract. The paper is devoted to the study of an extension of the dynamic programming approach which allows the sequential optimization of approximate decision rules relative to length, coverage and number of misclassifications. The presented algorithm constructs a directed acyclic graph $\Delta_{\gamma}(T)$ whose nodes are subtables of the decision table $T$. Based on the graph $\Delta_{\gamma}(T)$, we can describe all irredundant $\gamma$-decision rules with minimum length, then among these rules describe all rules with maximum coverage, and among those describe all rules with the minimum number of misclassifications. We can also change the set of cost functions and the order of optimization. Sequential optimization can be considered a tool that helps construct simpler rules that experts can understand and interpret.
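
The partition-distance sketch promised in the metric-based attribute reduction entry above: each attribute set induces a partition of the objects into indiscernibility classes, and two partitions can be compared through the sets of object pairs they leave indiscernible. The pair-based Jaccard formula below is our own illustration of the idea, not necessarily the exact metric from the paper.

    from itertools import combinations

    def indiscernibility_pairs(table, attrs):
        # Pairs of object indices equal on every attribute in attrs.
        return {(i, j) for i, j in combinations(range(len(table)), 2)
                if all(table[i][a] == table[j][a] for a in attrs)}

    def jaccard_partition_distance(table, attrs_p, attrs_q):
        # 1 - |P intersect Q| / |P union Q| over indiscernible pairs.
        p = indiscernibility_pairs(table, attrs_p)
        q = indiscernibility_pairs(table, attrs_q)
        union = p | q
        return 1.0 - len(p & q) / len(union) if union else 0.0

    # Toy table: rows are objects, columns are attribute values.
    table = [(0, 0), (0, 0), (0, 1), (1, 1)]
    print(jaccard_partition_distance(table, attrs_p=[0], attrs_q=[0, 1]))  # ~0.667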
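
And the greedy-test sketch promised in the first entry of this list: a minimal, set-cover-style illustration, under our own simplifying assumptions (a consistent table in which every conflicting pair can be discerned), of greedily constructing a test for a decision table with many-valued decisions. Rows whose decision sets are disjoint must be discerned by some selected attribute; we repeatedly pick the attribute that discerns the most remaining pairs.

    from itertools import combinations

    def greedy_test(rows, decisions):
        # rows: attribute-value tuples; decisions: a set of decisions per row.
        n_attrs = len(rows[0])
        # Pairs of rows with disjoint decision sets must be discerned.
        to_discern = {(i, j) for i, j in combinations(range(len(rows)), 2)
                      if not (decisions[i] & decisions[j])}
        test = []
        while to_discern:
            # Pick the attribute that discerns the most remaining pairs.
            best = max(range(n_attrs),
                       key=lambda a: sum(rows[i][a] != rows[j][a]
                                         for i, j in to_discern))
            test.append(best)
            to_discern = {(i, j) for i, j in to_discern
                          if rows[i][best] == rows[j][best]}
        return test

    rows = [(0, 1, 0), (0, 0, 1), (1, 1, 1)]
    decisions = [{1}, {2, 3}, {1, 2}]   # only rows 0 and 1 conflict
    print(greedy_test(rows, decisions))  # e.g. [1]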

Workshop on Computational Optimization

  • Online Algorithm for Battery Utilization in Electric Vehicles
    Ron Adany, Tami Tamir, pages 349–356. Keywords: power allocation, online algorithms, electric vehicles. Abstract. We consider the problem of utilizing a pack of batteries to serve power demands in electric vehicles. When serving a power demand, the power allocation may be split among the batteries in the pack. Due to its internal chemical structure, a battery's life depends on the discharge current used for supplying the requests. Any deviation from the (a-priori known) optimal discharge current is associated with a penalty. Thus, the problem is to serve an online sequence of requests for energy in a way that minimizes the total penalty associated with the service.
    We first formulate the problem as a combinatorial optimization problem. We show that the problem is strongly NP-hard and is hard to approximate within an additive gap of $\Omega(m)$ from the optimum, where $m$ is the number of batteries in the pack. This hardness result is valid even in the offline case, where the sequence of power demands is known in advance. For the online problem, we suggest an algorithm in which the total penalty might be larger than the minimal possible by at most an additive gap of $1.5m$, independent of the initial capacity of the batteries and the number of requests in the sequence. Finally, we provide a lower bound of $1.5$ on the multiplicative competitive ratio of any online algorithm.
    To the best of our knowledge, our work is the first to analyze the problem theoretically.
  • An Improved Algorithm for the Strip Packing Problem
    Hakim Akeb, Mhand Hifi, Dominique Lazure, pages 357–364. Keywords: cutting and packing, beam search, strip packing. Abstract. This paper addresses the strip packing problem (SPP), which consists in packing a set of circular objects into a rectangle of fixed width and unlimited length. The objective is to minimize the length of the rectangle that will contain all the objects such that no object overlaps another. The proposed algorithm uses a look-ahead method combined with beam search and a restarting strategy. The particularity of this algorithm is that it can achieve good results quickly (faster than other known methods and algorithms), even when the number of objects is large. The results obtained on well-known benchmark instances from the literature show that the algorithm improves many of the best known solutions.
  • Multi-population Genetic Algorithm Quality Assessment Implementing Intuitionistic Fuzzy Logic
    Maria Angelova, Krassimir Atanassov, Tania Pencheva, pages 365–370. Keywords: genetic algorithms, purposeful genesis, model parameters, fermentation process, Saccharomyces cerevisiae. Abstract. Intuitionistic fuzzy logic is applied in this investigation to derive intuitionistic fuzzy estimations of S. cerevisiae fed-batch cultivation model parameters obtained using a multi-population genetic algorithm (MpGA). The performance of the examined algorithm has been tested before and after the application of the procedure for purposeful model parameter genesis, for three different values of the generation gap, the genetic algorithm parameter to which convergence time is most sensitive. The results obtained after applying intuitionistic fuzzy logic to assess the algorithm's performance have been compared, and MpGA with GGAP = 0.1 after the application of the purposeful model parameter genesis procedure has been distinguished as the fastest and the most reliable one.
  • Orchestrating CSP and Local Search to Solve a Large Scale Energy Management Problem
    Mirsad Buljubasic, Haris Gavranovic, pages 371–378. Keywords: constraint satisfaction, local search, optimization, scheduling, ROADEF challenge. Abstract. This paper presents a heuristic approach combining constraint satisfaction, local search and a constructive optimization algorithm for a large-scale energy management and maintenance scheduling problem. The methodology shows how to successfully combine and orchestrate different types of algorithms to produce competitive results. The local search for production assignment is a simple yet optimal solution for the relaxed initial problem. We also propose an efficient way to scale the method for huge instances. A large part of the presented work was done to compete in the ROADEF/EURO Challenge 2010, organized jointly by ROADEF, EURO and Électricité de France. The numerical results obtained on the official competition instances testify to the quality of the approach. The method achieves 3 out of 15 possible best results.
  • On suitable orders for discretizing Molecular Distance Geometry Problems related to protein side chains
    Virginia Costa, Antonio Mucherino, Carlile Lavor, Luiz Mariano Carvalho, Nelson Maculan, pages 379–384. Keywords: distance geometry, side chains, discretization, branch-and-prune, discretization orders. Abstract. Proteins are important molecules that are widely studied in biology. Their three-dimensional conformations can give clues about their function, but an optimal methodology for the identification of such conformations has not yet been found. Experiments of Nuclear Magnetic Resonance (NMR) are able to estimate distances between some pairs of atoms forming the protein, and the problem of identifying the possible conformations satisfying the available distance constraints is known in the scientific literature as the Molecular Distance Geometry Problem (MDGP). For some years, some of us have been working on a suitable discretization of the MDGP and on an efficient Branch & Prune (BP) algorithm based on a tree search. In order to perform this discretization, however, some assumptions need to be satisfied. We recently hand-crafted a special order for the protein backbone atoms which allows us to always discretize MDGPs concerning backbones. In this paper, we do the same for the side chains of some amino acids. Our computational experiments show that the inclusion of the side chain information improves the performance of the BP algorithm.
  • Lagrangean decomposition of a lot-sizing problem into facility location and multicommodity flow
    156 Lot Sizing, Lagrangean Decomposition, Multi-commodity Flow, Facility Location Samuel Deleplanque, Alain Quilliot, pages 385 – 392. Show abstract Abstract. This paper describes the way a multi-item, multi-plant Lot-Sizing problem with transfer costs and capacities may be reformulated according to a multi-commodity flow formalism and decomposed, through Lagrangean Relaxation, into a master Facility Location problem and a slave Minimal Cost Multi-commodity Flow problem. This decomposition framework gives rise in a natural way to the design of a relax/project approximate algorithm; the paper ends with numerical experiments.
  • Insertion techniques and constraint propagation for the DARP
    132 DARP, Constraint propagation, insertion techniques Samuel Deleplanque, Alain Quilliot, pages 393 – 400. Show abstract Abstract. This paper deals with the Dial and Ride Problem (DARP), using randomized greedy insertion techniques together with constraint propagation techniques. Though it focuses here on the static version of Dial and Ride, it takes into account the fact that practical DARP has to be handled from a dynamic point of view and even, in some cases, in real-time contexts. The kind of algorithmic solution proposed here therefore aims at making it easier to bridge both points of view. The model is a classical one and considers a performance criterion which is a mix of Quality of Service (QoS) and economic cost. We first propose the general framework of the model and discuss the link with dynamic DARP, next describe the algorithm, and end with numerical experiments.
  • An exact algorithm and a heuristic for scheduling linearly deteriorating jobs with arbitrary precedence constraints and maximum cost criterion
    191 scheduling, single machine, precedence constraints, maximum cost Marek Dębczyński, Stanisław Gawiejnowicz, pages 401 – 405. Show abstract Abstract. We consider the problem of minimizing the maximum cost of a single machine schedule for a set of linearly deteriorating jobs with arbitrary precedence constraints. We propose an exact algorithm and a heuristic for the problem, and report the results of a computational evaluation of both algorithms.
  • ACO for Parameter Settings of E.coli Cultivation Model
    34 E.coli Cultivation, ant colony optimization, Parameter optimization, Hausdorff distance Stefka Fidanova, Olympia Roeva, Maria Ganzha, pages 407 – 414. Show abstract Abstract. E. coli plays an important role in modern biological engineering and industrial microbiology. In this paper an Ant Colony Optimization algorithm for parameter identification of an E. coli fed-batch cultivation process model is proposed. A system of nonlinear ordinary differential equations was used to model the biomass growth and substrate utilization. Parameter optimization was performed using a real experimental data set from an E. coli MC4110 fed-batch cultivation process. The objective function was formulated as a distance between model-predicted and experimental data. Two different distances were used and compared: Least Squares Regression and the Hausdorff Distance. The Hausdorff Distance was used for the first time to solve the considered parameter optimization problem. The results showed that better model accuracy is obtained using the objective function based on the Hausdorff Distance between the modelled and measured data. Although the Hausdorff Distance is more time-consuming than the Least Squares Distance, this metric is more realistic for the problem discussed here.
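    For readers unfamiliar with the metric, a minimal NumPy sketch of the symmetric Hausdorff distance between two point sets (the toy data and names below are ours, not the authors' setup):

        import numpy as np

        def hausdorff_distance(A, B):
            # Pairwise Euclidean distances between rows of A (n x d) and B (m x d).
            D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
            h_ab = D.min(axis=1).max()   # sup over A of distance to nearest point of B
            h_ba = D.min(axis=0).max()   # sup over B of distance to nearest point of A
            return max(h_ab, h_ba)

        # Toy usage: model curve vs. noisy "measured" curve as (t, value) pairs.
        t = np.linspace(0.0, 1.0, 50)
        model = np.column_stack([t, np.exp(t)])
        measured = np.column_stack([t, np.exp(t) + 0.05 * np.sin(20 * t)])
        print(hausdorff_distance(model, measured))

    Unlike a pointwise least-squares residual, this distance penalizes the single worst mismatch between the two curves, which is why it behaves differently as an objective function.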
  • Fair flow optimization with advanced aggregation operators in Wireless Mesh Networks
    147 Fairness; Ordered Weighted Averaging; Wireless Mesh Network; Simulated Annealing Jarosław Hurkała, Tomasz Śliwiński, pages 415 – 421. Show abstract Abstract. The problem of fair resource allocation is of considerable importance in many applications. In this paper advanced aggregation operators based on Ordered Weighted Averaging (OWA) are utilized as a consistent, fairness-preserving approach to modeling various preferences with regard to the distribution of Internet traffic between network participants. A networking model based on Wireless Mesh Networks is considered. The physical medium properties cause strong interference among simultaneously operating node devices, which makes the optimization problem extremely difficult. We show that in this case OWA-based aggregation operators can be utilized just as easily as traditional lexicographic operators.
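    As a sketch of the aggregation itself (weights and flows below are illustrative, not taken from the paper): an OWA operator applies its weights to the sorted outcome vector rather than to particular participants, which is what makes it fairness-preserving.

        import numpy as np

        def owa(values, weights):
            # Weights are applied to values sorted in descending order,
            # so they express attitude (fair vs. efficient), not identity.
            v = np.sort(np.asarray(values, dtype=float))[::-1]
            w = np.asarray(weights, dtype=float)
            return float(np.dot(w / w.sum(), v))

        flows = [3.0, 10.0, 6.0]                # per-participant throughput
        print(owa(flows, [0.0, 0.0, 1.0]))      # pure max-min fairness -> 3.0
        print(owa(flows, [1/3, 1/3, 1/3]))      # plain average -> 6.33
        print(owa(flows, [0.1, 0.3, 0.6]))      # fairness-leaning compromise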
  • An Application of Bicriterion Shortest Paths to Collaborative Filtering
    166 Bicriterion path problem, MinSum-MaxMin optimization, Item-based collaborative filtering, Similarity measure Federico Malucelli, Paolo Cremonesi, Borzou Rostami, pages 423 – 429. Show abstract Abstract. Item-based collaborative filtering is one of the most widely used and successful neighborhood-based collaborative recommendation approaches. The main idea of item-based algorithms is to compute predictions using the similarity between items; in such approaches, two items are similar if several users of the system have rated these items in a similar fashion. Traditional item-based collaborative filtering algorithms suffer from the lack of available ratings. When the rating data is sparse, as happens in practice, many pairs of items have no ratings in common. Thus similarity weights may be computed using only a small number of ratings, and consequently the item-based approach will make predictions using incomplete data, resulting in biased recommendations. In this paper we present a two-phase method to find the similarity between items. In the first phase a similarity matrix is found by using a traditional method. In the second phase we improve the similarity matrix by using a bicriterion path approach, which introduces additional similarity links by combining two or more existing links. The two criteria take into account, on the one hand, the distance between items on a suitable graph (min-sum criterion) and, on the other hand, an estimate of the information reliability (max-min criterion). Experimental results on the Netflix and Movielens datasets showed that our approach is able to boost the accuracy of existing item-based algorithms and to outperform other algorithms.
  • Experimental Analysis of Different Pheromone Structures in Ant Colony Optimization for Robotic Skin Design
    254 Pheromone structure; Constrained Spanning Forest; Ant Colony Optimization; Robotic skin design Cristiano Nattero, Massimo Paolucci, Davide Anghinolfi, Giorgio Cannata, Fulvio Mastrogiovanni, pages 431 – 438. Show abstract Abstract. The optimization of the wire routing in an artificial skin for robots consists in selecting a subset of links between adjacent tactile sensors in order to connect them to a finite set of micro-controllers at minimum cost.
    The problem has been modeled as a minimum cost Constrained Spanning Forest problem with solution-dependent costs on arcs.
    The problem is NP-hard.
    A MIP formulation and an Ant Colony Optimization (ACO) algorithm are given.
    This paper introduces several different alternative pheromone structures, whose effectiveness is evaluated through experimental tests performed on both real and synthetically generated instances.
  • Flow Models for Project Scheduling with Transfer Delays
    128 Scheduling Problem, RCPSP, flow, heuristics Alain Quilliot, Hélène Toussaint, pages 439 – 446. Show abstract Abstract. This paper deals with an extension of the Resource Constrained Project Scheduling Problem (RCPSP) which involves resource transfer delays. A flow model is used in order to formalize this extended RCPSP, which contains the standard RCPSP, and leads us to introduce the Timed Flow Polyhedron and to state several structural results. This framework gives rise to a generic insertion operator, as well as to greedy/local search algorithms. We end with numerical tests.
  • The Homogeneous Non Idling Scheduling Problem
    155 Scheduling, Multi-processor scheduling, Graphs, Matchings Alain Quilliot, Philippe Chretienne, pages 447 – 454. Show abstract Abstract. This paper is about multi-processor scheduling with non-idling constraints, i.e. constraints which forbid interruptions in the use of the processors. We first reformulate the problem using a notion of pyramidal shape, and next apply matching techniques in order to get a min-max feasibility characterization, which allows us to derive a polynomial algorithm for the related existence problem and for the related makespan minimization problem.
  • Firefly Algorithm Tuning of PID Controller for Glucose Concentration Control during E. coli Fed-batch Cultivation Process
    104 meta-heuristics, firefly algorithm, genetic algorithm, E. coli cultivation process, PID controller, parameter tuning Olympia Roeva, Tsonyo Slavov, pages 455 – 462. Show abstract Abstract. In this paper, a novel meta-heuristic algorithm, namely the Firefly Algorithm (FA), is applied to PID controller parameter tuning in a Smith Predictor. The controller is used to control the feed rate and to maintain the glucose concentration at the desired set point for an E. coli fed-batch cultivation process. A system of nonlinear differential equations is used to model the biomass growth and substrate utilization. The FA adjustments are made on the basis of several pre-tests related to the optimization problem considered here. Simulation results indicate that the applied FA is effective and efficient. Good closed-loop system performance is achieved with the considered PID controller tuning procedures. Moreover, the observed results are compared to the ones obtained by applying Genetic Algorithms (GA). The comparison of the two meta-heuristics shows that for the considered nonlinear control system the FA-tuned PID controller performs better than the GA-tuned one.
  • Broyden restricted class of variable metric methods and oblique projections
    303 quasi-Newton methods, Broyden convex class, oblique projections Andrzej Stachurski, pages 463 – 466. Show abstract Abstract. In the paper a new formulation of the Broyden restricted convex class of updates involving oblique projections is introduced. It is a sum of two terms: the first containing a special oblique projection, and the second a standard term ensuring satisfaction of the quasi-Newton condition (it is also an oblique projection multiplied by an appropriate scalar). The applied oblique projection involves a vector defined as a convex linear combination of the difference between consecutive iterates and the image of the previous inverse Hessian approximation on the corresponding difference of derivatives, i.e. gradients. A formula relating the coefficient in the convex combination of vectors in the oblique projection to its counterpart in the standard representation of the Broyden convex class is presented.
    Some preliminary numerical experiments on two twice continuously differentiable, strictly convex functions of increasing dimension are included.
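    For reference, the standard parametrization of the Broyden restricted convex class of inverse-Hessian updates, which the paper's oblique-projection form re-expresses (notation ours: $s_k$ the step, $y_k$ the gradient difference, $H_k$ the inverse Hessian approximation), is

        $H_{k+1} = H_k - \frac{H_k y_k y_k^T H_k}{y_k^T H_k y_k} + \frac{s_k s_k^T}{s_k^T y_k} + \phi_k v_k v_k^T$, with $v_k = (y_k^T H_k y_k)^{1/2} \left( \frac{s_k}{s_k^T y_k} - \frac{H_k y_k}{y_k^T H_k y_k} \right)$,

    where $\phi_k \in [0, 1]$; $\phi_k = 0$ gives the DFP update and $\phi_k = 1$ gives BFGS. Since $v_k^T y_k = 0$, every member of the family satisfies the quasi-Newton condition $H_{k+1} y_k = s_k$.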
  • Novel Presolving Techniques For The Connected Facility Location Problem
    13 connected facility location; presolving; network design; Steiner tree; Alessandro Tomazic, pages 467 – 472. Show abstract Abstract. We consider the connected facility location problem (ConFLP), a useful model in telecommunication network design. First we introduce the extended connected facility location problem, which generalizes the ConFLP by allowing pre-opened and pre-fixed facilities. This new concept is advantageous for applying complex sequences of reduction tests. By such an analysis of the solution space we anticipate solution dependencies in favor of subsequent optimization methods. Besides transferring existing techniques designed for the facility location problem, the Steiner tree problem and the group Steiner tree problem, specific new reduction methods are introduced. The presented concepts, based on graph theoretic formulations, are also of theoretical interest. Additionally, we propose an efficient self-adaptive presolving strategy based on test dependencies and test impacts. A computational study shows that the number of edges could be reduced by up to 85% and the number of nodes by up to 36% on instances from the literature.

1st Workshop on Well-founded Everyday Ontologies–Design, Implementations & Applications

  • Towards Context-Semantic Knowledge Bases
    388 Knowledge base, ontology, context, modularization, context as a box, OntoClean, rigidity, SIM, DL, DDL, DFOL, CKR Krzysztof Goczyła, Aleksander Waloszek, Wojciech Waloszek, pages 475 – 482. Show abstract Abstract. In this paper we discuss the issue of designing well-founded contextual knowledge bases. Following the basic idea that contextualization is a vital part of conceptualization, we extend the definitions of selected notions of OntoClean, the well-known method for the assessment of taxonomies, towards a contextual approach. This allows us to formulate a set of desirable qualities for context-semantic knowledge bases. In the further part of the paper we show that the SIM method, our proposal for organizing contextual knowledge bases, is in accordance with the introduced desiderata.
  • Towards beef ontology and its application
    287 beef ontology, semantic search, DOLCE Piotr Kulicki, Robert Trypuz, Jerzy Wierzbicki, pages 483 – 488. Show abstract Abstract.
  • Ontological Analyses of Roles
    73 role, kinds of roles, temporal aspect of roles Riichiro Mizoguchi, Kouji Kozaki, Yoshinobu Kitamura, pages 489 – 496. Show abstract Abstract. This paper discusses roles from an ontological point of view. We first propose a most general type covering roles and role-like entities in a new way. Then, we discuss the ongoing property of roles in order to understand the temporal aspects of roles in depth. We identify two kinds of roles: original roles and derived roles. Our new findings include that all original roles are ongoing and that all derived occurrent-dependent roles are either retrospective or prospective. Finally, we propose a temporal model of derived roles.

Computer Aspects of Numerical Algorithms

  • Stochastic algorithm for estimation of the model's unknown parameters via Bayesian inference
    274 Bayesian inference, stochastic reconstruction, MCMC methods Mieczysław Borysiewicz, Anna Wawrzyńczak-Szaban, Piotr Kopka, pages 501 – 508. Show abstract Abstract. We have applied a methodology combining Bayesian inference with Markov chain Monte Carlo (MCMC) algorithms to the problem of atmospheric contaminant source localization. The algorithms' input data are the online-arriving measurements of the concentration of a given substance registered by a sensor network. A fast-running Gaussian plume dispersion model is adopted as the forward model in the Bayesian inference approach to achieve rapid-response event reconstructions and to benchmark the proposed algorithms. We examined different versions of MCMC for their effectiveness in estimating the probabilistic distributions of atmospheric release parameters by scanning a 5-dimensional parameter space. As results we obtained the probability distributions of the source coordinates and dispersion coefficients, which we compared with the values assumed in creating the sensors' synthetic data. Annealing and burn-in procedures were implemented to assure robust and efficient parameter-space scans.
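    The inference loop itself is compact. Below is a minimal random-walk Metropolis sketch under a deliberately simplified ground-level Gaussian plume forward model (three unknowns instead of the paper's five; all constants, priors and data are illustrative assumptions of ours):

        import numpy as np

        rng = np.random.default_rng(0)

        def plume(x_src, y_src, q, sensors, u=2.0):
            # Simplified ground-level Gaussian plume, wind along +x,
            # dispersion widths growing linearly with downwind distance.
            dx = sensors[:, 0] - x_src
            dy = sensors[:, 1] - y_src
            down = dx > 0
            sy = 0.3 * np.where(down, dx, 1.0)
            sz = 0.2 * np.where(down, dx, 1.0)
            c = q / (np.pi * u * sy * sz) * np.exp(-dy**2 / (2 * sy**2))
            return np.where(down, c, 0.0)

        def log_post(theta, sensors, obs, noise=0.1):
            x_src, y_src, q = theta
            if q <= 0:
                return -np.inf                      # flat prior, positive release rate
            r = obs - plume(x_src, y_src, q, sensors)
            return -0.5 * np.sum((r / noise) ** 2)  # Gaussian likelihood

        sensors = rng.uniform([2.0, -1.0], [10.0, 1.0], size=(20, 2))
        obs = plume(3.0, 0.2, 5.0, sensors) + rng.normal(0.0, 0.1, 20)  # synthetic data

        theta, chain = np.array([5.0, 0.0, 1.0]), []
        for _ in range(20000):                      # random-walk Metropolis
            prop = theta + rng.normal(0.0, [0.2, 0.05, 0.2])
            if np.log(rng.uniform()) < log_post(prop, sensors, obs) - log_post(theta, sensors, obs):
                theta = prop
            chain.append(theta.copy())
        print(np.mean(chain[5000:], axis=0))        # posterior mean after burn-in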
  • GPU-accelerated WZ Factorization with the Use of the CUBLAS Library
    371 WZ factorization, GPU, BLAS, CUBLAS Beata Bylina, Jarosław Bylina, pages 509 – 515. Show abstract Abstract. We present a novel implementation of a dense, square, non-structured matrix factorization algorithm, namely the WZ factorization --- with the use of graphics processors (GPUs) and CPUs to gain a high performance at a low cost.
    We rewrite this factorization as operations on blocks of matrices and vectors. We have implemented our block-vector algorithm on GPUs with the use of an appropriate (and ready-to-use) GPU-accelerated mathematical library, namely the CUBLAS library.
    We compared the performance of our algorithm with CPU implementations. In particular, our implementation on an NVIDIA Tesla C2050 GPU outperforms a CPU-based implementation. Our results show that the algorithm scales well with the size of matrices; moreover, the larger the matrix, the better the performance. We also discuss the impact of the size of the matrix and the use of ready-to-use mathematical libraries on the numerical accuracy.
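    To make the factorization concrete, here is a plain sequential (CPU, unblocked) WZ sketch of ours, showing how two pivot rows eliminate a column pair from both ends at once; the paper's contribution is recasting such updates as block operations served by CUBLAS, which this sketch does not attempt.

        import numpy as np

        def wz_factorize(A):
            # A = W @ Z; at step k the pivot rows k and m = n-1-k zero out
            # columns k and m of every interior row via a 2x2 solve.
            n = A.shape[0]
            W, Z = np.eye(n), A.astype(float).copy()
            for k in range(n // 2):
                m = n - 1 - k
                piv = np.array([[Z[k, k], Z[m, k]],
                                [Z[k, m], Z[m, m]]])
                for i in range(k + 1, m):
                    w1, w2 = np.linalg.solve(piv, [Z[i, k], Z[i, m]])
                    Z[i, :] -= w1 * Z[k, :] + w2 * Z[m, :]
                    W[i, k], W[i, m] = w1, w2
            return W, Z

        A = np.random.default_rng(1).random((6, 6)) + 6.0 * np.eye(6)
        W, Z = wz_factorize(A)
        print(np.allclose(W @ Z, A))   # True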
  • Parallel Simultaneous Approach for optimal control of DAE systems
    153 optimal control, DAE systems, Simultaneous Approach Paweł Drąg, Krystyn Styczeń, pages 517 – 523. Show abstract Abstract. In the paper two approaches to parallelization for solving optimal control problems of ODE and index-1 DAE systems are presented and discussed. The DAE optimization problem can be treated by the direct nonlinear programming approach in two ways, known as the Sequential Approach and the Simultaneous Approach. The Simultaneous Approach seems to be more reliable, because it provides initial states in periods and discretized control variables; there is therefore a possibility of efficient use of optimization with multiple shooting, which was developed to handle unstable DAE systems. In the article some parallelization methods for Sequential Quadratic Programming are discussed and compared, both in the Jacobian calculation and in the numerical integration of DAE models. An augmented objective function, based on Mathematical Programming with Complementarity Constraints, is proposed. Illustrative simulations of the Catalyst Mixing Problem were performed in MATLAB, a commonly known programming environment.
  • Parallel Finite Element Solver for Multi-Core Computers
    101 Sparse direct solver, multicore computers, multithreading, finite element method, looking-left factorization Sergiy Fialko, pages 525 – 532. Show abstract Abstract. The sparse direct parallel finite element solver PARFES, based on block $LSL^T$ factoring of the symmetric stiffness matrix, where $S$ is the sign diagonal, is developed for the analysis of problems of structural and solid mechanics on multi-core desktop computers. The subdivision of the sparse matrix into rectangular blocks and the use of procedures from level 3 BLAS lead to high-performance factorization. Comparisons against multi-frontal solvers and PARDISO from the Intel Math Kernel Library on real problems from the SCAD Soft collection demonstrate that PARFES is a good alternative to the multi-frontal method and, unlike PARDISO, allows solving large-scale problems on computers with a limited amount of RAM, owing to virtualization.
  • Multi-GPU Implementation of the Uniformization Method for Solving Markov Models
    377 Markovian models, uniformization method, GPU, multi-GPU, heterogeneous computations, parallel computing, wireless network models Marek Karwacki, Beata Bylina, Jarosław Bylina, pages 533 – 537. Show abstract Abstract. There are computational problems in solving Markovian models, connected with the size of the models (usually huge) and the computation time. A useful method for finding transient probabilities in Markovian models is uniformization. This method is time-consuming, particularly for large matrices.
    In this paper, we propose a parallel implementation (with the use of CUDA) of the uniformization method on a multi-GPU architecture. Our gain is that such an implementation can solve a problem described by a matrix exceeding a single GPU's memory and accelerate computations in comparison to a multithreaded CPU. Computational tests have been carried out for wireless network models. The tests show that with the uniformization method on multiple GPUs a model described by a matrix of size $3.6\times{}10^7$ can be solved.
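    The method itself is easy to state: with a generator matrix $Q$ and uniformization rate $\Lambda \ge \max_i |q_{ii}|$, the transient distribution is a Poisson-weighted sum of DTMC steps. A minimal single-CPU sketch of ours follows (the paper's point, which this does not show, is distributing the large sparse matrix-vector products across GPUs):

        import numpy as np

        def uniformization(Q, pi0, t, tol=1e-10):
            # pi(t) = sum_k Poisson_{Lambda*t}(k) * pi0 @ P^k, with P = I + Q / Lambda
            lam = max(-Q.diagonal())
            P = np.eye(Q.shape[0]) + Q / lam
            v = pi0.copy()                       # running pi0 @ P^k
            weight = np.exp(-lam * t)            # Poisson weight for k = 0
            result, acc, k = weight * v, weight, 0
            while 1.0 - acc > tol:               # stop once the Poisson tail is negligible
                k += 1
                v = v @ P
                weight *= lam * t / k
                acc += weight
                result += weight * v
            return result

        Q = np.array([[-0.4, 0.4],
                      [1.0, -1.0]])              # toy 2-state generator
        print(uniformization(Q, np.array([1.0, 0.0]), t=2.0))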
  • Solving Systems of Polynomial Equations on a GPU
    390 nonlinear systems, parallel computing, GPGPU, de Casteljau subdivision Robert Kłopotek, Joanna Porter-Sobieraj, pages 539 – 544. Show abstract Abstract. This paper explores the opportunities of using a GPGPU to solve systems of polynomial equations. We propose numerical real root-finding based on recursive de Casteljau subdivision over an n-dimensional rectangular domain. Two variants of parallelism—multithreading and multiprocessing—have been investigated. The speed, memory consumption and robustness for different sets of input data have also been examined.
  • Adaptive-Blocking Hierarchical Storage Format for Sparse Matrices
    67 hierarchical storage formats, sparse matrices, adaptive-blocking Daniel Langr, Ivan Šimeček, Pavel Tvrdík, Tomáš Dytrych, Jerry P. Draayer, pages 545 – 551. Show abstract Abstract. Hierarchical storage formats (HSFs) can significantly reduce the space complexity of sparse matrices. They vary in storage schemes that they use for blocks and for block matrices. However, the current HSFs prescribe a fixed storage scheme for all blocks, which is not always space-optimal. We show that, generally, different storage schemes are space-optimal for different blocks. We further propose a new HSF that is based on this approach and compare its space complexity with current HSFs for benchmark matrices arising from different application areas.
  • Parallel Communication-Free Algorithm for Triangular Matrix Inversion on Heterogeneous Platforms
    320 communication free; divide and conquer; heterogeneous platform; parallel algorithm; triangular matrix inversion Ryma Mahfoudhi, Zaher Mahjoub, Wahid Nasri, pages 553 – 560. Show abstract Abstract. We address in this paper the parallelization of a recursive algorithm for triangular matrix inversion (TMI) based on the ‘Divide and Conquer' (D&C) paradigm. A series of different versions of an original sequential algorithm are first presented. A theoretical performance study makes it possible to establish an accurate comparison between the designed algorithms. Afterwards, we develop an optimal parallel communication-free algorithm targeting a heterogeneous environment involving processors of different speeds. For this purpose, we use a non-equitable and incomplete version of the D&C paradigm consisting in recursively decomposing the original TMI problem into two subproblems of unequal sizes, then decomposing only one subproblem, and so on. The theoretical study is validated by a series of experiments on two platforms, namely an 8-core shared memory machine and a distributed memory cluster of 16 nodes. The obtained results illustrate the interest of the contribution.
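    The recursion underlying such D&C algorithms is short enough to show in full; here is a NumPy sketch of ours using an even split (the paper's communication-free variant instead splits into unequal parts matched to processor speeds):

        import numpy as np

        def tri_inv(L):
            # inv([[A, 0], [C, B]]) = [[inv(A), 0], [-inv(B) @ C @ inv(A), inv(B)]].
            # The two recursive calls are independent, hence parallelizable
            # without communication.
            n = L.shape[0]
            if n == 1:
                return np.array([[1.0 / L[0, 0]]])
            m = n // 2
            A, C, B = L[:m, :m], L[m:, :m], L[m:, m:]
            Ai, Bi = tri_inv(A), tri_inv(B)
            out = np.zeros((n, n))
            out[:m, :m] = Ai
            out[m:, m:] = Bi
            out[m:, :m] = -Bi @ C @ Ai
            return out

        L = np.tril(np.random.default_rng(2).random((8, 8))) + 4.0 * np.eye(8)
        print(np.allclose(tri_inv(L) @ L, np.eye(8)))   # True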
  • The Analysis and Comparison of Algorithm in QR Decomposition
    397 QR Decomposition, Elementary operation, Orthogonal transformation Anggha Nugraha, T Basaruddin, pages 561 – 565. Show abstract Abstract. QR decomposition of a matrix is one of the important problems in the field of matrix theory, and there are many applications that use QR decomposition. Because of that, many researchers have studied algorithms for this decomposition. Two of those researchers are Feng Tianxiang and Liu Hongxia, who in their paper proposed a new algorithm for computing the QR decomposition using elementary row operations. This paper gives a review of their paper, an analysis and numerical experiments using their algorithm, a comparison with other existing algorithms, and a suggestion for using another existing, better algorithm that has the same features as theirs. We also compare all of these algorithms for several types of matrices. The results are presented in this paper as well.
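    For comparison with elementary-row-operation schemes, the classical Householder alternative fits in a few lines; a NumPy sketch of ours (not the algorithm of the reviewed paper):

        import numpy as np

        def householder_qr(A):
            # Orthogonal triangularization by reflectors H = I - 2 v v^T.
            m, n = A.shape
            Q, R = np.eye(m), A.astype(float).copy()
            for k in range(min(m - 1, n)):
                x = R[k:, k]
                v = x.copy()
                v[0] += np.copysign(np.linalg.norm(x), x[0])   # avoid cancellation
                v /= np.linalg.norm(v)
                R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])  # apply H on the left
                Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)    # accumulate Q
            return Q, R

        A = np.random.default_rng(3).random((5, 3))
        Q, R = householder_qr(A)
        print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(5)))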
  • On LDPC Codes Corresponding to Infinite Family of Graphs A(k;K)
    359 LDPC codes, Sparse matrices, Data Error, Graphs of large girth Monika Polak, Vasyl Ustimenko, pages 567 – 570. Show abstract Abstract. In this paper we investigate the correcting properties of LDPC error correcting codes obtained from a new infinite family of special extremal graphs. We describe how to construct these codes and compare our results with the codes obtained by Guinand and Lodge, corresponding to the family of graphs D(k; q) and used by the NSA.
  • Parallel GPU-accelerated recursion-based generators of pseudorandom numbers
    380 parallel algorithms, pseudorandom numbers, GPU Przemyslaw Stpiczynski, Dominik Szalkowski, Joanna Potiopa, pages 571 – 578. Show abstract Abstract. The aim of the paper is to show how to design parallel algorithms for linear congruential and lagged Fibonacci pseudorandom number generators.
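    The core idea for the LCG case is the leapfrog split: processor $j$ of $p$ generates every $p$-th element using a $p$-step recurrence. A small sketch of ours with illustrative constants (not necessarily the generators studied in the paper):

        # x_{n+p} = (A x_n + C) mod m, with A = a^p mod m and
        # C = c * (1 + a + ... + a^{p-1}) mod m.
        M, A0, C0 = 2**31 - 1, 16807, 12345     # illustrative LCG constants

        def leapfrog_params(a, c, m, p):
            s, term = 0, 1
            for _ in range(p):                  # geometric sum mod m
                s = (s + term) % m
                term = (term * a) % m
            return pow(a, p, m), (c * s) % m

        def substream(seed, j, p, count):
            # The j-th of p interleaved substreams of the sequential LCG.
            A, C = leapfrog_params(A0, C0, M, p)
            x = seed
            for _ in range(j):                  # advance to element j
                x = (A0 * x + C0) % M
            out = []
            for _ in range(count):
                out.append(x)
                x = (A * x + C) % M
            return out

        # The two substreams interleave back into the sequential sequence.
        print(substream(42, 0, 2, 3), substream(42, 1, 2, 3))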
  • FPGA implementation of the 32-point DFT for a wavelet trigger of cosmic rays experiments
    144 FFT, Radix-2, trigger, FPGA, wavelets Zbigniew Szadkowski, pages 579 – 586. Show abstract Abstract. For the observation of ultra-high-energy cosmic rays (UHECRs) by the detection of their coherent radio emission, an FPGA-based wavelet trigger is being developed. Using radio detection, the electromagnetic part of an air shower in the atmosphere may be studied in detail, thus providing information complementary to that obtained by water Cherenkov detectors, which are predominantly sensitive to the muonic content of an air shower at ground level. For an extensive radio detector array, due to the limited communication data rate, a sophisticated self-trigger is necessary. A wavelet trigger investigating the signal power online is promising; however, its implementation requires some optimizations. The digitized signals are converted from the time to the frequency domain by a standard Altera library FFT procedure, then multiplied by wavelet transforms and finally converted to the time domain again. Altera FFT routines convert ADC data as blocks of $2^N$ samples, and the FFT coefficients are provided in a serial stream in $2^N$ time bins. The estimated signal power strongly depends on the relative positions of the FFT of the data and the wavelet transforms in the frequency domain, so an additional procedure has to calculate the most efficient selection of the sample block to reach a response corresponding to the maximal signal power.
    If a set of FFT coefficients were available in each clock cycle, the signal power could also be estimated in each clock cycle and the additional tuning procedure would not be necessary. The paper describes an implementation of the 32-point FFT algorithm in an Altera FPGA, providing all 32 complex DFT coefficients for the wavelet trigger.
  • A parallel searching algorithm for the insetting procedure in Matlab Parallel Toolbox
    252 insetting procedure; searching algorithm; parallel computing; Dimitris Varsamis, Paris Mastorocostas, Apostolos Papakonstantinou, Nicholas Karampetakis, pages 587 – 593. Show abstract Abstract. In this paper we present the implementation of a parallel searching algorithm which is used for the insetting procedure in cartography. The calculation time of this procedure is very long due to the fact that the datasets in cartography are maps with large and very large resolutions. The purpose of this proposal is to reduce the calculation time on a multicore machine with shared memory. The proposed algorithm and the performance tests are developed in the Matlab Parallel Toolbox.

International Symposium on Frontiers in Network Applications and Network Systems

  • Detectors Generation using Genetic Algorithm for a Negative Selection Inspired Anomaly Network Intrusion Detection System
    116 Genetic Algorithm, Intrusion Detection System Amira Sayed A. Aziz, Mostafa Salama, Aboul Ella Hassanien, Sanaa EL-Ola Hanafi, pages 597 – 602. Show abstract Abstract. This paper presents an approach for detecting network traffic anomalies using detectors generated by a genetic algorithm with the deterministic crowding niching technique. In particular, the suggested approach is inspired by the negative selection mechanism of the immune system, which can detect foreign patterns in the complement (non-self) space. We run a number of experiments on the relatively new NSL-KDD data set, which had never been tested against this algorithm before our work. We run the tests using different values of the involved parameters to find out which values give the best detection rates, so that we can give recommendations for future applications of the algorithm. Also, Formal Concept Analysis is applied to the generated rules to visualize the relations among attributes. The results of the analysis show that the algorithm achieves very good detection rates compared to other machine learning approaches.
  • Telco 2.0 for UC – an example of integration telecommunications service provider's SDP with enterprise UC system
    212 Unified Communications, API, Telco 2.0, SOA Dariusz Bogusz, Jaroslaw Legierski, Andrzej Podziewski, Kamil Litwiniuk, pages 603 – 606. Show abstract Abstract. This paper presents a practical integration of an enterprise Unified Communications system with a telecommunication service provider's Service Delivery Platform. The system, implemented in Orange Labs, enables sending SMS and USSD messages and geo-localizing the user's mobile terminal directly from the user interface of the Unified Communications system.
    This article also discusses further investigations in the area of integrating enterprise UC with a telecommunication service provider's SDP using the Telco 2.0 model, and restrictions on implementing this kind of application.
  • Towards Caching Algorithm Applicable to Mobile Clients
    180 distributed file system; cache; caching policy; LFU-SS; LRFU-SS Pavel Bžoch, Luboš Matějka, Ladislav Pešička, Jiří Šafařík, pages 607 – 614. Show abstract Abstract. The use of mobile devices has grown over the past years. Under the term “mobile devices” we understand cell phones, personal digital assistants (PDAs), smart phones, netbooks, tablets, etc. Mobile devices provide many functions, e.g. accessing the internet and e-mail, playing music and movies, and accessing files from remote storage. A disadvantage of mobile devices is that the quality of the internet connection can vary: it can be very fast when using a 3G mobile network or very slow when using an old GPRS connection. The newest mobile communication technologies are not available everywhere, but users usually want to access their files as quickly as over a wired connection.
    If data are demanded repeatedly, they can be stored on the mobile device in an intermediate component called a cache. The cache capacity is limited, so we should store in the cache only the data that will probably be required in the future. In this paper, we present an innovative caching algorithm based on local and server statistics that are used to predict user behavior.
  • LibSWD—Serial Wire Debug Open Framework for Low–Level Embedded Systems Access
    279 libswd, swd, jtag, low level security, arm cortex, open source Tomasz Cedro, Marcin Kuzia, Antoni Grzanka, pages 615 – 620. Show abstract Abstract. Modern microelectronics has settled for good in the embedded systems that run our everyday life in the areas of home entertainment, telecommunications, medical equipment, various industrial applications, and even military and aerospace systems. The increasing complexity of these systems requires new tools for development, testing and security analysis. The presented work is an ongoing effort to create from scratch a free and open framework for low-level access (In-Circuit Emulation and On-Chip Debug) to ARM Cortex based devices that use the new SWD bus (a JTAG alternative). LibSWD is a BSD-licensed software library and is integrated into well-known open-source applications such as UrJTAG and OpenOCD.
  • The method of secure data exchange using Flash RAM media
    285 data security, symmetric encryption, asymmetric encryption, device drivers Jan Chudzikiewicz, Janusz Furtak, pages 621 – 625. Show abstract Abstract. This document describes a method for the secure transfer of files stored in Flash RAM through an unsecured transport channel (e.g. a courier) between users. In this method the sender of the file specifies the recipient, and the recipient knows who the sender of the file is. The idea of a solution that uses symmetric and asymmetric encryption is described. The following procedures are presented: creating the protected file (encryption), generating signatures for that file, and reading (decrypting) the file.
  • Use of Geographic Information Systems for Flooding Analysis in Urban Drainage
    18 flooding, urban drainage Lothar Fuchs, Thomas Beeneken, Martin Lindenberg, pages 627 – 631. Show abstract Abstract. A detailed flooding analysis for extreme events is an increasingly common task in urban drainage studies. Such studies need an integrated model simulating the flow in the sewer system, normally as a 1-D approach, coupled with a 2-D model simulating the flow on the surface. This allows for the simulation of the interaction between the flow in the sewer system and the flow on the surface.
    The simulation of flooding for a large urban catchment with a high resolution in time and space is not a technical problem, but it is a computationally time-consuming process if one wishes to simulate flooding for the whole urban catchment.
    The paper describes a methodology for a stepwise risk analysis with the objective of simulating not the whole urban catchment but only those areas with a potential risk.
  • Content Delivery Network Monitoring
    355 CDN, Time Series Database Krzysztof Kaczmarski, Marcin Pilarski, pages 633 – 639. Show abstract Abstract. This document describes the architecture of a distributed Content Delivery Network (CDN) monitoring system and its deployment in a research environment in one of the biggest telecommunication companies in Poland; the deployment involves about fifty nodes distributed across the country and a database system located on a dedicated storage cluster working in an R&D center in Warsaw.
  • Forecasting threatening situations in a Smart Space
    197 threat, context, public safety Sania Kalitska, Przemysław Kukiełka, Maciej Jonczyk, Jarosław Legierski, Ewelina Szczekocka, pages 641 – 647. Show abstract Abstract. We propose an approach to the identification and forecasting of threatening situations in a Smart Space. This applies to the problem of public safety during natural disasters, human activities, and failures of devices. Warning of threats is becoming increasingly possible with the development of the "Smart Connected World". Therefore, this article can be read in the context of public safety and future communication networks.
  • BusStop – Telco 2.0 application supporting public transport in agglomerations
    125 public transport, Telco 2.0 Kamil Litwiniuk, Tomasz Czarnecki, Sebastian Grabowski, Jarosław Legierski, pages 649 – 653. Show abstract Abstract. This paper presents the possibility of using Telco 2.0 architecture interfaces to realize a service supporting public transport in agglomerations. Such systems are expected to be easily accessible to everyone and to be available in any location. Such expectations may only be met by a system using the latest achievements in telecommunications, i.e. APIs based on telecommunication operators' networks.
  • How to build a flexible and cost-effective high-speed access network based on FTTB+LAN architecture
    169 FTTB+LAN, GPON, VHBB, flexible, cost-effective Paweł Parol, Michał Pawłowski, pages 655 – 662. Show abstract Abstract. In this paper we propose an approach for building a modern high-speed access network based on the FTTB+LAN access architecture in multi-dwelling buildings where cat. 5 copper infrastructure is available and can be reused. The presented approach allows building an access network which is easy to deploy and cost-effective compared to other FTTH- and FTTB-based topologies. We propose a flexible network architecture design enabling the cohabitation of various service profiles in one access network. The study also presents methods for carrying user traffic in an effective way within the considered architecture. The proposed approach was verified during a field trial, the results of which are discussed in the paper.
  • Emergency Button – a Telco 2.0 application in the e-health environment
    135 emergency location, fall detection, telco 2.0 Andrzej Podziewski, Kamil Litwiniuk, Jarosław Legierski, pages 663 – 667. Show abstract Abstract. The paper presents the idea of Telco 2.0 with an e-health usage scenario. Since numerous elderly people go missing every year, the proposed emergency location service depicts a way in which mobile operators' networks, the Internet, and the possibilities given by the rapid improvement of smartphones' functionalities can converge in order to relieve the problem. The descriptions are supplemented with sets of accuracy measurements and usability tests conducted during a test deployment in the Polish Orange Labs R&D Centre. The results confirm the usability potential of the service, giving a green light for further research and development. Still, in order to make the service reliable, the algorithms used to determine location and detect falls need to be improved.
  • Telco 2.1 Plugin for Integration of Blog Software With Mobile Communication Network
    405 Grzegorz Sabak, pages 669 – 672. Show abstract Abstract. At the present time Telecom Web Services (TWS) are becoming more and more widely known, and many mobile network operators decide to expose their networks through Application Programming Interfaces (APIs) and include access to their infrastructure as part of their offer. This document presents an idea for a blog software plug-in which would allow blog owners to use telecommunication services. This should be available even to those who do not have programming skills and do not want to invest in dedicated integration of their blogs with the available APIs. The proposed functionalities of such a plug-in are listed and a short case study is presented. The case study shows how telecommunication functions could enrich a travel blog with interactivity and communication features which are currently not available. The document also describes a working prototype which was prepared in order to verify the feasibility of the proposed idea. A basic technical specification is provided, and example functionality is described with a command reference and screenshots of the dedicated Web portal which was set up for plug-in testing purposes.
  • Folksonomy implementation based on the ART-1 neural network
    362 Folksonomy, ART-1, Neural network Adam Sobaniec, Bohdan Macukow, pages 673 – 677. Show abstract Abstract. This document describes a sample implementation of a very popular classification method in modern internet web applications: folksonomy. This method asks users to assign particular keywords to the content that they are uploading; based on the assigned keywords it is possible to find similar content. In this paper, we describe a method of finding similar content based on the ART-1 neural network. Such a solution allows performing background clustering of content and speeds up the process of retrieving related data. In the case of heavy use of an internet application, this might be a big advantage.
  • Building well-balanced CDN
    149 CDN, Content Distribution Network, Planetlab Piotr Stapp, Piotr Zgadzaj, pages 679 – 683. Show abstract Abstract. The following document describes the evolution of the process of building a well-balanced CDN. We start with a very intuitive, but unfortunately wrong, solution and change it into one which works almost ideally. We ran our experiments in the PlanetLab environment, which is a good simulation of the internet. Every experiment is described in a common format for easy comparison. For each experiment the document includes the methodology: environment description, system architecture, a short description of the experiments, result analysis and conclusions.
  • Implementation of Brutlag's algorithm in Anomaly Detection 3.0
    118 Anomaly Detection, Holt-Winters exponential smoothing Maciej Szmit, Sławomir Adamus, Sebastian Bugała, Anna Szmit, pages 685 – 691. Show abstract Abstract. This paper presents information about Anomaly Detection, a Snort-based network traffic monitoring tool. The article concerns the use of a forecasting method based on Holt-Winters exponential smoothing in real-time behavioral analysis of network traffic.
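    A compact sketch of the underlying idea (constants and data are illustrative, not the tool's configuration): additive Holt-Winters forecasting plus Brutlag's smoothed-deviation envelope; a sample is flagged when it leaves the band around the forecast.

        import math

        def brutlag(y, m, alpha=0.5, beta=0.01, gamma=0.1, delta=2.5):
            level, trend = y[0], 0.0
            season, dev = [0.0] * m, [0.0] * m
            anomalies = []
            for t in range(1, len(y)):
                pred = level + trend + season[t % m]     # one-step forecast
                band = delta * dev[t % m]                # Brutlag confidence band
                if t > 2 * m and abs(y[t] - pred) > band:
                    anomalies.append(t)
                last = level                             # Holt-Winters updates
                level = alpha * (y[t] - season[t % m]) + (1 - alpha) * (level + trend)
                trend = beta * (level - last) + (1 - beta) * trend
                season[t % m] = gamma * (y[t] - level) + (1 - gamma) * season[t % m]
                dev[t % m] = gamma * abs(y[t] - pred) + (1 - gamma) * dev[t % m]
            return anomalies

        traffic = [100 + 30 * math.sin(2 * math.pi * t / 24) for t in range(24 * 7)]
        traffic[100] += 90                               # injected spike
        print(brutlag(traffic, m=24))                    # the spike (and possibly its
                                                         # immediate aftermath) is flagged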

International Symposium on Multimedia Applications and Processing

  • Underdetermined Blind Source Separation based on Fuzzy C-Means and Semi-Nonnegative Matrix Factorization
    69 blind source separation - Fuzzy C-Means Ossama Alshabrawy, Wael Awad, Aboul Ella Hassanien, pages 695 – 700. Show abstract Abstract. Conventional blind source separation is based on the overdetermined case, with more sensors than sources, but the underdetermined case is challenging and closer to actual situations. Non-negative Matrix Factorization (NMF) has been widely applied to Blind Source Separation (BSS) problems. However, the separation results are sensitive to the initialization of the NMF parameters. To avoid the subjectivity of choosing parameters, we use the Fuzzy C-Means (FCM) clustering technique to estimate the mixing matrix and to reduce the requirement for sparsity. Decreasing the constraints is also addressed in this paper by using Semi-NMF. We propose a new two-step algorithm to solve underdetermined blind source separation and show how to combine the FCM clustering technique with gradient-based NMF and the multi-layer technique. The simulation results show that our proposed algorithm can separate the source signals with a high signal-to-noise ratio and quite low computation time compared with some other algorithms.
  • Fractional Delay Filter Design for Sample Rate Conversion
    295 sample rate conversion, fractional delay filter Marek Blok, pages 701 – 706. Show abstract Abstract. With a large number of different sample rate standards we often need to use sample rate conversion algorithms. If the resampling ratio is not expressed as a ratio of small integer numbers or is not a fixed value, a sample rate conversion algorithm based on fractional delay filters may be used, since it allows arbitrary resampling ratios. The performance of such an algorithm depends solely on the method used to design the fractional delay filters. In this paper we propose a novel classification of fractional delay filter design methods, dividing them into three general categories: optimal fractional filter design, the offset window method, and polyphase decomposition. The proposed classification is based on differences in the properties of sample rate conversion algorithms based on those filters.
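    As an illustration of the simplest of these families, here is a windowed-sinc fractional delay FIR (our sketch; the offset window method proper also shifts the window itself by the fractional delay, which is omitted here):

        import numpy as np

        def frac_delay_fir(d, n_taps=21):
            # Approximates y[n] = x[n - (n_taps-1)/2 - d] for fractional d.
            center = (n_taps - 1) / 2.0
            n = np.arange(n_taps)
            h = np.sinc(n - center - d)      # shifted ideal low-pass response
            h *= np.hamming(n_taps)          # truncation window
            return h / h.sum()               # unit DC gain

        fs = 100.0
        x = np.sin(2 * np.pi * 3.0 * np.arange(200) / fs)
        h = frac_delay_fir(d=0.4)            # total delay: 10 + 0.4 samples
        y = np.convolve(x, h, mode="same")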
  • A comparative study between recursive and non-recursive algorithms for content search in a dual multimedia database with images
    20 multimedia database, endoscopic images, different images, methods of deciding the similarities, color space, algorithms content search Gabi Daniela Garaiman, Dumitru Dan Garaiman, pages 707 – 710. Show abstract Abstract. This is a comparative study of recursive and non-recursive content search algorithms in a dual multimedia database with medical (endoscopic) and other (natural) images. The recursive algorithms, in two stages, use the same method of determining similarity at each stage and models representing the color content of images in uniformly quantized spaces. The non-recursive algorithms use a single iteration and models representing the color content of images in non-uniformly quantized color spaces. The performance of the search has been measured according to four parameters: recall, precision, the quality of retrieval, and the cost of retrieval. These are based on two methods of deciding the similarity between the models of images: the Minkowski distance and the generalized Jaccard measure. The model representing the images in the multimedia database used here is the normalized color histogram. The color spaces of the images are RGB reduced to 64 and 125 colors and HSV reduced to 60 and 162 colors. The study was realized on a dual database containing 360 endoscopic images grouped into 23 categories and 280 other images grouped into 10 categories. The results are presented both in tables and graphs.
  • Novice User involvement in information architecture for a mobile tablet application through card sorting
    102 interface, information architecture, card sorting, novice user, mobile tablet Chrysoula Gatsou, Anastasios Politis, Dimitrios Zevgolis, pages 711 – 718. Show abstract Abstract. The purpose of this paper is to describe the process, analysis, results and implications of a card-sorting usability study. The study was conducted in order to investigate user behavior during the design of a mobile tablet application for inexperienced users, centred on the topic of “first aid”. Card sorting is a participant-based knowledge elicitation technique for grouping information into categorical domains. We identified nine categories of cards; three cards were used by a small percentage of users. The categories showed indications of grouping by shared words and task. Differences in grouping were probably due to various mental representations on the part of users. Novices tend to group cards on one level without sub-groupings. Participants made many suggestions regarding possible new content.
  • Analysis of Long-distance Word Dependencies and Pronunciation Variability at Conversational Russian Speech Recognition
    148 automatic speech recognition, speech variability, pronunciation vocabulary, language model Irina Kipyatkova, Alexey Karpov, Vasilisa Verkhodanova, Miloš Železný, pages 719 – 725. Show abstract Abstract. The key issues of conversational Russian speech processing at the phonemic and language model levels are considered in this work. We propose multiple transcriptions for modeling the variety of word pronunciations, and the joint application of statistical and syntactic analysis of training text data for modeling long-distance grammatical relations between words in a phrase. The word error rate of the developed speech recognition system was 33% on the collected conversational speech corpus.
  • The Use of Wet Paper Codes With Audio Watermarking Based on Echo Hiding
    6 Audio-signal, Cepstrum, Watermarking, Wet paper codes Valery Korzhik, Guillermo Morales-Luna, Ivan Fedyanin, pages 727 – 732. Show abstract Abstract. We consider an audio watermarking technique based on echo hiding that provides both a very high quality of the audio signal just after the embedding of hidden messages and robustness of their extraction under natural signal transforms. The technique of cepstrum analysis is used for hidden message extraction, along with its parameter optimization. Since the extracted-bit error probability remains significant for an acceptable sound fidelity and embedding rate, we propose to use wet paper codes to reduce the error probability to zero at the cost of a very negligible degradation of the embedding rate.
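    The extraction side rests on one identity: an echo $x[n] + \alpha x[n-d]$ adds a peak near quefrency $d$ in the real cepstrum. A toy sketch of ours (white-noise host and exaggerated echo gain; not the authors' optimized detector):

        import numpy as np

        def real_cepstrum(x):
            # Inverse FFT of the log magnitude spectrum.
            spectrum = np.abs(np.fft.fft(x)) + 1e-12    # avoid log(0)
            return np.real(np.fft.ifft(np.log(spectrum)))

        rng = np.random.default_rng(4)
        host = rng.normal(size=4096)
        d, alpha = 150, 0.4                             # echo delay encodes the bit
        marked = host.copy()
        marked[d:] += alpha * host[:-d]                 # embed: x[n] + alpha * x[n-d]
        c = real_cepstrum(marked)
        print(int(np.argmax(c[50:400])) + 50)           # peak near quefrency 150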
  • Computer models for algorithmic composition
    54 algorithmic composition, MIDI, artificial art, music machine learning Łukasz Mazurowski, pages 733 – 737. Show abstract Abstract. Algorithmic composition models used in the domain of systems generating music compositions are presented in the paper. Moreover, a model based on the transition matrix of music events (music patterns described by notes, measures and durations) and on the classification of the instrumental parts appearing in the input music work is presented. An exemplary implementation of the model is described using the MIDI Toolbox implemented in Matlab. In the summary, possible extensions of the presented model are described, and examples of the system's results in the form of output music compositions are indicated.
  • The Design of eLeTK – Software System for Enhancing On-Line Educational Environments
    141 educational data mining, software system, toolkit Marian Cristian Mihăescu, pages 739 – 744. Show abstract Abstract. This paper presents the design of eLeTK, a software system that may be used for enhancing on-line educational environments. The main concept introduced in this paper is the educational data/knowledge flow. The data flow is transformed into a knowledge flow, provided that all input data represent activity produced by an on-line educational environment. On the other hand, the output of the software system is redirected towards the educational environment in an attempt to enhance its capabilities. eLeTK may become a recommender system for students or professors, a knowledge self-assessment tool for students, or a custom learning path builder. The core business logic of eLeTK consists of a custom integration of different machine learning algorithms adapted to work with data provided by on-line educational environments.
  • Graph-Based Volumetric Data Segmentation on a Hexagonal-Prismatic Lattice
    317 volumetric, segmentation, hexagonal, lattice Mihai Popescu, Razvan Tanasie, pages 745 – 749. Show abstract Abstract. In this paper we present a graph-based volumetric data segmentation method based on a 3D hexagonal prismatic lattice. We evaluate the advantages and disadvantages of using this lattice in contrast with classic ones. The main advantages are a high isoperimetric quotient, near-equidistant neighbours (the ability to represent curves better, resulting in a better segmentation) and high connectivity. The disadvantages are due to the mainstream lack of interest in this area: data sets must be converted back and forth between rectangular and hexagonal lattices in both the acquisition and visualization processes.
  • Unsupervised Partitioning of Numerical Attributes Using Fuzzy Sets
    306 clustering, fuzzy sets, partitioning Bogdan Popescu, Andreea Popescu, Marius Brezovan, Eugen Ganea, pages 751 – 754. Show abstract Abstract. The current paper presents an improved partitioning mechanism for numeric data. The efficiency of our method is illustrated through a solid set of tests that have been performed. We have planned this partitioning phase as an initial step of a more complex algorithm to be further studied and implemented; the final goal is to use it for decision making in image annotation. Fuzzy set theory has been used as the basis for our clustering algorithm and partitioning. We included this mechanism as a component of a framework we developed for image processing, more exactly for the image segmentation evaluation model we are building.
  • Facial Biometrics for 2D Barcodes
    162 2D Barcode Authentication Marco Querini, Giuseppe F. Italiano, pages 755 – 762. Show abstract Abstract. This paper proposes a novel use of 2D barcodes to store biometric data, in particular for facial recognition, which represents one of the least invasive biometric techniques. To accomplish this, we deploy 2D color barcodes, which allow larger storage capabilities than traditional 2D barcodes. To improve the quality of facial recognition, we combine local feature descriptors, such as SURF descriptors, with shape landmarks identified through statistical models for discriminating faces. The biometric information can be secured through digital signature algorithms, in order to protect biometric data from malicious tampering. The use of color barcodes is crucial in this setting, as traditional barcodes cannot store a suitable number of SURF descriptors for discriminating faces and cannot even afford to store an additional cryptographic payload. We report the results of an experimental evaluation of our system on real-world data sets (i.e., a face database).
  • Incorporating Random Forest Trees with Particle Swarm Optimization for Automatic Image Annotation
    234 particle swarm optimization algorithm, image annotation, random forest Mohamed Sami, Nashwa Ebendary, Aboul Ella Hassanien, pages 763 – 769. Show abstract Abstract. This paper presents an automatic image annotation approach that integrates a random forest classifier with the particle swarm optimization algorithm for weighting class scores. The proposed hybrid approach refines the output of multiclass classification, which is based on using a random forest classifier to automatically label images with a number of words. Each input image is segmented using the normalized cuts segmentation algorithm in order to create a descriptor for each segment. Image feature vectors are clustered into K clusters and a random forest classifier is trained for each cluster. The particle swarm optimization algorithm is employed as a search strategy to identify an optimal weighting of the class scores from the random forest classifiers. The proposed approach has been applied to the Corel5K benchmark dataset. Experimental results and a comparative performance evaluation against other related research demonstrate that the proposed approach outperforms other approaches in annotation accuracy on the experimented dataset.

International Conference on Wireless Sensor Networks

  • Energy-efficient security in Implantable Medical Devices
    332 Energy-efficient security in Wireless Sensor Networks Krzysztof Daniluk, Ewa Niewiadomska-Szynkiewicz, pages 773 – 778. Show abstract Abstract. This survey discusses topics concerning implantable medical devices (IMDs) that create wireless sensor networks on the patient's body, mainly the possibilities of combining energy efficiency and security for IMDs. Implantable medical devices are very sensitive to energy constraints and to providing secure transmission between them and the external devices whose aim is to monitor and control the IMDs. We present the security and privacy risks during monitoring, controlling, drug disposition and part identification of IMDs, and discuss existing solutions in the field of energy-efficient, secure transmission for implantable medical devices. The final part of this paper presents the authors' novel concept for energy-efficient security in wearable devices monitoring health conditions.
  • A QoS based Heuristics for Clustering in Two-Tier Sensor Networks
    33 Heterogeneous sensor networks; Voronoi; Tabu; Clustering; Routing Kanwalinderjit Kaur Gagneja, Kendall E. Nygard, pages 779 – 784. Show abstract Abstract. Once sensors detect an event, they have to route the data to the base station, where the data are processed. Since sensors usually have constraints regarding coverage, energy, processing power, memory, etc., achieving Quality of Service is hard in sensor networks. Therefore, to deal with such issues and to maximize Quality of Service, a two-tier Heterogeneous Sensor Network approach is first used to route the data. Second, the sensors are partitioned into clusters to increase network coverage and to reduce transportation costs and energy utilization. Voronoi clustering and the Tabu search meta-heuristic have been used for making such clusters. An Improved Tree Routing technique is applied to the two-tier Heterogeneous Sensor Network to route the data through cluster heads. This approach largely increases the performance of sensor networks. Through simulation results, we show that the Voronoi-Tabu based clustering technique, when added to Improved Tree Routing, gives better Quality of Service than the Directed Diffusion and Low Energy Adaptive Clustering Hierarchy routing protocols. Furthermore, empirical evaluations show that Voronoi-Tabu based clustering increases the throughput of the network, in addition to decreasing energy utilization and network delays.
  • Sensor for Vehicles Classification
    215 magnetometer, WSN, vehicles classification Ondrej Karpis, pages 785 – 789. Show abstract Abstract. This paper focuses on the problem of gathering parameters of traffic flow using simple sensors. The first part of the paper describes the properties of a sensor node based on a magnetometer. The influence of various parameters (vehicle velocity, sensor location and orientation) on the sensor output is evaluated. We found that the sensor is sufficiently sensitive to be located on the road verge. In the second part, the sensor is used for vehicle classification based on an estimate of vehicle length. The velocity of the vehicles is measured by a speed trap. The results of the classification are compared with measurements where the velocity of the vehicles is only estimated.
  • Supercapacitor power unit for an event-driven wireless sensor node
    205 Supercapacitor, WSN, sensor node, power supply unit Michal Kochláň, Peter Ševčík, pages 791 – 796. Show abstract Abstract. This paper discusses a power unit based on supercapacitors for an event-driven wireless sensor network node. In such nodes, most of the energy is consumed when transmitting or receiving data. A sleep-wake scheduling mechanism is an effective way to prolong a node's lifetime; however, this mechanism can result in delays, because a transmitting node has to wait until its neighbor wakes up. Another solution is the event-driven communication model, where nodes communicate when an event occurs. Although events occur asynchronously, the node needs to send a keep-alive message periodically. An event can be raised by a very high temperature, indicating fire, or by the presence of movement in a surveillance system. A good communication pattern can reduce energy consumption; however, the main issue remains the design of the network node power unit. We propose a supercapacitor power unit circuit which is charged from solar cells, and we discuss the sensor node energy balance.
  • Technical Infrastructure for Monitoring the Transportation of Oversized and Dangerous Goods
    298 dangerous goods transport, sensor network, on-board unit, wireless communication Emil Kršák, Patrik Hrkút, Peter Vestenický, pages 797 – 802. Abstract. The paper presents the technical infrastructure of a system for monitoring the road transportation of oversized and dangerous goods. It describes the basic components of the system – the vehicle sensor network, the vehicle on-board unit (OBU), the monitoring centre and the wireless communication system. Moreover, it specifies some important system parameters, especially the period of sending data about the state/condition of the goods to the monitoring centre.
  • WSN Sensor Node for Protected Area Monitoring
    237 wireless sensor network, sensor node, supercapacitor, energy harvesting Juraj Miček, Ján Kapitulík, pages 803 – 807. Abstract. The article is dedicated to the design of a sensor node focused on minimizing energy consumption. The mote is intended for low-rate or occasional data transmission, so the energy consumption of the communication subsystem can be kept minimal. In simple applications, the sensor can be powered from two 50 F/2.3 V supercapacitors for 14 hours. The mote contains an acoustic sensor that monitors specific events and switches the microcontroller from stop status to active status; all other mote functions are controlled by the microcontroller. A real-time clock (RTC) can also activate the sensor node, and it can be used for time synchronization of the communication subsystem as well.
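    The quoted figures allow a back-of-the-envelope energy budget: the usable energy of a capacitor discharged from V_max down to a converter cutoff V_min is E = ½C(V_max² − V_min²). A sketch under assumed conditions (the parallel wiring and the 1.0 V cutoff are guesses for illustration, not the authors' design data):

        def usable_energy_j(capacitance_f, v_max, v_min):
            """Usable energy (J) when discharging a capacitor from v_max to v_min."""
            return 0.5 * capacitance_f * (v_max**2 - v_min**2)

        # Assumption: two 50 F / 2.3 V supercapacitors wired in parallel.
        energy = 2 * usable_energy_j(50.0, 2.3, 1.0)
        runtime_h = 14.0
        avg_power_mw = energy / (runtime_h * 3600) * 1e3
        print(f"usable energy: {energy:.0f} J -> average budget {avg_power_mw:.1f} mW")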
  • WSN for Forest Monitoring to Prevent Illegal Logging
    208 WSN, illegal logging, signal evaluation Jozef Papán, Matúš Jurečka, Jana Púchyová, pages 809 – 812. Abstract. Illegal logging is nowadays a widespread problem. In this paper we propose a WSN-based system for monitoring forests. The acoustic signal processing and evaluation system described here detects chainsaw sound using an autocorrelation method. This work describes the first steps in building the integrated system.
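    Autocorrelation-based detection of this kind exploits the fact that a running chainsaw produces a strongly periodic signal, so its normalized autocorrelation shows a pronounced secondary peak at the engine's fundamental period. A toy version of that test on synthetic audio (frame sizes and thresholds are illustrative, not taken from the paper):

        import numpy as np

        def periodicity_score(frame):
            """Peak of the normalized autocorrelation outside the smallest lags."""
            frame = frame - frame.mean()
            ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
            ac /= ac[0] + 1e-12
            return ac[20:].max()

        fs = 8000
        t = np.arange(fs) / fs
        rng = np.random.default_rng(1)
        chainsaw = np.sign(np.sin(2 * np.pi * 120 * t)) + 0.3 * rng.standard_normal(fs)
        wind = rng.standard_normal(fs)

        for name, sig in [("chainsaw-like", chainsaw), ("noise-like", wind)]:
            detected = periodicity_score(sig) > 0.5   # illustrative threshold
            print(name, "-> chainsaw detected:", detected)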
  • Modeling and Simulation of MISO Diversity for UHF RFID Communication
    14 RFID, Communication, Diversity Methods, Numerical Simulation, Rician-, Dyadic Backscatter Channel Grzegorz Smietanka, Jürgen Götze, pages 813 – 820. Abstract. Radio Frequency Identification (RFID) is used in high-scattering environments where deep fading exists, which makes diversity particularly interesting for this communication scenario. In this paper the potential of using multiple tag antennas for RFID communication is shown. The bit error rate (BER) and packet error rate (PER) are presented, including the backscattered answer of an RFID tag according to the EPC Class-1 Gen-2 protocol. The rates are considered in combination with the fading channel models relevant to RFID communication, such as the Rician and the Dyadic Backscatter Channel. It is shown that the possible diversity gain for the EPC Class-1 Gen-2 signal amounts to several dB in terms of the error rates. It is also shown that this diversity gain increases with the correlation of the forward and backward links, and decreases with the use of a more robust encoding scheme and with correlation between the transmission paths. Additionally, the performance of a Multiple Input Single Output (MISO) system under different spatial and forward/backward correlation conditions is examined to give a detailed view of a correlated RFID transmission system using diversity. The performance of this model is verified using simulations of the propagation system.
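    The diversity gain discussed here can be illustrated with a generic Monte Carlo experiment: BPSK over a Rician fading channel, comparing a single antenna against selection combining over two independent branches. This is a textbook setup, not the EPC Class-1 Gen-2 backscatter model used in the paper:

        import numpy as np

        rng = np.random.default_rng(2)
        n, K, snr_db = 200_000, 3.0, 10.0        # trials, Rician K-factor, SNR
        snr = 10 ** (snr_db / 10)

        def rician(shape):
            """Rician fading coefficients with K-factor K (unit mean power)."""
            los = np.sqrt(K / (K + 1))
            nlos = np.sqrt(1 / (2 * (K + 1))) * (rng.standard_normal(shape)
                                                 + 1j * rng.standard_normal(shape))
            return los + nlos

        bits = rng.integers(0, 2, n) * 2 - 1                 # BPSK symbols +/-1
        noise = rng.standard_normal((2, n)) / np.sqrt(2 * snr)
        h = rician((2, n))

        # Coherent detection per branch, then pick the stronger branch (selection).
        y = np.abs(h) * bits + noise
        single = np.mean(np.sign(y[0]) != bits)
        best = np.where(np.abs(h[0]) >= np.abs(h[1]), y[0], y[1])
        diversity = np.mean(np.sign(best) != bits)
        print(f"BER single antenna: {single:.4f}, selection diversity: {diversity:.4f}")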

Information Systems Education & Curricula Workshop

  • Cataloging Teaching Units: Resources, Evaluation and Collaboration
    368 CSchool, Teaching Units, teaching resources, teaching evaluation and collaboration, Cloud computing, Web Services, Students Curriculum, ICT Antonio Paules Cipres, Habib M. Fardoun, Abdulfattah Mashat, pages 825 – 830. Abstract. The teaching unit is a way to plan the teaching/learning process around a content item that becomes the focus of a specific educative process, bringing consistency and significance. This way of organizing knowledge and experience should consider the diversity of elements that contextualize the process (level of student development, social background, family and market, project curriculum, available resources) in order to regulate the practice of the contents, to select the basic objectives to be achieved, the methodological guidelines with which to work, and finally the teaching/learning experiences necessary to perfect the process. In this research work we present a system that allows the organization of a set of teaching and learning activities and covers, at its highest level of detail, all the elements that compose the curriculum: setting goals and content, designing and developing activities and evaluation, organizing space and time, and providing the necessary resources.
  • Assessing EHEA methods in the HCI1 subject at the College of Computer Science at the University of Castilla-La Mancha (Spain): an experience in the Promotion Course to Degree
    220 Human-Computer Interaction, European Higher Education (EHEA), learning evaluation, skills. Ana Isabel Molina Díaz, Carmen Lacave Rodero, Miguel Ángel Redondo Duque, Manuel Ortega Cantero, pages 831 – 838. Abstract. This paper describes the experience gained in the subject Human-Computer Interaction I (HCI1) in the Promotion Course to Degree, which began to be taught during 2011/2012 at the College of Computer Science of Ciudad Real. This experience has provided us with an opportunity to measure and evaluate various aspects from the students' point of view: the applicability of the subject's contents, the effort needed to assimilate them, the suitability of the proposed activities, the methodologies, the criteria and evaluation methods, and so on. The study presented in this paper has allowed us to contrast the opinions and experience of different groups of students with the approaches of the European Higher Education Area (EHEA). In this study we analyze the results applying classical statistical methods and probabilistic graphical models (Bayesian networks). The latter representation has richer semantics and provides, at first glance, a snapshot of the relevant relationships among the variables under consideration.
  • Smalltalk: the Leading Language to Learn Object-Oriented Programming
    360 Object-Oriented programming, Smalltalk Jose A. Gallud, Pedro Gonzalez Villanueva, pages 839 – 840. Abstract. The use of Java in the first courses of Computing, Computer Science and similar degrees is widely accepted. However, many programming professors realize that while it is possible for students to use an Object-Oriented language, it is also possible to program in one without applying an Object-Oriented mentality. This paper defends the use of the Smalltalk programming language as the best option for students to learn Object-Oriented programming while acquiring an Object-Oriented mentality at the same time. The study is based on three years of experience in a course on Software Design.
  • Does a Successful e-Commerce Project Require Technological Skills Only? Experience in Teaching e-Business Course
    160 e-Business, e-Commerce, ABET accreditation, Computing Information Systems Shehab Gamalel-Din, pages 841 – 848. Abstract. This article discusses the question raised by the title of this paper in the light of our experience in teaching CPIS380 “Introduction to e-Business”, an undergraduate course of the IS program of the Faculty of Computing and Information Technology at King Abdulaziz University in Saudi Arabia. There is a belief among students that being a good web programmer is sufficient to launch an e-Commerce venture. CPIS380 students have background knowledge in both business and technology; therefore, this course tries to fill the gap between these two disparate areas and integrate them into a unified framework that contributes to a deeper understanding of the bigger picture of the business environment in the Internet era. The course stresses the paradigm shift involved in doing business through technology. Developing a number of soft skills is also a goal of this course. The design of CPIS380 abides by ABET's guidelines.
  • Co-BrainSystem: A Solution to Enhance Collaborative Work in Educational Environments through Brainstorming
    261 CSCL, interactive systems, educational environments, RFID, Brainstorming Elena de la Guía, María D. Lozano, Victor R. Penichet, pages 849 – 855. Abstract. Co-BrainSystem is a collaborative and interactive system aimed at improving brainstorming sessions in learning scenarios. The system is composed of a shared and a private workspace. The shared workspace is used to display information, ideas, documents and feedback on what is happening in the session. The private workspace consists of augmented interfaces integrating emerging technology, in this case RFID tags and a mobile device with an integrated RFID reader. The system is based on MDE (Multi-Device Environments) and uses a new mode of interaction called Approach & Remove that allows users to handle Distributed User Interfaces easily.
  • Learning Experience+ within 3D Immersive Virtual Environments
    370 learning experience, 3d immersive virtual environments, zone of proximal flow Niki Lambropoulos, Stylianos Mystakidis, pages 857 – 862. Abstract. Enhancing the learning experience by engaging learners in immersive environments has been shown to accelerate the learning pace as well as to deepen the acquired knowledge, skills and competencies. Similar strategies exist in performance-related training such as sports, where athletes accelerate and enhance their performance via intensive crash courses. Indeed, studies reveal that learning can be accelerated and deepened if specific systematic procedures, techniques and methodologies are in place. Building on such research, we propose that a crash course of this kind can be strategically and dynamically structured specifically for eLearning. Moreover, we present an example of an innovation management crash eCourse implemented in the Second Life 3D virtual world, providing technology-enhanced learning through an immersive learner experience called Learning eXperience+ (LeX+). This is achieved by engaging the learners in a learning zone, called the zone of proximal flow, and by combining different teaching and learning styles for a more personalised learning experience.
  • Coaching for Students: Parents Tutoring Children as part of their Educational Process
    293 Coaching, educational process, curriculum management, social networks Sebastian Romero López, Habib M. Fardoun, Daniyal M. Alghazzawi, pages 863 – 870. Abstract. Parents are essential to child development; together with the school, they form a socialization agent of the first order, because it is in these settings that children undergo their first fundamental learning, which will affect them throughout their further development. It is therefore important to consider both factors in the education of children. As a methodological strategy, we propose a joint action of school and parents, so that the educational work becomes a coordinated and cooperative task that is gratifying and rewarding for students. Because both the family and the school play a key role in education, in this paper we propose an application that helps parents and students combine two of their major daily tasks: it supports students in their educational process, and it helps parents monitor this process efficiently. It thus becomes a tool specifically designed to improve students' academic performance and future potential.
  • A Teaching Experience on a Data Mining Module
    337 Data Mining, Teaching methods, Matlab Francesco Maiorana, pages 871 – 874. Abstract. Data mining is recognized as an important field in which one can become accustomed both to analysis techniques and methods and to a state of mind. By means of data mining it is possible to develop critical skills that are essential in today's information technology. We present our experience in teaching a data mining module within an Information Systems course, centered on a few key aspects: a convergence of theoretical Information Systems aspects and computing skills, achieved by programming a complete data mining analysis in Matlab; a project-centered learning experience; the sharing of resources that are commented on both by the teacher and by peers, facilitating the flow of information and the development of critical skills; a guided inquiry process in which students, when needed, are steered through appropriate questions in the right direction; and, finally, special attention to requiring motivation for each decision and step undertaken. As a case study we present and summarize the experience of two groups of students in a data mining study aimed at predicting liquidity crises of companies.
  • Teaching Emotional Intelligence to Computer Science students
    248 Emotional Intelligence; Computing curricula Esperanza Marcos, Juan Manuel Vara, Veronica Bollati, Marcos Lopez, pages 875 – 881. Abstract. Computing professionals are, on an ever-increasing basis, being asked to acquire emotional skills in addition to, or even before, technical knowledge. The skills needed to work in a team or to speak in public, to attain leadership capacities or to adapt to change are important requirements in perhaps any job. However, these types of skills, which are characteristic of what is called Emotional Intelligence, become indispensable in the profile of a Computer Engineer. This type of professional will have to work in a group, confront project management, give talks and, perhaps most difficult of all, keep up to date with a technology whose latest advancements will be obsolete in scarcely five years. Nevertheless, in spite of the importance of these skills, Emotional Intelligence remains a pending subject in Computer Science curricula. In this paper we present an experiment in which a competition is used in an attempt to foster Emotional Intelligence in Computer Science students studying the subject of Databases.
  • Mobility and Memory Training through Movement Interaction
    282 education, memory, mobility, disability, ubiquity, context-awareness Juan Enrique Garrido Navarro, Víctor Manuel Ruiz Penichet, María Dolores Lozano Perez, Luis Antonio Sánchez, pages 883 – 889. Abstract. Current technology facilitates the evolution of learning processes, techniques and environments. Movement interaction devices offer important capabilities for creating learning systems in which students interact through natural movements and gestures. In this paper, we present a Kinect-based system whose main objective is to improve and train two important student faculties: memory and motor abilities. The system is inspired by the Simon Says game: students have to repeat postures that the system has previously shown. With the system, students with disabilities can find an adequate environment in which to train their memory and motor abilities outside of impersonal rehabilitation centers such as hospitals and clinics. They can perform rehabilitation exercises in the same place and in the same way as their classmates train the same faculties.
  • Robot simulator facilitating the education of concurrent programming
    246 teaching programming, simulation environments Łukasz Szweda, Jakub Flotyński, Daniel Wilusz, Paweł Dąbrowski, pages 891 – 896. Abstract. The paper presents an experiment in teaching Java-based concurrency using a robot simulator. Computer programming education is a challenging task, especially when non-computer-science students are taught complex programming concepts. Recently, a great number of simplified programming languages, environments and simulation tools have been proposed to support the teaching and self-learning of different programming techniques. Still, there is no solution facilitating effective teaching in the domain of concurrent programming in the Java language. In this paper we present our original concept of exercises that use a robot simulator to teach Java-based concurrency. The simulator appears to be a good solution for teaching concurrent programming, as the actions performed in real time by the simulator allow students to quickly identify their mistakes.

International Workshop on Advances in Business ICT

  • Simulation driven design of the German toll system—evaluation and enhancement of simulation performance
    74 simulation, discrete-events simulation, performance, toll system, systems engineering Tommy Baumann, Bernd Pfitzinger, Thomas Jestädt, pages 901 – 909. Abstract. We study the performance issues of a realistic simulation of the German toll system for heavy trucks using discrete-event system (DES) simulation. The article first introduces the German toll system and the simulation framework developed to analyze the system's behavior. A number of performance limitations of several commercial and non-commercial DES simulation kernels are described, measured and benchmarked. The application-level performance of a DES implementation of the German toll system is then compared using two commercial DES tools, and several optimizations are introduced at both the simulation model and kernel level to achieve the performance necessary for a detailed and realistic simulation of a fleet of trucks.
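    At the core of every DES kernel of the kind benchmarked here is a time-ordered event queue; kernel performance hinges on how cheaply events can be scheduled and dispatched. A minimal sketch of such a kernel in Python (a generic illustration, not one of the commercial kernels evaluated in the paper; the toll-gantry scenario below is invented):

        import heapq

        class Simulator:
            """Minimal discrete-event simulation kernel: a time-ordered heap."""
            def __init__(self):
                self.queue, self.now, self._seq = [], 0.0, 0
            def schedule(self, delay, action):
                self._seq += 1                      # tie-breaker for equal times
                heapq.heappush(self.queue, (self.now + delay, self._seq, action))
            def run(self, until):
                while self.queue and self.queue[0][0] <= until:
                    self.now, _, action = heapq.heappop(self.queue)
                    action()

        sim = Simulator()
        def truck_passes_gantry(truck_id):
            print(f"t={sim.now:5.1f}: truck {truck_id} billed at gantry")
            sim.schedule(30.0, lambda: truck_passes_gantry(truck_id))  # next gantry

        for i in range(3):                          # hypothetical fleet of 3 trucks
            sim.schedule(i * 7.0, lambda i=i: truck_passes_gantry(i))
        sim.run(until=100.0)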
  • Implementing Ubiquitous Services with Ontologies: Methodology and Case Study
    372 ubiquitous information services, data ontology, task ontology, distributed systems. Alfio Costanzo, Alberto Faro, Daniela Giordano, Concetto Spampinato, pages 911 – 914. Abstract. Modern Ubiquitous Information Systems (UISs) appear more and more as intelligent systems that provide business services for mobile users by taking into account both sensed data and administrative records stored on different servers. Currently, data integration is achieved by a centralized relational database in which all the information coming from the remote servers is stored. To improve this architecture, we have proposed converting the relational archive into an RDF triple store where data are represented by standard terms interrelated by subject-predicate-object relations (also called a data ontology). In this way, ubiquitous applications can be developed independently of the technology used to collect the data and of how the data are formatted on the various specialized servers. The paper proposes implementing such an architecture in a distributed environment to achieve higher reliability and better time performance. It also outlines how to structure the service interface according to a task ontology, since this improves usability. Implementation issues illustrate how software environments based on the Model-View-Controller paradigm, e.g., Ruby on Rails powered by JQMobile and Flash Builder, may facilitate the implementation of the proposed methodology.
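    The proposed conversion of a relational archive into an RDF triple store means every record becomes subject-predicate-object triples that can be queried uniformly, regardless of which server produced them. A small sketch using the rdflib library with an invented example vocabulary (the project's actual data ontology and servers are not shown in the abstract):

        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/city#")   # hypothetical data ontology
        g = Graph()

        # Two triples that might originate from different specialized servers.
        g.add((EX.bus42, RDF.type, EX.BusLine))
        g.add((EX.bus42, EX.currentDelayMinutes, Literal(7)))

        # A ubiquitous application queries by content, not by source system.
        q = """
            PREFIX ex: <http://example.org/city#>
            SELECT ?line ?delay WHERE {
                ?line a ex:BusLine ; ex:currentDelayMinutes ?delay .
            }
        """
        for line, delay in g.query(q):
            print(f"{line} delayed {delay} min")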
  • Geoportal as Decision Support System with Spatial Data Warehouse
    111 Geoportal, Decision Support System, Spatial Data Warehouse Almir Karabegovic, Mirza Ponjavic, pages 915 – 918. Abstract. There is increasing interest among organizations in advanced presentation and data analysis for public users. This paper shows how to integrate data from an enterprise data warehouse with a spatial data warehouse, publish them together on an online interactive map, and enable public users to perform analyses in a simple web interface. A Business Intelligence system for investors, whose data come from different sources and levels, both structured and unstructured, is used as a case study. The approach has three phases: creating the spatial data warehouse, implementing the ETL (extract, transform and load) procedure for data from different sources (spatial and non-spatial) and, finally, designing the interface for performing data analysis. The fact that this is a public site, whose users are not known in advance and are not trained, underlines the importance of usability design and a self-evident interface: investors are not willing to invest any time in learning the basics of a system. Geographic information providers need geoportals to enable access to spatial data and services via the Internet, and such a geoportal is a first step in creating a Spatial Data Infrastructure (SDI).
  • Proposal of Square Metrics for Measuring Business Process Model Complexity
    273 BPMN, process model quality, metrics Krzysztof Kluza, Grzegorz J. Nalepa, pages 919 – 922. Abstract. Business Process (BP) metrics aim to be of use for controlling the quality and improving BP models. We give an overview of the existing metrics for describing various aspects of BP models. We propose simple yet practical square metrics for describing the complexity of a BP model. These metrics are easy to interpret and provide some information about the structural complexity of the model. The proposed metrics are to be used with models built with the Business Process Model and Notation (BPMN), currently the most widespread visual language for BP modeling.
  • Standardization Approaches within Cloud Computing: Evaluation of Infrastructure as a Service Architecture
    219 Cloud Computing, Standardization, Infrastructure as a Service Stine Labes, Alexander Stanik, Jonas Repschläger, Odej Kao, Rüdiger Zarnekow, pages 923 – 930. Abstract. Cloud Computing is becoming increasingly established and offers several opportunities to obtain IT services in an on-demand manner. Infrastructure services in particular, like storage and scalable computing resources, are gaining relevance and provide alternatives to conventional sourcing models. Despite the Cloud paradigm of flexible and limitless scalability, the lack of standardization presents a big challenge in this context. Because many providers use different Cloud software with proprietary interfaces, interoperability in the Cloud remains a theoretical construct. In this paper we examine standardization approaches within Cloud Computing and compare them with the practical interface implementations of relevant Cloud software on the market. Finally, characteristics of a potential Cloud standard at the infrastructure level are postulated.
  • AI Approach to Formal Analysis of BPMN Models. Towards a Logical Model for BPMN Diagrams
    272 BPMN, Prolog, rules, logical model, formal analysis Antoni Ligęza, Krzysztof Kluza, Grzegorz J. Nalepa, Tomasz Potempa, pages 931 – 934. Abstract. Modeling business processes has become a challenging issue of today's Knowledge Management and, as such, a core activity of Knowledge Engineering. There are two principal approaches to modeling such processes, namely the Business Process Model and Notation (BPMN) and Business Rules (BR). The two approaches are to a certain degree complementary, but BPMN appears to be becoming a standard supported by the OMG. In this paper we investigate how to build a logical model of BPMN using logic, logic programming and rules. The main focus is on a logical reconstruction of BPMN semantics, which is necessary to define formal requirements on model correctness and thus enable formal verification of such models.
  • Perspectives of Using Temporal Logics for Knowledge Management
    17 knowledge management, computer system, temporal logic Maria Mach-Król, pages 935 – 938. Abstract. The paper concerns the possibility of using temporal logics for knowledge management. The idea of knowledge management is presented, along with the most typical computer solutions for this area. The temporal aspect of knowledge management is pointed out. With this temporal aspect in mind, the paper presents the possible advantages of extending knowledge representation for knowledge management with temporal formalisms.
  • Problems of automatic processing and analysis of information from legal texts
    401 legal text processing, similarity analysis, LSA algorithm, dimensionality reduction Tomasz Pełech-Pilichowski, Piotr Potiopa, Wojciech Cyrul, pages 939 – 943. Abstract. The paper investigates problems of legal information digitization. Conditions for extracting information from legal texts (inter alia, legal acts), as distinct from the processing of common, non-legal terms, are outlined. Problems of dimensionality reduction and the application of similarity measures are discussed. Sample results of a similarity analysis are presented, and further research aimed at the semantic analysis of legal texts is outlined.
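    The LSA pipeline named in the keywords amounts to a term-document matrix, a rank-reduced SVD for dimensionality reduction, and cosine similarity in the reduced space. A compact sketch with scikit-learn on toy stand-ins for legal snippets (texts and the number of components are illustrative):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [                                     # toy stand-ins for legal acts
            "the lessee shall pay rent to the lessor monthly",
            "monthly rent is payable by the tenant to the landlord",
            "the employer shall provide safety training to employees",
        ]
        tfidf = TfidfVectorizer().fit_transform(docs)
        lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
        print(cosine_similarity(lsa).round(2))       # pairwise document similarity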
  • Innovation in Business Intelligence Systems: Spatial Component for Advanced Threat and Risk Analysis
    113 Business Intelligence Systems, Spatial, Threat and Risk Analysis Mirza Ponjavic, Almir Karabegovic, pages 945 – 948. Abstract. This paper shows an innovative approach to implementing Business Intelligence systems for advanced threat and risk analysis using a spatial component. It demonstrates how to improve the intelligence of the complete information system by adding a spatial extension. Much of the business data in data warehouses is spatial per se, and without this component an analysis misses a very important dimension of the data's nature. On the other hand, a frequent problem in enterprise data warehouses is creating relations between tables that come from different sources and share no common attributes; this can easily be solved by a spatial relation. This paradigm of spatialization implies changing the overall system architecture, from data storage through retrieval to the presentation mechanism. Particular benefits of this approach for threat and risk analysis are the effective use of location data, advanced spatial analysis techniques and greater variety in data visualization. Examples of organizations that need such a system are intelligence agencies, emergency services and epidemiology centers.
  • Model Driven Architecture and classification of business rules modelling languages
    400 MDA, Semantic Webs, business rules Iwona Skalna, Bartłomiej Gaweł, pages 949 – 952. Abstract. An organisation's activity under conditions of dynamic change requires continuous improvement of business practices, which implies the necessity of refining the decision-making process. Business rules [T. Halpin, “Business Rules and Object Role Modeling,” in: issue of Database Programming & Design, vol. 9, no. 10, 1996, pp. 66–72], [T. Morgan, Business Rules and Information Systems, Boston: Addison-Wesley Publishing, 2002] enable experts to transfer enterprise strategy onto the operational level using simple sentences which, in turn, can automate reactions to subsequent events both inside and outside the organisation. The main advantages of business rules are their simplicity and flexibility, which make them easy to apply in different organisations for different purposes. In order to represent knowledge in a pseudo-natural language understandable to information systems (business rules engines), notation and description standards are required. In this study, an overview and classification of the most popular business rules description languages are presented.

10th Conference on Advanced Information Technologies for Management

  • Bayesian networks in business analytics
    190 Bayesian networks, decision support Michael Ashcroft, pages 955 – 961. Abstract. Bayesian networks are a popular and powerful tool in artificial intelligence. They have many applications in commercial decision support. The point of this paper is to provide an overview of the techniques involved from this perspective. We will proceed by giving a simplified mathematical overview of what Bayesian networks are and the flavors they come in. We then look at how they can be created or learnt from data and the situations that lead to the use of ensemble models. Then we look at how an application of such a technology would proceed, using the human resources example of talent retention for international firms in China, examining the full process rather than technology specific elements. Finally we look at the outputs that would be generated from such an application.
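    To give a flavor of how such a network answers decision-support queries, consider a two-node model in the spirit of the talent-retention example: salary band influences whether an employee leaves. The numbers below are invented purely for illustration:

        # Tiny Bayesian network: SalaryBand -> Leaves, inference by enumeration.
        p_band = {"low": 0.3, "mid": 0.5, "high": 0.2}          # P(SalaryBand)
        p_leave = {"low": 0.40, "mid": 0.15, "high": 0.05}      # P(Leaves=yes | band)

        # Marginal probability that an employee leaves.
        p_yes = sum(p_band[b] * p_leave[b] for b in p_band)

        # Posterior over salary band given that the employee left (Bayes' rule).
        posterior = {b: p_band[b] * p_leave[b] / p_yes for b in p_band}
        print(f"P(leaves) = {p_yes:.3f}")
        print({b: round(p, 3) for b, p in posterior.items()})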
  • A-Trader – consulting agent platform for stock exchange gamblers
    269 multi-agent system, financial markets, artificial intelligence, time series analysis Maciej Bac, Jerzy Korczak, Aleksander Fafuła, Krzysztof Drelczuk, pages 963 – 968. Abstract. The authors of this paper present the architecture of a multi-agent system that supports investment decisions. The individual components of the system, the manner of communication between them, and the mechanism for assessing the individual agents are discussed. Combining the agents' open/close position signals and relearning on selected data creates a never-ending learning concept. New methods of transforming financial time series, a behavioural model of stock exchange gamblers, and ways of translating the modelled patterns into open and close position signals are described. The results of the research and the directions of further development of the platform are provided in the conclusion.
  • The Use of Business Intelligence Systems in Healthcare Organizations in Poland
    76 business intelligence, healthcare, HIS, EHR Celina Olszak, Kornelia Batko, pages 969 – 976. Abstract. Interest in applications of Business Intelligence (BI) in different areas of the economy has been growing from year to year, and in recent years it has been increasing in Poland as well. A relatively new area of application for these systems is healthcare. Intelligent techniques provide effective computational methods and a robust environment for business intelligence in the healthcare domain. This seems especially important given that much of the data stored in the various systems used by healthcare organizations resides in proprietary silos, which makes access difficult [1]. It is worth noting that the use of BI systems is determined by the efficiency of the intelligent techniques, methodologies and tools. This paper discusses the essence of BI, the characteristics of the healthcare sector and potential applications of BI systems in that sector. Tools and examples of BI systems used in the healthcare sector are also presented.
  • Changes in informatization strategies of Polish companies and institutions in reaction to the economic crisis. Summary of the surveys from the years 2009-2011
    244 economic crisis, comparative surveys, changes of informatization strategies, statistical analysis Mirosław Dyczkowski, Tomasz Dyczkowski, pages 977 – 985. Abstract. The paper discusses the results of comparative surveys from the years 2009-2011 which aimed at determining how the recent economic crisis had influenced informatization strategies in Polish companies and institutions. The obtained results supported the working hypothesis that the economic crisis affected, to a smaller or greater extent, the short- and long-term informatization strategies of the majority of companies and institutions. Even if the relative importance of the identified changes, the intensity of their visible symptoms and the areas of IT application where they were most noticeable differed in subsequent years, those variations were minor and concerned only some elements of informatization strategies and IT applications. Details are included in the paper.
  • Cloud-Based Content Centric Storage for Large Systems
    331 Cloud Storage, Metadata, Video Production, Michael C. Jaeger, Alberto Messina, Mirko Lorenz, Spyridon V. Gogouvitis, Dimosthenis Kyriazis, Elliot K. Kolodner, Xiaomeng Su, Enver Bahar, pages 987 – 994. Abstract. Content centric storage refers to a paradigm where applications access data objects through information about their content, rather than their path in a hierarchical structure. The application does not need any knowledge about the data store organization, or the place in a storage hierarchy. Rather, the application is able to query for the desired content based on metadata associated with the data objects.
    We illustrate the need for content centric storage and access with an example from a media production application. Our approach to building an industrial-strength content centric store employs a cloud storage technology in which metadata is treated as a first-class citizen, and extends this technology with an API layer that manages and leverages metadata about content.
  • Online Multi-bilateral Negotiations and Multi-attribute Reverse Auctions: An Experimental Study of Concession-making
    374 auctions, negotiations, concession making, multiattribute auctions, online auctions, e-negotiations, decision support, experimental study, e-procurement Gregory Kersten, Dmitry Gimon, Shikui Wu, pages 995 – 1002. Abstract. Concession-making plays an important tactical role in interactions among business partners. In multi-issue negotiations, a concession refers to the amount of utility a party decides to give up by making its next offer. In multi-attribute auctions, a concession is reflected in the next bid made by a bidding party that abides by the rules of a given auction mechanism. The purpose of this work is to share insights into concession-making behavior in multi-bilateral multi-issue negotiations versus multi-attribute reverse auctions. To this end, experiments featuring auction and negotiation mechanisms have been conducted. One finding indicates that participants in auctions tend to make larger concessions than those involved in negotiations. Another finding shows that the negotiators' effort to make a concession may not be perceived by their counterparts.
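    Under the utility-based definition used in the abstract, party $a$'s concession at round $t$ can be written as the drop in its own utility between consecutive offers (the notation is assumed for illustration, not taken from the paper):

        c_t^{a} = U_a\left(o_{t-1}^{a}\right) - U_a\left(o_t^{a}\right), \qquad c_t^{a} \ge 0,

    where $U_a$ is party $a$'s utility function and $o_t^{a}$ the offer it makes at round $t$.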
  • Intelligent Dashboard for SME Managers. Architecture and functions
    251 Intelligent Dashboard for Managers, business intelligence, visual data exploration, economic and financial knowledge Jerzy Korczak, Helena Dudycz, Mirosław Dyczkowski, pages 1003 – 1007. Abstract. The article presents the main features of the InKoM project, whose aim is the realization and implementation of an Intelligent Dashboard for Managers. The project is conducted by a consortium led by the University of Economics in Wroclaw, whose other principal member is the company UNIT4 TETA BI Center; Credit Agricole Polska also participates in the project. Within the project, an innovative Intelligent Dashboard for Managers will be developed, which is on the one hand a complement to, and on the other a development of, the TETA BI system, the set of business intelligence tools offered on the market by UNIT4 TETA BI Center. The innovativeness of the InKoM system lies particularly in the wide use of methods, techniques and tools for visual data mining of economic and financial knowledge. InKoM will offer managers, especially from micro, small and medium enterprises, analytical and information functions not previously available to them, thereby increasing the quality, effectiveness and efficiency of the decision-making process.
  • Towards the Development of an Automated, Web-based, Horizon Scanning System
    214 Horizon scanning; Web search engines; information retrieval; emerging risks; business intelligence Marco Palomino, Tim Taylor, Richard Owen, pages 1009 – 1016. Abstract. Horizon scanning is an increasingly important part of management decision making in all sectors. It involves the systematic search for incipient trends, opportunities and constraints that might affect the probability of achieving management goals and objectives. This requires the continuous acquisition of up-to-date information to anticipate issues, collect data about them and thus inform critical decisions. Although horizon scanning has its roots in the pre-electronic information era, it has blossomed with the availability of electronic databases and Web-based information. In this paper, we propose the implementation of a horizon scanning system centred on the use of keyword-based Web search engines. Leveraging the existing infrastructure of proven search engines, our system aims to automate the human-intensive process of seeking information and emerging trends. A prototype application that integrates the software that we plan to use has been developed to accompany this paper, and we discuss the potential for its application.
  • Good practices in requirements, project and risk management in educational IT projects
    327 change management, domain, e-experiment in physics, education, executive support stakeholders', good practices, project management, risk analysis, requirement engineering, social risk, stakeholder, triple constraints, validation and verification Małgorzata Alicja Płotka, Paweł Syty, pages 1017 – 1021. Abstract. One can find many learning aids and simulations of physical phenomena on the market, provided as standalone applications or as parts of educational packages. However, only a few of them allow the building of interactive experiments: experiments similar to those that should be conducted in physics laboratories at schools. Gdańsk University of Technology decided to fill this market niche by designing and constructing a set of virtual experiments, so-called e-experiments. To avoid the common problems that have brought many IT products to failure, procedures were prepared in accordance with the best practices of software engineering. The paper describes the process of e-experiment development, paying special attention to requirements, project and risk management. The authors try to prove that by not shying away from difficult matters such as careful planning and risk analysis, success can and will be achieved.
  • Shadow IT Evaluation Model
    394 Shadow IT; Shadow IT Evaluation Model; User-driven; IT-Governance; End User Computing Christopher Rentrop, Stephan Zimmermann, pages 1023 – 1027. Abstract. Shadow IT describes the supplementing of “official” IT by autonomously developed IT systems, processes and organizational units located in the business departments. These systems are generally not known, accepted or supported by the official IT department. From the perspective of a company, its IT governance and its IT management, it is necessary to find a way to deal with this phenomenon. As part of an integrated methodology for controlling shadow IT, this paper presents an evaluation model for identified shadow IT instances.
  • CRM as integration environment of the process organization
    305 CRM, cloud, SOA, virtual organization, distributed organization, virtualization, organization management, IMIS Piotr Skopiński, Piotr Zaskórski, pages 1029 – 1033. Abstract. The paper attempts to identify the systemic aspects of using CRM tools for the integration of a distributed process organization. The overriding criterion for the applicability of CRM is increasing the efficiency of managing the virtual organization. The paper presents the requirements and restrictions for virtualizing access to technical, technological and information resources in the so-called “cloud” as a way to reduce costly IT investments, especially in SME-class organizations.
  • Consensus determining algorithm in multiagent decision support system with taking into consideration improving agent's knowledge
    167 knowledge improving, consensus methods, multiagent systems Jadwiga Sobieska-Karpińska, Marcin Hernes, pages 1035 – 1040. Abstract. This paper describes the use of consensus methods to improve agents' knowledge in multi-agent decision support systems. The problem of improving agents' knowledge, the structure of decisions, and the profile and criteria of consensus determination are presented in the first part of the article. Next, a two-stage algorithm for determining consensus is elaborated. Among other things, this algorithm shortens the time necessary to take a decision, limits the risk associated with this process, and increases the effectiveness of decision taking, since solutions generated by agents with an inadequate level of knowledge are not taken into account.
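    A consensus step of the two-stage kind described above can be sketched as: first filter out agents whose knowledge level is inadequate, then pick the decision closest to the remaining proposals. The decision structure, weights and threshold below are invented for illustration, not taken from the paper:

        import numpy as np

        # Each row: one agent's decision vector (e.g., buy/sell flags per instrument).
        proposals = np.array([[1, 0, 1, 1],
                              [1, 1, 1, 0],
                              [0, 0, 1, 1]])
        knowledge = np.array([0.9, 0.4, 0.8])   # assumed agent knowledge levels

        # Stage 1: drop agents below a knowledge threshold.
        kept = proposals[knowledge >= 0.5]

        # Stage 2: consensus = component-wise weighted majority of remaining agents.
        w = knowledge[knowledge >= 0.5][:, None]
        consensus = (np.sum(w * kept, axis=0) / w.sum() >= 0.5).astype(int)
        print("consensus decision:", consensus)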
  • Analysis and Implementation Phases in the Two-Segmental Model of Information Systems Lifecycle
    313 systems lifecycle, software packages, ERP systems Jędrzej Wieczorkowski, Przemysław Polak, pages 1041 – 1046. Abstract. Analysis and implementation phases in the lifecycle of ERP software packages involve many resources and are most relevant to the buyers of such software. Therefore, it is important for them to understand the objectives of those phases and activities involved with them. The two-segmental model proposed by the authors is aimed at better representation of the lifecycle of information systems. This article aims to demonstrate that the actual course of the two phases is better mapped in the proposed model. The classical waterfall model of the life cycle of information systems was used as a reference point for the investigation.
  • E-government Application at the Regional Level in Poland–the Case of SEKAP
    80 e-government, Poland, A2B, A2C, A2A Ewa Ziemba, Tomasz Papaj, pages 1047 – 1054. Abstract. The aim of this paper is to explore the e-government concept and to present and assess the application of e-government in Upper Silesia (the Silesian Voivodship), Poland. In the cognitive part, the essence of e-government and the initiatives for building e-government in Europe and Poland are identified. In the empirical part, the Electronic Communication System for Public Administration (SEKAP) is presented as an example of e-government “good practice”, and a diagnosis of the SEKAP application is given. The results can be useful in undertaking activities aimed at e-government development in a country and in particular regions.
  • Semantic Web Recommendation Application
    278 web 2.0, semantic search, recommender system, semantic content creation, mobile applications Szymon Łazaruk, Jakub Dzikowski, Monika Kaczmarek, Witold Abramowicz, pages 1055 – 1062. Abstract. This paper focuses on a semantically-enhanced Social Web recommendation application, called Taste It! Try It! It is a mobile restaurant review and recommendation application based on a Linked Data source and integrated with a social network. The application consumes Linked Data (while the reviews are created), produces semantic annotations (about the reviewed entities) and then queries the gathered data in order to offer personalized recommendations.

Information Technology for Disabilities

  • Research on improving communication between the blind and the sighted in the area of mathematics, and related requirements
    121 Braille, mathematic notation, Braille mathematical texts, Blind students, Braille technology Jolanta Brzostek-Pawłowska, Dariusz Mikułowski, pages 1065 – 1069. Abstract. Attempts to allow the blind to read and write mathematical texts have been made for many years, and such research is conducted both in Poland and abroad. Nevertheless, there is still no comprehensive solution to this problem that satisfies users thoroughly. The greatest difficulty facing blind and sighted individuals working together on texts containing mathematical formulae is that in sighted people's notation these expressions take the form of two-dimensional arrangements of graphical symbols, while Braille provides a linear, often context-dependent notation for such expressions. This paper presents identified cases of work and education in mathematics that require new technologies to improve cooperation between the blind and sighted people who know neither Braille nor Braille Mathematical Notation. The research initiated by the Institute of Mathematical Machines discussed herein aims to develop innovative technologies that will improve communication, Web-based and otherwise, between blind and sighted individuals in the area of mathematics.
  • Gaussian Hand Gesture Recognition Based Mobility Device Controller
    43 Gaussian, Hand Gesture, Recognition, control algorithm Rytis Maskeliunas, Vidas Raudonis, Paulius Lengvenis, pages 1071 – 1074. Abstract. The development and investigation of an alternative mobility device control is presented in this work. The system uses 2D visual information acquired from an ordinary webcam and controls the electrical drives of the mobility device by tracking and recognizing hand gestures. Hand tracking is achieved by an algorithm that combines two methods: a statistical Gaussian method and a discrete Fourier transform. The proposed algorithm is adaptive and flexible, allowing unique gesture commands that depend on the person's motor abilities. Experimental investigation demonstrates the robustness, performance and high accuracy of the proposed mobility controller.
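    The statistical Gaussian part of such a tracker can be as simple as fitting a Gaussian to hand-pixel colors and thresholding the Mahalanobis distance of new pixels. A sketch with synthetic training pixels (a real system would train on labeled camera frames; the color space and threshold are assumptions):

        import numpy as np

        rng = np.random.default_rng(3)
        # Synthetic stand-in for labeled hand pixels in a 2-D chrominance space.
        hand_pixels = rng.normal(loc=[150, 120], scale=[8, 6], size=(500, 2))

        mu = hand_pixels.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(hand_pixels, rowvar=False))

        def is_hand(pixel, thresh=9.0):
            """Mahalanobis-distance test against the Gaussian hand-color model."""
            d = pixel - mu
            return float(d @ cov_inv @ d) < thresh

        print(is_hand(np.array([151, 119])))   # near the model mean -> True
        print(is_hand(np.array([80, 200])))    # far from the model   -> False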
  • Voice controlled environment for the assistive tools and living space control
    158 assistive tools, voice user interface, speech engine, smart home, universal platform Vytautas Rudzionis, Rytis Maskeliunas, Kestutis Driaunys, pages 1075 – 1080. Abstract. This paper describes our efforts in developing a smart home environment for assistive living. The key element of the smart environment is a ubiquitous voice user interface with several additional capabilities (such as the recognition of several gestures). This work is a further development of a voice-controlled wheelchair. The availability of commercial speech recognition engines, and our experience in adapting a foreign-language engine to recognize Lithuanian voice commands, suggested expanding the platform to include the possibility of controlling various devices in the living space. The key features of the proposed platform are its universal nature, the possibility of adapting it to personal needs, and the economical solutions used. The platform was developed using inexpensive hardware and software elements available on the market. Field tests with several sets of voice commands used by people with motor disabilities showed the high robustness of the proposed platform.

Workshop on Information Technologies for Logistics

  • LOGICAL—Development of Cloud Computing Platforms and Tools for Logistics Hubs and Communities
    314 Cloud Computing, LOGICAL, Logistics Cloud, Logistics Mall Uwe Arnold, Jan Oberländer, Björn Schwarzbach, pages 1083 – 1090. Abstract. Logistics service providers (LSP) are facing an increasing complexity of the logistics sector, i.e. growing levels of process fragmentation plus increasing speed, customization and service demands of logistics clients. Adequate powerful, integrated ICT infrastructure and tools are a prerequisite for keeping pace with the ever increasing service level demands within international logistics. The Cloud Computing technology offers significant advantages for data, process and service management and integration. To cope with the related innovation and migration needs, the Central Europe project LOGICAL focuses on the development and implementation of innovative cloud computing technologies. Special attention is devoted to international cooperation of SME-size LSPs. This paper introduces the conceptual basics of LOGICAL, basic use cases requested by the LSPs, and the addressed target groups of LOGICAL clouds. The results of an extensive user survey and demand analysis are presented as well as the related consequences for the cloud architecture.
  • A data mining approach for bill of material motor revision
    312 Data Mining, Clustering algorithms, Marriott criterion, Supply Chain Management Francesco Maiorana, Angelo Mongioj, pages 1091 – 1096. Abstract. Supply chain management is a core business process and has recently become a focus of competitive analysis. Business enterprises are overloaded with data, so using data mining techniques to transform the vast amount of data into meaningful information can be beneficial. We present a data mining approach to inventory forecasting and bill-of-material planning in a highly competitive environment: an Italian car racing team. By exploiting clustering algorithms, and by using statistical techniques to identify the optimal number of clusters, we optimally clustered a multi-year dataset containing the products used in car revisions after each rally competition over a three-year period. The bill of materials was used as input for Material Resource Planning.
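    Marriott's criterion, listed in the keywords, selects the number of clusters g that minimizes g²|W_g|, where W_g is the pooled within-cluster dispersion matrix. A sketch with scikit-learn's k-means on synthetic data (the data, the range of g and the use of k-means are assumptions for illustration):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(c, 0.5, size=(60, 3)) for c in (0, 4, 8)])

        def marriott(X, g):
            labels = KMeans(n_clusters=g, n_init=10, random_state=0).fit_predict(X)
            # Pooled within-cluster scatter matrix W_g.
            W = sum(np.cov(X[labels == k], rowvar=False) * (np.sum(labels == k) - 1)
                    for k in range(g))
            return g**2 * np.linalg.det(W)

        scores = {g: marriott(X, g) for g in range(2, 7)}
        print("best g by Marriott:", min(scores, key=scores.get))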
  • Recent Developments with Single Wagon Load Services, Policy and Practice in Europe
    58 rail freight, SWL, ICT Marin Marinov, Clare Woroniuk, Thomas Zunder, pages 1097 – 1104. Abstract. This research aims to gain an understanding of how Single Wagonload (SWL) services, policy and practice can benefit from the implementation of scientific methods and information technologies. For the purposes of this research a discussion on EU rail freight transport and current SWL trends is presented. An evaluation of EU rail freight policy is offered, followed by a discussion on policy measures to assist SWL growth. A review of scientific methods and models for rail freight planning is also presented that concludes that there is a need to update freight planning models in order to better support decisions for effective rail freight services and address current rail freight needs. An overview of information technology available for SWL is offered as well that suggests that it is possible for information communication technologies (ICT) implementation to benefit SWL operations, namely the level of efficiency. A lack of standards and integration should be addressed, however, in order for SWL and rail freight in Europe to gain further benefits.
  • Applying Linked Data Concepts in BPM
    258 BPM, SCM, Linked Data, semantic data, choreography, logistics Silva Robak, Bogdan Franczyk, Marcin Robak, pages 1105 – 1110. Abstract. One of the contemporary problems in business networks of supply chains is information integration. The issues are related either to information interchange across incompatible, independently designed data systems or to the lack of a common semantic model in the domain. Networked supply chains need mechanisms to describe the choreographies of the cooperating business units. In this paper we analyze the possibilities of applying some of the Linked Data concepts to the interaction models for choreographies in business process management (BPM). We present our approach on a 4PL integrator example.
  • Cost optimization of supply chain with multimodal transport
    182 Optimization, multimodal transport, MILP, decision support Pawel Sitek, pages 1111 – 1118. Abstract. The article presents the problem of optimizing the supply chain from the perspective of a multimodal logistics provider and includes a mathematical model of multilevel cost optimization for a supply chain in the form of MILP (Mixed Integer Linear Programming). The costs of production, transport, distribution and environmental protection were adopted as an optimization criterion. Timing, volume, capacity and mode of transport were also taken into account. The model was implemented in the LINGO ver.12 package. The implementation details, the basics of LINGO as well as the results of the numerical tests are presented and discussed. The numerical experiments were carried out using sample data to show the possibilities of practical decision support and optimization of the supply chain.
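    A miniature version of such a MILP, choosing shipment volumes across two transport modes under capacity constraints and fixed mode-activation costs, can be stated in a few lines. The sketch below uses the PuLP modeller rather than LINGO, with invented costs and capacities:

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

        modes = ["road", "rail"]
        cost = {"road": 5.0, "rail": 3.0}        # cost per unit shipped (assumed)
        cap = {"road": 80, "rail": 60}           # capacity per mode (assumed)
        fixed = {"road": 100, "rail": 250}       # fixed cost of activating a mode
        demand = 100

        prob = LpProblem("supply_chain", LpMinimize)
        x = {m: LpVariable(f"x_{m}", lowBound=0) for m in modes}          # volume
        use = {m: LpVariable(f"use_{m}", cat="Binary") for m in modes}    # mode on?

        prob += lpSum(cost[m] * x[m] + fixed[m] * use[m] for m in modes)
        prob += lpSum(x[m] for m in modes) == demand
        for m in modes:
            prob += x[m] <= cap[m] * use[m]      # link volume to mode activation

        prob.solve()
        print({m: value(x[m]) for m in modes}, "cost:", value(prob.objective))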

8th Conference on Knowledge Acquisition and Management

  • A survey of data warehouse architectures: preliminary results
    369 data warehouse, architecture, architecture selection Moh'd Alsqour, Kamal Matouk, Mieczysław Lech Owoc, pages 1121 – 1126. Abstract. In this abridged version, which is a summary of the authors' study, we present some of our findings for consideration. The principal objective of the study was to investigate empirically the architectures of data warehouses (DW); more specifically, the types of architectures and a number of factors believed to influence their selection were explored. A questionnaire survey targeting information systems managers was used to collect data from 150 Polish companies about the respondents' firms, the architecture they use, and the factors which influence the selection of the architecture. The findings of the study give practical insights into the DW field in Poland.
  • Adaptive Conjoint Analysis. Training Data: Knowledge or Beliefs?
    32 adaptive conjoint analysis, preferences, preference change, beliefs, logic, belief revision, preference logics Adrian Giurca, Ingo Schmitt, Daniel Baier, pages 1127 – 1133. Abstract. The foundational idea of conjoint analysis is to model consumer purchase preferences by means of utility functions. Analysts run surveys and interviews to obtain a basic set of training data, typically user preferences onto which a utility function is mapped. Utility theory trusts the training data as knowledge, while a large body of literature emphasizes that user preferences may change, may be incomplete and are sometimes inconsistent. This paper argues for a logic-based model of conjoint analysis, in particular by proposing an alternative model of preferences as beliefs instead of fully trusted knowledge. We adopt the categorical-beliefs approach, but a quantitative, probabilistic approach may be considered as well. In the context of adaptive conjoint analysis, we identify three kinds of beliefs, describe a mechanism for mapping answers to beliefs, and provide the basis for belief update when new information arrives. Future work on our logic-based framework will focus on obtaining an optimal logic-based preference aggregation, including relaxing Pareto efficiency in Arrow's aggregation framework, and on researching non-prioritized belief revision in adaptive conjoint analysis.
  • Process mining challenges in hospital information systems
    376 process mining, hospital information system, complexity, health care Payam Homayounfar, pages 1135 – 1140. Abstract. Over the last years, hospital information systems have become more integrated. With their wide variety of subsystems, hospital information systems are highly suitable for applying process mining for knowledge discovery and process optimization. Applying process mining to hospital information systems is a modern and recommendable approach in health care. However, process mining techniques can only provide high-quality results if the structure of the data is known and the event logs are maintained appropriately. This paper describes process mining and hospital information systems, and shows where the challenges lie when the two areas are combined.
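    The basic object most process-mining techniques start from is the directly-follows relation extracted from an event log grouped by case. A sketch on a toy hospital log (the events are invented for illustration; real logs also carry timestamps and resources):

        from collections import Counter, defaultdict

        # Toy event log: (case_id, activity), already sorted by timestamp per case.
        log = [(1, "admission"), (1, "triage"), (1, "x-ray"), (1, "discharge"),
               (2, "admission"), (2, "triage"), (2, "lab test"), (2, "discharge")]

        traces = defaultdict(list)
        for case, activity in log:
            traces[case].append(activity)

        # Directly-follows relation: how often activity a is immediately followed by b.
        df = Counter()
        for acts in traces.values():
            df.update(zip(acts, acts[1:]))

        for (a, b), n in df.most_common():
            print(f"{a} -> {b}: {n}")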
  • The Knowledge Maturing Scorecard: A Model Based Approach for Managing the Innovation
    296 Knowledge Maturing, Innovation Management, Knowledge Space, Meta Models, Balance Scorecard, Intellectual Capital Vedran Hrgovcic, Gwen Wilke, pages 1141 – 1148. Abstract. The paper proposes a model based approach to manage knowledge based innovation aspects of an enterprise by applying Knowledge Maturing Scorecards. The Knowledge Maturing Scorecard approach is used to define, manage and visualize indicators for the innovation potential of established enterprise goals. The paper (1) introduces the conceptual background of Knowledge Maturing Scorecards, (2) proposes a generic modelling framework for managing knowledge based innovation, and (3) presents a web based prototype for the proposed model.
  • From Knowledge worker to Knowledge Cultivator-effective dynamics
    307 Knowledge Worker, Knowledge Management, Knowledge Cultivator, Innovation Ecosystems, Corporate Social Responsibility Gulgun Kayakutlu, Eunika Mercier-Laurent, pages 1149 – 1153. Abstract. As complexity increases in the business world, the concepts of value creation and the measures of success and sustainability are changing. More attention is given to the human resources critical for reducing the risks of managerial decisions. The right knowledge worker in the right place is no longer merely accumulating, sharing and using knowledge. As technology evolves and ecological risks increase, this critical role integrates individual and collaborative skills to learn and innovate by converting social network connections and feedback into value. This study analyses the skills and expectations associated with these critical roles and proposes a discussion of success measures for the Knowledge Cultivator. The suggested frameworks will facilitate evaluating the Knowledge Cultivator's performance. This new vision will be beneficial for managers, human resource experts and educators.
  • Using .tel domains to support knowledge management
    385 knowledge in enterprise, .tel domains, knowledge sharing, expert database Artur Kotwica, pages 1155 – 1158. Show abstract Abstract. The .tel domain seems to be a very good tool for constructing an expert database that serves both stationary and mobile devices. Such a database supports the combination process in I. Nonaka's model and the knowledge localization process in G. Probst's model.
  • Design of a Learner-Centered, Constructivism-Based Learning Process
    63 Constructivism based learning, Learner centered learning, Course development, Collaborative learning, Competencies based learning. Jeanne Schreurs, Ahmad Al-Huneidi, pages 1159 – 1164. Show abstract Abstract. Learner-centered learning is constructivism-based and competence-directed. We define general competencies, domain competencies and specific course competencies. Constructivism-based learning activities are grounded in constructivism theory. For each course module the intended learning level is defined. A model is built for the design of a learner-centered, constructivism-based and competency-directed learning process. Its application in two courses is presented.
  • Models of information and knowledge transfer in IT outsourcing projects
    39 IT outsourcing, knowledge transfer, knowledge management Małgorzata Sobińska, Kazimierz Perechuda, pages 1165 – 1169. Show abstract Abstract. The authors' research attempts to define the role of knowledge management in the customer-supplier relationship, with particular emphasis on IT knowledge outsourcing. A further aim is the verification of the following research hypotheses:
    • In the classical outsourcing contract there is no critical knowledge transfer (this is not the aim of an outsourcing project);
    • Outsourcing should be supplemented with key knowledge transfer in both directions, from the client to the vendor and from the vendor to the client;
    • Outsourcing based on key knowledge transfer is an essential instrument for the creation and business development of knowledge-based virtual organizations.
    The result of the research will be models of IT knowledge outsourcing. The potential areas of application of IT outsourcing and IT knowledge outsourcing will be shown, together with the benefits of such projects in the context of knowledge management.
  • Web User Navigation Patterns Discovery from WWW Server Log Files
    326 web mining, knowledge management, web user navigation patterns Paweł Weichbroth, Mieczysław Lech Owoc, Michał Pleszkun, pages 1171 – 1176. Show abstract Abstract. The continued growth in the number of users and in the volume of content shared on Web sites makes it necessary to adjust content to users' needs automatically. In the Web Mining literature, such actions are referred to as personalization and recommendation, and they improve the visibility of the presented content. To perform actions that correspond to users' expected needs, we can utilize web server log files. Mining such data under accurate constraints can lead to the discovery of web user navigation patterns. Such knowledge is used by personalization and recommendation systems (PRS) to act upon user behavior during a visit to the web portal. In this paper we present a system framework for mining web user navigation patterns for knowledge management. We focus on constraints, which are critical factors in evaluating the effectiveness of the implemented algorithm. These constraints can also be perceived as knowledge validation criteria with respect to adequacy; thus only adequate knowledge is added to the existing PRS knowledge base.
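    As background for the kind of preprocessing such log mining requires, the following is a minimal sketch that sessionizes a common-format server log and counts consecutive page pairs. The log format, the 30-minute session timeout and the bigram pattern length are illustrative assumptions, not details taken from the paper.

      # Minimal sketch: sessionize a web server log, then count navigation bigrams.
      import re
      from collections import Counter, defaultdict
      from datetime import datetime, timedelta

      LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "GET (\S+)')  # common log format
      TIMEOUT = timedelta(minutes=30)          # assumed session timeout

      def sessions(lines):
          last_seen, current = {}, defaultdict(list)
          for line in lines:
              m = LOG_RE.match(line)
              if not m:
                  continue
              ip, ts, url = m.groups()
              t = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
              if ip in last_seen and t - last_seen[ip] > TIMEOUT:
                  yield current.pop(ip)        # session expired: emit it
              last_seen[ip] = t
              current[ip].append(url)
          yield from current.values()          # flush sessions still open

      def top_patterns(log_lines, k=10):
          counts = Counter()
          for s in sessions(log_lines):
              counts.update(zip(s, s[1:]))     # consecutive page pairs
          return counts.most_common(k)

    A real system would add constraints (minimum support, page filters) on top of such counts before admitting patterns into a knowledge base.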

Techniques and Applications for Mobile Commerce

  • Security feeling of mobile phone users
    207 Security, Mental Models, Human Factors Zinaida Benenson, Olaf Kroll-Peters, Matthias Krupp, pages 1179 – 1183. Show abstract Abstract. Mobile devices are becoming more and more powerful, and therefore the number of potential security threats is increasing. Although research has shown that users' responsibility for IT security is important, work in the mobile area concentrates mostly on technical security measures. In our paper we take a first step in examining the role of users in the IT security of mobile devices by creating mental models based on the results of interviews. Mental models are representations of real-world things in people's minds. Although these representations are often inaccurate or faulty, they can be used to predict people's future actions. Mental models of IT security form the basis of users' efforts to ensure the IT security of their devices.
  • Context Aware Services for Mobile Users: JQMobile vs Flash Builder Implementations
    256 ubiquitous systems, location intelligence, context awareness Alfio Costanzo, Alberto Faro, Daniela Giordano, Concetto Spampinato, pages 1185 – 1192. Show abstract Abstract. Current mobility information systems generally lack the basic context information that helps people in time-changing environments: they do not take into account current traffic, weather or car pollution conditions, nor are mobile users informed in a timely manner about accidents or road repairs. Such systems also do not consider the user's personal sphere, such as age and health status, which highly influences context awareness. The aim of the paper is to illustrate how context-aware services may be offered by an implementation architecture based on a server following the Model-View-Controller paradigm, i.e., Ruby on Rails (RoR), whose controllers implement the user stories of the mobile users, and whose views are JQMobile scripts that provide for each story the most suitable scenario through a user interface based on the familiar Google Maps. How Flash Builder applications resident on the users' mobile devices may provide the users with views similar to the RoR ones, while saving RoR time, is also discussed at length. Furthermore, the paper claims that actively involving the mobile devices in the mobility information system may more effectively support the context-aware decisions of the mobile users.
  • Energy optimisation of the wireless access network through aggregation of mobile terminals
    186 Distributed AI, Energy aware systems, bio-inspired optimisation Hanno Hildmann, Sebastien Nicolas, Fabrice Saffre, pages 1193 – 1198. Show abstract Abstract. We suggest assigning mobile terminals (MTs) to base stations (BSs) such that the number of expendable BSs is maximised. The investigation uses an existing implementation as a base case and examines a variety of specific scenarios. The problem is formally defined, and (simulated) computational results are provided for the individual test scenarios.
  • Improving Mobile Device Interaction by Eye Tracking Analysis
    383 Eye tracking, mobile device, Haar detector, CAMSHIFT Carmelo Pino, Isaak Kavasidis, pages 1199 – 1202. Show abstract Abstract. This paper describes a non-intrusive eye-tracking tool for mobile devices that uses images acquired by the front camera of the iPhone and iPod Touch. By tracking the user's gaze and mapping it to the smartphone's screen coordinates, the user can interact with the device in a more natural and spontaneous way. The application uses a Haar-classifier-based detection module to identify the eyes in the acquired images and subsequently the CAMSHIFT algorithm to track the eye movement and detect the user's gaze. The performance of the proposed tool was evaluated by testing the system on 16 users; the results showed that it detected the users' gaze correctly about 79% of the time.
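    The detect-then-track pipeline described above can be illustrated with OpenCV on a desktop webcam. This is a minimal sketch, not the authors' iPhone implementation; the bundled cascade file, histogram size and termination criteria are assumptions.

      # Minimal sketch: Haar detection of an eye region, then CAMSHIFT tracking.
      import cv2

      eye_cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_eye.xml")
      term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

      cap = cv2.VideoCapture(0)                  # webcam stands in for the front camera
      window = hist = None
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          if window is None:                     # detection phase (Haar classifier)
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              eyes = eye_cascade.detectMultiScale(gray, 1.3, 5)
              if len(eyes):
                  x, y, w, h = eyes[0]
                  window = (x, y, w, h)
                  roi = hsv[y:y + h, x:x + w]
                  hist = cv2.calcHist([roi], [0], None, [16], [0, 180])
                  cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
          else:                                  # tracking phase (CAMSHIFT)
              prob = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
              box, window = cv2.CamShift(prob, window, term)
              cv2.ellipse(frame, box, (0, 255, 0), 2)
          cv2.imshow("gaze", frame)
          if cv2.waitKey(1) == 27:               # Esc quits
              break
      cap.release()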

Workshop on Agent Based Computing: from Model to Implementation

  • Agent-oriented Integration of Body Sensor Networks and Building Sensor Networks
    406 Giancarlo Fortino, Raffaele Gravina, Antonio Guerrieri, pages 1207 – 1214. Show abstract Abstract. In this paper we propose an agent-based approach for the integration of building networks (BNs) and body sensor networks (BSNs). Such an integration has the potential to enable a novel set of smart environments for ambient assisted living, further enhancing the concept of smart buildings. Specifically, inhabitants of buildings who are equipped with a BSN can expose their real and virtual sensed data to the BN, which can use them to effectively support differentiated services: people identification and localization, information exchange, safety, security, and context-aware personal support. The proposed approach currently uses a gateway developed in JADE that interfaces BNs based on the BMF framework with BSNs based on the SPINE framework. A system use case is shown that elucidates the BN/BSN integration based on the agent gateway. Finally, an evaluation of the overhead introduced by the defined application-level solution is also provided.
  • Multiagent scheme for voice conference moderation
    264 Auctions, Social welfare, Moderation Adam Połomski, pages 1215 – 1220. Show abstract Abstract. Conferencing systems utilising text messages as the communication medium have been around for many years. Since moderation in highly populated social platforms is more of a necessity than a luxury, many different mechanisms exist, functioning with varying effectiveness. As Internet bandwidth becomes more and more accessible, voice-over-IP communication is gaining in popularity. Multiuser voice conferencing platforms raise the need for a different type of moderation mechanism. Determining a fair moderation scheme which would at the same time maximise the overall discussion quality for each participant is not a trivial task. We introduce a multiagent model for voice conference moderation which utilises Vickrey auctions as the resource allocation procedure (a sketch of the second-price rule appears after this list). By treating the communication channel as a resource with equally shared ownership, we enforce that the rules of a fair discussion are fulfilled.
  • Method for Rapid Prototyping of Societal Information Systems
    310 Agent-oriented software engineering, modeling, design, prototyping, simulation, healthcare Kuldar Taveter, Hongying Du, Michael N. Huhns, pages 1221 – 1228. Show abstract Abstract. We first define and explain the notion of societal information systems. Thereafter we introduce a design method appropriate for developing societal information systems: agent-oriented modeling. Next, we describe a “proof-of-concept” case study of applying agent-oriented modeling to rapid prototyping of a societal information system for finding an appropriate physician. In the description, we first present analysis models and then show how they can be mapped to the respective design models. Finally, we explain how the resulting design constructs can be turned into the programming constructs of NetLogo for rapid prototyping. The article finishes by drawing conclusions on designing societal information systems.
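    The second-price rule referenced in the voice-moderation abstract above is simple to state in code. The following is a minimal sketch of one allocation round; the agent names and the budget bookkeeping are illustrative assumptions, not the paper's protocol.

      # Minimal sketch: one Vickrey (second-price, sealed-bid) round for the channel.
      def vickrey_round(bids):
          """bids: dict agent -> bid. Returns (winner, price paid)."""
          ranked = sorted(bids, key=bids.get, reverse=True)
          winner = ranked[0]
          price = bids[ranked[1]] if len(ranked) > 1 else 0
          return winner, price

      budgets = {"alice": 10, "bob": 10, "carol": 10}
      bids = {"alice": 4, "bob": 7, "carol": 5}   # truthful bids are optimal here
      speaker, cost = vickrey_round(bids)
      budgets[speaker] -= cost                    # bob wins the floor, pays 5
      print(speaker, cost, budgets)

    Truthful bidding is a dominant strategy under this rule, which is what makes it attractive as a fair moderation primitive.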

International Workshop on Multi-Agent Systems and Simulation

  • Hybrid Multi-Agent system simulations: cognitive and social agents
    230 multiagent based simulation; social simulation; cognitive modeling Alberto Caballero, Juan Botía, pages 1231 – 1238. Show abstract Abstract. Simulating social and cognitive agent abilities is a very important aspect of agent-based computing. Multi-Agent Based Social Simulations (MABS) could benefit from incorporating cognitive behaviours. A hybrid simulation approach that considers both social and cognitive abilities provides a more realistic basis for modelling agents and their social interactions. But how can social and cognitive behaviours be integrated in MABS? And is it always advantageous to use cognitive capabilities in social simulations?
    This paper offers a set of general considerations about how cognitive capabilities could be integrated into social multi-agent simulations. It points out the most relevant cognitive requirements of social simulations for a wide range of real scenarios in which some agents carry out cognitive processing while others (the great majority) behave reactively. The suitability of several alternatives for integrating the social and cognitive capabilities of agents is discussed. The paper also offers several efficiency-related arguments and recommendations for using one of the three considered approaches.
  • Simulating a Societal Information System for Healthcare
    223 agent-oriented modeling, societal information system, simulation, healthcare Hongying Du, Kuldar Taveter, Michael N. Huhns, pages 1239 – 1246. Show abstract Abstract. Societal information systems are intended to assist the members of a society in dealing with the complexities of their interactions with each other, especially regarding the resources they share. Because the members are distributed and autonomous, we believe that software agents, having these same characteristics, are a natural basis for representing the members and their interests in a societal information system. This paper describes a simulation of an agent-based societal information system for healthcare. Our design methodology is based on agent-oriented modeling. We apply this methodology for the analysis and design of the proposed system and its simulation. We execute the simulation to investigate four different strategies for choosing a physician combined with three waiting strategies in three common social network models. The results show that the societal information system can decrease the number of annual sick days per person by 0.42-1.84 days compared with randomly choosing a physician.
  • Sociodynamic Discrete Choice Applied to Travel Demand: Multi-Agent Based Simulation and Issues in Estimation
    75 Multi-agent based social simulation, Social influence, Heterogeneity, Choice behavior, Network density Elenna Dugundji, László Gulyás, pages 1247 – 1254. Show abstract Abstract. This paper discusses a multi-agent based model of binary choice behavior with interdependence of decision-makers' choices (a minimal sketch of this model class appears after this list). Analytical results established by other authors are briefly summarized where agent heterogeneity is not explicitly treated. Next, the well-known Erdős-Rényi network class is considered to introduce agent heterogeneity via an explicit local interaction structure. Then the model is applied in an example of intercity travel demand, using empirical data to introduce individual agent heterogeneity beyond that induced by the local interaction structure. Studying the long-run behavior of more than 120,000 multi-agent based simulation runs reveals that the initial estimation process can be highly sensitive to small variations in network instantiations. We show that this is an artifact of two issues in estimation, and highlight particular attention that is due at low network density and at high network density. Limitations of the present work are summarized and suggestions for future research efforts are outlined.
  • Immersive Face Validation: A new Validation Technique for Agent-based Simulation
    365 Validation, Immersive visualisation, Virtual Reality Athanasia Louloudi, Franziska Klügl, pages 1255 – 1260. Show abstract Abstract. This contribution proposes a new approach to validating agent-based simulation models. To this end, a novel face validation technique is presented that enables systematic plausibility checks by a human expert immersed in a fine-grained virtual reality environment that is an exact representation of the simulated multiagent model. It turns out that Immersive Face Validation is a technically feasible process which offers great insight into the behavioural context of individual agents.
  • Scalability and Robustness Analysis of a Multi-Agent based Self-healing Resource-flow System
    194 scalability; robustness; multi-agent based simulation; self-organizing; self-healing; decentralized coordination; stochastic model Thomas Preisler, Wolfgang Renz, pages 1261 – 1268. Show abstract Abstract. In resource-flow systems, e.g. production lines, resources are processed by agents that apply certain capabilities to them. Such systems profit from self-organization mechanisms like self-healing, as they become more robust against failures. In this paper the development of a decentralized coordination process for such a system is described. The system is realized as a multi-agent system for the purpose of simulating large systems. Furthermore, a stochastic model is developed and compared to the simulation results. The scalability and robustness of the proposed coordination process are shown by the good agreement between the simulation results and the analytic results for the stochastic model.
  • Multi-Agents in a Virtual Regional Landscape
    117 Multi-Agents, Large 3D terrains, Complex systems Harald Yndestad, Robin T Bye, Siebe van Albada, pages 1269 – 1274. Show abstract Abstract. Virtual Region More is a research project investigating how to utilize visualization and simulation methods as a planning tool for research and teaching in the region More on the west coast of Norway. The project was developed in a bottom-up process for integrating multi-agents into large 3D terrain models. Virtual Region More became an arena for testing agent models of shipping, fish farming, ecosystems, virus swarms and energy production.
    The most important result of the project was the recognition that simulating multi-agent systems in position-dependent landscapes leads to landscape-dependent complex systems. Multi-agents in large landscapes need generic methods for modelling generic agents. Generic multi-agents in large 3D landscapes need a system view that represents multi-agents in organizations, controlled by cost functions.
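    The sociodynamic discrete-choice paper above studies binary logit choice with local social influence on an Erdős-Rényi network. The following is a minimal sketch of that model class under assumed parameter values (n, p, beta, J and the number of update steps are all illustrative); it is not the authors' estimation procedure.

      # Minimal sketch: binary logit choice with social influence on an
      # Erdos-Renyi network (local-field utility in the Brock-Durlauf style).
      import math, random

      n, p, beta, J = 200, 0.05, 1.0, 0.8      # agents, edge prob., scale, influence
      nbrs = {i: set() for i in range(n)}
      for i in range(n):                        # sample G(n, p)
          for j in range(i + 1, n):
              if random.random() < p:
                  nbrs[i].add(j)
                  nbrs[j].add(i)

      x = [random.choice([-1, 1]) for _ in range(n)]   # choices coded -1 / +1
      for _ in range(100 * n):                  # asynchronous random updates
          i = random.randrange(n)
          field = sum(x[j] for j in nbrs[i]) / max(len(nbrs[i]), 1)
          u = J * field                         # systematic utility of choosing +1
          p_plus = 1 / (1 + math.exp(-2 * beta * u))   # binary logit probability
          x[i] = 1 if random.random() < p_plus else -1

      print("share choosing +1:", x.count(1) / n)

    Re-running this with fresh network draws illustrates the paper's point: long-run outcomes can be sensitive to small variations in the network instantiation.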

1st International Workshop on Smart Energy Networks & Multi-Agent Systems

  • OpenNode: A Smart Secondary Substation Node and its Integration in a Distribution Grid of the Future
    114 OpenNode, Smart grids, Distributed information systems, Distribution automation, Grid automation, Automation, AMM, AMI, DER, DSM, DMS, OSGi, IEC standards, Multi-agent systems, Automatic control, Telecontrol equipment, Distributed control, SCADA Marta Alberto, Raúl Soriano, Jürgen Götz, Ralf Mosshammer, Nicolás Espejo, Florent Leménager, Raúl Bachiller, pages 1277 – 1284. Show abstract Abstract. In this paper we present the EU project OpenNode from a technological point of view. OpenNode designs a massively distributed system (both in number of nodes and geographically) for managing the operations of a distribution smart grid. Our proposed system is designed to be a base platform on which to build a smart grid, able to manage AMM, AMI, DSM, DER, and EV integration. We describe the components, innovations, problems faced, and what we call the distribution of intelligence. Finally, we propose our system as a base platform on which to build MAS systems for distribution grid automation and management.
  • A Distributed Greedy Algorithm for Constraint-based Scheduling of Energy Resources
    31 distributed energy management, constraint handling, optimization, support vector machines, decoder methods Joerg Bremer, Michael Sonnenschein, pages 1285 – 1292. Show abstract Abstract. The current upheaval in the electricity sector is leading to distributed generation schemes and new grid structures. At the same time, this change is heading towards a paradigm shift in controlling energy resources within the grid. Proactive scheduling of active power within a group of distributed energy resources that is (from a controlling perspective) loosely coupled demands distributed optimization methods that take into account the individual feasible regions of the local search spaces. We propose a method that uses support-vector-based black-box models to construct feasible regions for automated, local solution repair during scheduling, and combine it with a distributed greedy approach for finding an appropriate partition of a desired target schedule into operable schedules for each participating actor (a sketch of the greedy partition idea appears after this list).
  • A Framework for Agent-Based Simulations of Hybrid Energy Infrastructures
    338 Multi-Agent Systems, Agent-based Simulations, Smart Grids, Future Energy Networks, Hybrid Energy Networks Christian Derksen, Cherif Branki, Rainer Unland, pages 1293 – 1299. Show abstract Abstract. A structured and systematic development of future energy networks strongly requires the application of multi-agent based simulations. This is not only because of the generally high interest in agents in current research and the consequent proximity between real and simulated systems. The requirement arises rather from the fact that classical and proven simulation techniques and tools are not able to represent the autonomous, communication-driven and highly diversified character of the future energy supply. Today's developments are basically driven by electrical Smart Grids, but it must be recognized that the interaction between different energy networks, such as natural gas or district heating, will increase strongly. Inspired by ideas and techniques like Mini-CHP or Power-to-Gas, it is expected that future energy networks will have to be seen as one entire hybrid energy network, in which quantities of energy and energy forms are dynamically exchanged and substituted as required.
    In this article we propose our framework Agent.GUI as a simulation toolkit and framework for such complex and diversified energy systems. Based on the well-known JADE platform, our framework provides a wealth of possibilities to model, implement, configure, execute and distribute multi-agent based simulations for hybrid energy networks.
  • What the Term Agent Stands for in the Smart Grid: Definition of Agents and Multi-Agent Systems from an Engineer's Perspective
    243 Smart Grid, Agent, Multi-Agent systems, optimizer, web-services, loop control, control structures Gregor Rohbogner, Simon Fey, Ulf Hahnel, Pascal Benoit, Bernhard Wille-Haussmann, pages 1301 – 1305. Show abstract Abstract. This paper aims to initiate a discussion of what an agent is in the context of the Smart Grid, not, as is usually done, from a computational perspective, but rather from an engineer's perspective. This discussion seems to be missing, judging by the questions that recur whenever Smart Grid researchers get in touch with agent technology: What is the difference between an optimizer or an energy management system and an agent? Why are web services not enough for a future Smart Grid control system? How are multi-agent systems structured? These are only some of the questions we discuss in order to arrive at an application-oriented definition of an “agent” that is understandable for Smart Grid researchers of various disciplines. Fostering such an interdisciplinary discussion seems essential when trying to convey the advantages of control systems based on multi-agent technologies.
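    To make the greedy partition idea from the scheduling paper above concrete, here is a minimal sketch in which each unit clips its share of the remaining target into a simple min/max feasible band. This box decoder is only a stand-in for the paper's support-vector-based model, and all unit names and numbers are assumptions.

      # Minimal sketch: greedily partition a target schedule among energy units,
      # repairing each share into the unit's feasible region.
      def repair(schedule, lo, hi):
          """Stand-in decoder: clip each time slot into the unit's feasible band."""
          return [min(max(v, lo), hi) for v in schedule]

      def greedy_partition(target, units):
          remaining = list(target)
          plan = {}
          for name, (lo, hi) in units.items():    # serve units one by one
              feasible = repair(remaining, lo, hi)
              plan[name] = feasible
              remaining = [r - f for r, f in zip(remaining, feasible)]
          return plan, remaining                  # remaining = unmet target

      target = [12.0, 9.0, 15.0, 7.0]             # desired active power per slot
      units = {"chp": (0.0, 6.0), "pv": (0.0, 5.0), "battery": (-2.0, 4.0)}
      plan, gap = greedy_partition(target, units)
      print(plan, "unmet:", gap)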

3rd International Workshop Automating Test Case Design, Selection and Evaluation

  • Formal specification to support advanced model based testing
    90 test automation, test case scenario generation, guidance application Karel Frajták, Miroslav Bureš, Ivan Jelínek, pages 1311 – 1314. Show abstract Abstract. Reliability and correctness of a web application are crucial to its success. Errors occurring at unpredictable moments drive away the application's target audience, yet it is practically impossible to deliver a 100% reliable application: hidden errors are discovered only when target users are using it. With proper tooling and support, the time between the discovery or report of an error and its elimination can be reduced. In this paper we propose a new approach to application testing based on a model of the system, assisted by an application for guiding the tester. The application will lead the tester through the testing process and provide better feedback.
  • Towards a Methodology for Testing of Business Processes
    201 business processes, service-oriented architecture, testing methodology, web services Sylvia Ilieva, Ilina Manova, Dessislava Petrova-Antonova, pages 1315 – 1322. Show abstract Abstract. Business processes naturally integrate web services implemented with different languages and technologies and executed in heterogeneous environments. Usually the integrated web services and the underlying infrastructure used to exchange messages are not under the control of the process architects. This affects development and particularly complicates testing. In this paper a methodology for testing business processes is presented, which aims to enable automatic test case generation for path-coverage functional testing, as well as to provide fault injection mechanisms for negative functional testing. The methodology is supported by a testing framework, called TASSA, that consists of several tools for design-time testing of business processes described according to the Web Service Business Process Execution Language (WS-BPEL) standard. The framework follows the Service-Oriented Architecture (SOA) principles and is validated on sample business process scenarios.
  • Comparison of Approaches to Prioritized Test Generation for Combinatorial Interaction Testing
    38 classification tree method, prioritized test case generation, test suite optimization Peter M. Kruse, Ina Schieferdecker, pages 1323 – 1330. Show abstract Abstract. Due to limited test resources, it is often necessary to prioritize and select test cases for a given system under test. Although test case prioritization is well studied and understood, its combination with test data generation is difficult and not yet completely solved. For example, the Classification Tree Method is a well-established method for test data generation; however, the application of prioritization techniques to it is a current research topic. We present an extension of the classification tree method that allows the generation of optimized test suites, containing test cases ordered according to their importance with respect to the test goals. The presented algorithms are incorporated into the Classification Tree Editor and empirically evaluated on a set of benchmarks.
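    The flavor of prioritized combinatorial test generation can be shown with a greedy weighted-pair heuristic. This is a minimal sketch under assumed parameters and weights, not the classification-tree algorithm presented in the paper.

      # Minimal sketch: greedily order test cases so that high-weight
      # parameter-value pairs are covered as early as possible.
      from itertools import combinations, product

      params = {"os": ["ios", "android"], "net": ["3g", "wifi"], "lang": ["en", "de"]}
      weight = {("os", "ios"): 3, ("net", "wifi"): 2}   # assumed importance

      def pairs(test):
          return set(combinations(sorted(test.items()), 2))

      def pair_weight(pair):
          return sum(weight.get(pv, 1) for pv in pair)

      candidates = [dict(zip(params, vals)) for vals in product(*params.values())]
      uncovered = set().union(*(pairs(t) for t in candidates))
      suite = []
      while uncovered:                  # pick the test with the best marginal gain
          best = max(candidates,
                     key=lambda t: sum(pair_weight(q) for q in pairs(t) & uncovered))
          suite.append(best)
          uncovered -= pairs(best)
      print([tuple(t.values()) for t in suite])

    Running the whole candidate set in the resulting order covers the heaviest value pairs in the first few tests, which is the point of prioritization under a limited budget.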

2nd Workshop on Model Driven Approaches in System Development

  • Granulated Code Generation of Interfering Functionalities
    209 Model-Driven Engineering, metaprogramming, code generation, aspects Igor Gelfgat, Shmuel Tyszberowicz, Amiram Yehudai, pages 1333 – 1340. Show abstract Abstract. The Model-Driven Software Development approach is becoming widely used as powerful model-driven tools become available to developers. Its primary goals are portability, interoperability and reusability, achieved through architectural separation of concerns. Yet it is not suitable for modelling, and therefore for generating the code for, all the aspects handled in the development stage. As a result, the Model-Driven Software Development approach is not as widely used as it could be.
    In this paper we present a technique that extends the capabilities of Model-Driven Engineering with behavioral aspects, by modeling concerns and using them in code generation. Common concerns can be defined for design patterns, software infrastructures and other common aspects. Independent concerns can be effectively combined when applied to the same model element. Software architects are advised to apply common concerns to their system models and also to create system-specific concerns and apply them at the modeling stage. We call this enriching a model with concerns.
    With the help of a code definition for each concern, our tool automatically generates code for the enriched model. Thus, at the end of the modeling stage the developers have the structure of the code and all the glue code ready, so they only have to fill in the business logic in the manual implementation methods created for them. They will then maintain the enriched model rather than the code they would otherwise write manually.
  • Producing the Platform Independent Model of an Existing Web Application
    276 reverse engineering, agile approach, case study Igor Rožanc, Boštjan Slivnik, pages 1341 – 1348. Show abstract Abstract. A reverse engineering procedure for producing a platform independent model (PIM) of an existing Web application is presented through a case study. It focuses on extracting the domain knowledge built into the application, and thus produces the PIM while leaving the hypertext and presentation models aside. It is especially geared towards reverse engineering of applications produced with agile software development methodologies, where documentation is scarce; as it assumes that the activity diagrams are in large part produced and refined manually, it is particularly useful in environments where at least some developers of the original agile team are still available. Rather than being the result of theoretical work, the method crystallized during extensive practical work. As such, it is aimed at practitioners and, in keeping with the spirit of its formulation, it is presented as a case study in which it has been applied.
  • Using structured grammar domain models to capture software system essence
    112 domain engineering, application logic recovery, requirements modelling, metamodels, essential complexity Michal Smialek, Albert Ambroziewicz, Wiktor Nowakowski, Tomasz Straszak, Jacek Bojarski, pages 1349 – 1356. Show abstract Abstract. Creation of a precise domain vocabulary is crucial for capturing the essence of any software system, whether recovering knowledge from a legacy system or formulating requirements for a new one. Software specifications usually maintain noun notions and include them in central vocabularies. Verb or adjective phrases are easily forgotten, their definitions buried inside imprecise paragraphs of text. This paper proposes a model-based language for the comprehensive treatment of domain knowledge, expressed through constrained natural language phrases that are grouped by nouns and include verbs, adjectives and prepositions. In this language, vocabularies can be formulated to describe the behavioural, declarative and conditional characteristics of a given problem domain. Importantly, these characteristics can be used (linked) from within other specifications, similarly to a wiki. In particular, the application logic can be formulated through sequences of imperative Subject-Predicate sentences containing only links to the phrases in the vocabulary. The paper presents an initial tooling framework to capture application logic specifications and make them available for further automated transformations.
  • Transformation of Special Multiplicity Constraints—Comparison of Possible Realizations
    138 MDD, UML, OCL, transformation, realization, multiplicity constraints, experiments Zdenek Rybola, Karel Richta, pages 1357 – 1364. Show abstract Abstract. This paper deals with the transformation of a binary relationship from a Platform Independent Model (PIM) to a Platform Specific Model (PSM) for a relational database, from the point of view of Model Driven Development (MDD). The paper summarizes the transformation of a binary relationship with multiplicity constraints and focuses on problems of the current approach to such transformations. In particular, we define a special OCL constraint for source-entity partiality that is usually omitted during the transformation. This constraint is easily transformable to the PSM and can be transformed automatically along with the model. We suggest three possible realizations of the defined constraint. We also present experiments demonstrating that our approach with the special constraint is equivalent in execution time to the current approach to realizing a binary relationship in a relational database, while providing better consistency control. Finally, we generalize the proposed solution to special multiplicity values used in the PIM that cannot be enforced by the foreign key mechanism.
  • Using Action Reports for Testing Meta-models, Models, Generators and Target Interpreter in Domain-Specific Modeling
    143 Model transformations, Document Engineering, Model testing Verislav Djukić, Ivan Luković, Aleksandar Popović, Vladimir Ivančević, pages 1365 – 1372. Show abstract Abstract. In this paper, we present an approach to testing models and generated code, as well as the target interpreter, by using modeling tools and model transformation languages. When compared to the existing Model Driven Development (MDD) approaches and tools supporting Domain Specific Modeling (DSM), the contributions of our research include: (i) the introduction of action reports, which allow semantic actions on the elements of a graphical modeling interface; (ii) the creation of recommendations and an interface for integrating modeling tools with other applications; and (iii) the construction of a language for describing the structure of arbitrary user controls, as well as of a component for embedding such controls into modeling and meta-modeling tools. The basic idea behind the approach is to use a transformation language to construct complex objects and applications, as well as to specify operations on complex objects and the interface. In this manner, we not only generate target platform code from the selected domain-specific graphical language (DSGL) models but also directly use these models and the appropriate tools as client applications. The applicability of action reports is demonstrated in examples concerning the validation of document models and their generators.
  • SEA_L: A Domain-specific Language for Semantic Web enabled Multi-agent Systems
    206 Domain-specific Languages; Metamodel; Multi-agent Systems; Semantic Web Sebla Demirkol, Moharram Challenger, Sinem Getir, Tomaž Kosar, Geylani Kardaş, Marjan Mernik, pages 1373 – 1380. Show abstract Abstract. The autonomous, reactive and proactive features of software agents make the development of agent-based software systems complex. A Domain-specific Language (DSL) can provide the required abstraction and hence support a more fruitful methodology for the development of Multi-agent Systems (MASs), especially those working in new and challenging environments such as the Semantic Web. Based on our previously introduced domain-specific metamodel, in this paper we propose a textual concrete syntax of a DSL for MASs working on the Semantic Web, and show how specifications written in this DSL can be utilized during code generation for such MASs. The new DSL is called Semantic web Enabled Agent Language (SEA_L). The syntax of SEA_L is supported by textual modeling toolkits developed with Xtext. The practical use of SEA_L is illustrated with a case study on the modeling of a multi-agent based e-barter system.
  • One approach to partial formalization of SOA design patterns using production rules
    202 service oriented architecture, design pattern, design pattern formalization, domain specific modeling, model driven development, jboss drools Roman Šelmeci, Viera Rozinajová, pages 1381 – 1384. Show abstract Abstract. Service-oriented architecture (SOA) is nowadays one of the dominant styles for developing new information systems. These information systems often have complex models, which can contain mistakes. In order to minimize them, patterns can be used as components of software development in Model Driven Development or Domain Specific Modeling. Design patterns for SOA have been identified by T. Erl. However, they are represented in a form suitable for humans, not for computers. Although design patterns can bring significant benefits, this form of representation restricts their application in the software development process. For machine processing, a formal representation of the patterns would be advantageous. In this paper we present our approach to a partial formal representation of SOA design patterns using production rules, with which we attempt to support effective system design based on SOA principles. We verified our approach by implementing a prototype using the JBoss Drools system.
  • Applied Metamodelling to Collaborative Document Authoring
    335 collaborative authoring; multi-structured document; metamodelling; domain specific language Anna Kocurova, Samia Oussena, Tony Clark, pages 1385 – 1390. Show abstract Abstract. This paper describes a domain-specific language tailored to collaborative document authoring processes. The language can support communication between content management systems and user interfaces in collaborative web applications. It allows dynamic rendering of user interfaces based on a collaboration model specified by end users. The construction of the language is supported by a metamodel. We demonstrate the use of the proposed language by implementing a simple document authoring system.
  • An IFRS Compliant Balance Sheet Metamodel
    250 Financial Service, Balance Sheet, Model Driven Architecture, Tableau Model Nenad Krdzavac, Rafiqul Haque, Tom Butler, pages 1391 – 1395. Show abstract Abstract. This paper proposes a balance sheet metamodel based on the Model Driven Architecture (MDA) methodology. To do so, we use the International Financial Reporting Standards (IFRS) as a starting point, owing to their wide adoption across countries in preparing balance sheets. The balance sheet is a summary of the financial position of a financial entity such as a credit institution. There are two reasons for applying MDA to the development of a balance sheet metamodel. The first is to automate the transfer and sharing of knowledge about the capital adequacy regulations for credit institutions. The second is to make a clear distinction between conceptual and concrete modeling of those regulations.
  • Modeling of Multiversion Concurrency Control System Using Event-B
    379 Database system, Formal Method, Transaction, Event-B, Verification, Multiversion. Raghuraj Singh Suryavanshi, Divakar Yadav, pages 1397 – 1401. Show abstract Abstract. Concurrency control in a database system involves controlling the relative order of conflicting operations, thereby ensuring database consistency. Multiversion concurrency control is a timestamp-based protocol that can be used to schedule operations so as to maintain the consistency of the database. In this protocol each write on a data item produces a new copy (or version) of that data item while retaining the old version. A systematic approach to specification is essential for the production of any substantial system description. Formal methods are mathematical techniques that provide a systematic approach to building and verifying models; we have used Event-B as the formal technique for the construction of our model.
    Event-B provides a complete framework through the rigorous description of the problem at an abstract level and the discharge of proof obligations arising from consistency checking. In this paper, we outline the formal construction of a model of a multiversion concurrency control scheme for database transactions using Event-B.
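    The multiversion rule described in the abstract above, where every write creates a new timestamped version and a read returns the newest version not younger than the reading transaction, can be illustrated outside Event-B. The following is a minimal Python sketch of that versioning rule only; the Event-B machine, its invariants and proof obligations are not modeled.

      # Minimal sketch: multiversion reads and writes under timestamp ordering.
      import itertools

      _ts = itertools.count(1)                 # global logical clock

      class MVStore:
          def __init__(self):
              self.versions = {}               # key -> sorted list of (ts, value)

          def begin(self):
              return next(_ts)                 # transaction timestamp

          def write(self, txn_ts, key, value):
              self.versions.setdefault(key, []).append((txn_ts, value))
              self.versions[key].sort()        # keep versions ordered by ts

          def read(self, txn_ts, key):
              visible = [v for ts, v in self.versions.get(key, []) if ts <= txn_ts]
              return visible[-1] if visible else None

      db = MVStore()
      t1 = db.begin()
      db.write(t1, "x", "v1")
      t2 = db.begin()                          # t2 starts after t1's write
      t3 = db.begin()
      db.write(t3, "x", "v2")                  # a later writer adds a version
      print(db.read(t2, "x"))                  # t2 still sees "v1"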

The 4th International Symposium on Web Services

  • A Web Service-Based Framework for an Online 3D Model Viewer
    200 Web Services and Systems, Building Information Systems, Performance and Scalability Hany ElYamany, Mohamed Elawady, Emad Abdelhay, Ahmed Khalil, pages 1405 – 1410. Show abstract Abstract. The Building Information Model (BIM) is an IT methodology for constructing a facility virtually and in detail. A digital representation is essential to facilitate information sharing and exchange among multiple contractors. Industry Foundation Classes (IFC) are utilized to embed the shared data in a standard XML structure, improving interoperability among all the participants in the facility construction process. A Web Service (WS) is an XML-based component that encapsulates a business process and transfers its associated data safely, concurrently and at low cost among several parties over the web. In this paper, a web service-based framework is introduced to enhance the performance of IFC element exchange and accessibility over the web. The framework employs multiple web service replicas in order to obtain a fast, real-time 3D view: each service replica retrieves a specific section of the IFC information from the database backend and draws its corresponding object. Experimental evaluation shows that the framework effectively reduces the time required to render the overall 3D view for the participant.
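    As an illustration of the replica idea in the viewer framework above, the following is a minimal sketch that fetches disjoint sections of an IFC model from several service replicas in parallel and merges them. The endpoint URLs and the section-per-replica split are hypothetical, not the paper's deployment.

      # Minimal sketch: fetch IFC sections from web service replicas in parallel.
      from concurrent.futures import ThreadPoolExecutor
      from urllib.request import urlopen

      REPLICAS = [                             # hypothetical replica endpoints
          "http://replica1.example.com/ifc?section=walls",
          "http://replica2.example.com/ifc?section=slabs",
          "http://replica3.example.com/ifc?section=openings",
      ]

      def fetch(url):
          with urlopen(url, timeout=10) as resp:
              return resp.read()               # one section of the IFC payload

      def load_model():
          with ThreadPoolExecutor(max_workers=len(REPLICAS)) as pool:
              sections = list(pool.map(fetch, REPLICAS))
          return b"".join(sections)            # merged payload for the 3D viewer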
TeXnical Editor: Aleksander Denisiuk
Phone/fax: +48-55-2393802