
Position and Communication Papers of the 16th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 26

Towards Evolutionary Emergence


DOI: http://dx.doi.org/10.15439/2021F111

Citation: Position and Communication Papers of the 16th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 26, pages 55–60 (2021)


Abstract. Cyber-physical systems demand self-organizing algorithms that rely on emergent behavior: local observations and decisions aggregate into global behavior without explicitly programmed rules. Designing such algorithms is error-prone, and widely applicable design patterns are scarce. We therefore opt for a machine learning approach that automatically learns mechanisms for targeted emergent behavior, using Cartesian genetic programming. As an example demonstrating the general applicability of this idea, we trained a swarm-based heuristic and present first results showing that the learned swarm behavior performs significantly better than pure random search. We also discuss the pitfalls we encountered and the remaining challenges on the research agenda.
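The core machinery named in the abstract, Cartesian genetic programming, encodes a feed-forward computational graph as a fixed-length genome of function nodes and evolves it by mutation. The sketch below is a minimal, illustrative CGP implementation, not the paper's actual setup: the function set, node arity of two, single output gene, and the (1+4) evolution strategy are common CGP defaults assumed here for concreteness.

```python
import random

# Minimal CGP sketch (illustrative defaults, not the paper's configuration).
FUNCS = [
    ("add", lambda a, b: a + b),
    ("sub", lambda a, b: a - b),
    ("mul", lambda a, b: a * b),
    ("max", lambda a, b: max(a, b)),
]

def random_genome(n_inputs, n_nodes, rng):
    """Each node is (func_index, src1, src2); sources point only to
    earlier inputs/nodes, so the decoded graph is acyclic."""
    genome = []
    for i in range(n_nodes):
        addr = n_inputs + i  # addresses visible to this node
        genome.append((rng.randrange(len(FUNCS)),
                       rng.randrange(addr),
                       rng.randrange(addr)))
    out = rng.randrange(n_inputs + n_nodes)  # single output gene
    return genome, out

def evaluate(genome, out, inputs):
    """Decode the genome left to right and return the output node's value."""
    vals = list(inputs)
    for f, s1, s2 in genome:
        vals.append(FUNCS[f][1](vals[s1], vals[s2]))
    return vals[out]

def mutate(genome, out, n_inputs, rng, rate=0.1):
    """Point mutation: each gene is redrawn with probability `rate`."""
    new = []
    for i, (f, s1, s2) in enumerate(genome):
        addr = n_inputs + i
        if rng.random() < rate: f = rng.randrange(len(FUNCS))
        if rng.random() < rate: s1 = rng.randrange(addr)
        if rng.random() < rate: s2 = rng.randrange(addr)
        new.append((f, s1, s2))
    if rng.random() < rate:
        out = rng.randrange(n_inputs + len(genome))
    return new, out

def evolve(target, n_inputs=2, n_nodes=8, gens=200, seed=0):
    """(1+4)-ES: keep the best of the parent and four mutants; ties go
    to the offspring so neutral drift through redundant genes is possible."""
    rng = random.Random(seed)
    cases = [(float(a), float(b)) for a in range(-2, 3) for b in range(-2, 3)]
    def fitness(ind):
        g, o = ind
        return sum((evaluate(g, o, c) - target(*c)) ** 2 for c in cases)
    parent = random_genome(n_inputs, n_nodes, rng)
    best = fitness(parent)
    for _ in range(gens):
        for _ in range(4):
            child = mutate(parent[0], parent[1], n_inputs, rng)
            f = fitness(child)
            if f <= best:
                parent, best = child, f
    return parent, best
```

In the paper's setting, the evolved graph would stand in for an agent's local decision rule (e.g. a velocity update in a swarm heuristic) rather than a symbolic-regression target; the hypothetical `evolve(lambda a, b: a + b)` call in the test below merely exercises the loop.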
