
Proceedings of the 20th Conference on Computer Science and Intelligence Systems (FedCSIS)

Annals of Computer Science and Information Systems, Volume 43

Applying Evolutionary Techniques to Enhance Graph Convolutional Networks for Node Classification: Case Studies


DOI: http://dx.doi.org/10.15439/2025F0041

Citation: Proceedings of the 20th Conference on Computer Science and Intelligence Systems (FedCSIS), M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 43, pages 321–326


Abstract. In recent years, significant effort has gone into graph node classification using graph neural networks and label-propagation-based methods. Despite the progress these approaches have achieved, their success often hinges on complex architectures and algorithms, which can obscure crucial technical details. In designing artificial neural networks, a central innovation is the choice of a novel neural architecture. Architectures in use today have mostly been crafted manually by human experts, a time-consuming and error-prone process, which is why semi-automatic methods such as Neural Architecture Search (NAS) have become commonplace. This paper introduces and evaluates an evolutionary approach to designing graph convolutional neural networks for node classification. Our approach systematically defines the graph convolutional network parameter space, drawing on recent research into design principles. In doing so, it seeks a balance between satisfactory predictive performance and economical use of memory and computation, offering a more efficient alternative to conventional NAS methods.
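The evolutionary search over a GCN parameter space described above can be illustrated with a minimal sketch. This is not the paper's algorithm: the search space, operators, and fitness function below are illustrative assumptions (in the paper, fitness would be the validation accuracy of a trained graph convolutional network, and the parameter space is derived from the cited design-principle studies).

```python
import random

# Hypothetical GCN architecture search space (illustrative only; the paper's
# actual parameter space is defined from its design-principle analysis).
SEARCH_SPACE = {
    "num_layers": [1, 2, 3, 4],
    "hidden_dim": [16, 32, 64, 128],
    "dropout": [0.0, 0.3, 0.5],
    "activation": ["relu", "elu", "tanh"],
}

def random_individual(rng):
    """Sample one architecture configuration uniformly from the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(ind, rng, rate=0.3):
    """Resample each gene independently with probability `rate`."""
    return {k: (rng.choice(v) if rng.random() < rate else ind[k])
            for k, v in SEARCH_SPACE.items()}

def crossover(a, b, rng):
    """Uniform crossover: each gene is inherited from either parent."""
    return {k: (a[k] if rng.random() < 0.5 else b[k]) for k in SEARCH_SPACE}

def evolve(fitness, generations=10, pop_size=8, seed=0):
    """Simple elitist evolutionary loop: evaluate, keep the fittest half,
    refill the population with crossover + mutation. Because the top half
    always survives, the best fitness never decreases between generations."""
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append(mutate(crossover(a, b, rng), rng))
        pop = parents + children
    return max(pop, key=fitness)

# Placeholder fitness: prefers 2-layer, 64-unit configurations. A real run
# would train a GCN per candidate and return its validation accuracy.
def toy_fitness(ind):
    return -abs(ind["num_layers"] - 2) - abs(ind["hidden_dim"] - 64) / 64

best = evolve(toy_fitness)
```

Under these assumptions the loop converges toward shallow, mid-width configurations; swapping `toy_fitness` for an actual train-and-validate routine turns the sketch into a (slow) NAS baseline.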
