Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 11

Proceedings of the 2017 Federated Conference on Computer Science and Information Systems

Evolving KERAS Architectures for Sensor Data Analysis

DOI: http://dx.doi.org/10.15439/2017F241

Citation: Proceedings of the 2017 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 11, pages 109–112

Abstract. Deep neural networks have recently attracted great interest and become the state-of-the-art methods in many fields of machine learning. Still, there is no easy way to choose a network architecture, even though this choice can significantly influence the network's performance. This work is a first step towards automatic architecture design. We propose a genetic algorithm for the optimization of a network architecture. The algorithm is inspired by and designed directly for the Keras library, one of the most common implementations of deep neural networks. The target application is the prediction of air pollution based on sensor measurements. The proposed algorithm is evaluated in experiments on sensor data and compared to several fixed architectures and to support vector regression.
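The abstract describes a genetic algorithm whose individuals encode network architectures for the Keras library. As an illustration of the general approach (not the paper's actual GAKeras implementation), the sketch below evolves feed-forward architectures encoded as lists of hidden-layer widths. In the real system each individual would be compiled into a Keras model and its fitness measured by validation error on the sensor data; here a hypothetical stand-in fitness is used so the loop runs without training. The encoding, operators, and the target value of 160 total units are all illustrative assumptions.

```python
import random

# Illustrative GA over feed-forward architectures. An individual is a
# non-empty list of hidden-layer widths, e.g. [64, 32]. In a real setup
# the fitness would train and validate a Keras model built from the list.

SIZES = (16, 32, 64, 128)  # assumed set of allowed layer widths

def random_individual(rng, max_layers=4):
    """Sample a random architecture: 1 to max_layers hidden layers."""
    return [rng.choice(SIZES) for _ in range(rng.randint(1, max_layers))]

def fitness(ind, target=160):
    """Stand-in for validation performance (higher is better):
    penalize deviation of the total unit count from an assumed optimum."""
    return -abs(sum(ind) - target)

def crossover(rng, a, b):
    """One-point crossover on the layer lists (lengths may differ)."""
    child = a[: rng.randint(0, len(a))] + b[rng.randint(0, len(b)):]
    return child or [rng.choice(SIZES)]  # guard against an empty child

def mutate(rng, ind, p=0.2):
    """Resample each layer width independently with probability p."""
    return [rng.choice(SIZES) if rng.random() < p else w for w in ind]

def evolve(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.choice(elite), rng.choice(elite)
            children.append(mutate(rng, crossover(rng, a, b)))
        pop = elite + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best architecture:", best, "total units:", sum(best))
```

Swapping the stand-in fitness for one that builds a `keras.Sequential` model from the width list, trains it, and returns negative validation loss turns this sketch into the architecture search the paper describes; the GA loop itself is unchanged.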

References

  1. F. Chollet, “Keras,” https://github.com/fchollet/keras, 2015.
  2. I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016, http://www.deeplearningbook.org.
  3. Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, May 2015. http://dx.doi.org/10.1038/nature14539
  4. S. De Vito, E. Massera, M. Piga, L. Martinotto, and G. Di Francia, “On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario,” Sensors and Actuators B: Chemical, vol. 129, no. 2, pp. 750–757, 2008. http://dx.doi.org/10.1016/j.snb.2007.09.060
  5. S. De Vito, G. Fattoruso, M. Pardo, F. Tortorella, and G. Di Francia, “Semi-supervised learning techniques in artificial olfaction: A novel approach to classification problems and drift counteraction,” IEEE Sensors Journal, vol. 12, no. 11, pp. 3215–3224, Nov 2012. http://dx.doi.org/10.1109/JSEN.2012.2192425
  6. B. u. Islam, Z. Baharudin, M. Q. Raza, and P. Nallagownden, “Optimization of neural network architecture using genetic algorithm for load forecasting,” in 2014 5th International Conference on Intelligent and Advanced Systems (ICIAS), June 2014, pp. 1–6. http://dx.doi.org/10.1109/ICIAS.2014.6869528
  7. J. Arifovic and R. Gençay, “Using genetic algorithms to select architecture of a feedforward artificial neural network,” Physica A: Statistical Mechanics and its Applications, vol. 289, no. 3–4, pp. 574–594, 2001. http://dx.doi.org/10.1016/S0378-4371(00)00479-9
  8. K. O. Stanley and R. Miikkulainen, “Evolving neural networks through augmenting topologies,” Evolutionary Computation, vol. 10, no. 2, pp. 99–127, 2002. [Online]. Available: http://nn.cs.utexas.edu/?stanley:ec02
  9. K. O. Stanley, D. B. D’Ambrosio, and J. Gauci, “A hypercube-based encoding for evolving large-scale neural networks,” Artif. Life, vol. 15, no. 2, pp. 185–212, Apr. 2009. http://dx.doi.org/10.1162/artl.2009.15.2.15202
  10. F. Gomez, J. Schmidhuber, and R. Miikkulainen, “Accelerated neural evolution through cooperatively coevolved synapses,” Journal of Machine Learning Research, pp. 937–965, 2008. [Online]. Available: http://www.cs.utexas.edu/users/ai-lab/?gomez:jmlr08
  11. I. Loshchilov and F. Hutter, “CMA-ES for hyperparameter optimization of deep neural networks,” CoRR, vol. abs/1604.07269, 2016. [Online]. Available: http://arxiv.org/abs/1604.07269
  12. J. Koutník, J. Schmidhuber, and F. Gomez, “Evolving deep unsupervised convolutional networks for vision-based reinforcement learning,” in Proceedings of the 2014 Annual Conference on Genetic and Evolutionary Computation, ser. GECCO ’14. New York, NY, USA: ACM, 2014, pp. 541–548. ISBN 978-1-4503-2662-9. http://dx.doi.org/10.1145/2576768.2598358
  13. T. Salimans, J. Ho, X. Chen, and I. Sutskever, “Evolution Strategies as a Scalable Alternative to Reinforcement Learning,” ArXiv e-prints, Mar. 2017. [Online]. Available: https://arxiv.org/abs/1703.03864
  14. R. Miikkulainen, J. Z. Liang, E. Meyerson, A. Rawal, D. Fink, O. Francon, B. Raju, H. Shahrzad, A. Navruzyan, N. Duffy, and B. Hodjat, “Evolving deep neural networks,” CoRR, vol. abs/1703.00548, 2017. [Online]. Available: http://arxiv.org/abs/1703.00548
  15. O. E. David and I. Greental, “Genetic algorithms for evolving deep neural networks,” in Proceedings of the Companion Publication of the 2014 Annual Conference on Genetic and Evolutionary Computation, ser. GECCO Comp ’14. New York, NY, USA: ACM, 2014, pp. 1451–1452. ISBN 978-1-4503-2881-4. http://dx.doi.org/10.1145/2598394.2602287
  16. T. H. Maul, A. Bargiela, S.-Y. Chong, and A. S. Adamu, “Towards evolutionary deep neural networks,” in ECMS 2014 Proceedings, F. Squazzoni, F. Baronio, C. Archetti, and M. Castellani, Eds. European Council for Modeling and Simulation, 2014. http://dx.doi.org/10.7148/2014-0319
  17. M. Mitchell, An Introduction to Genetic Algorithms. Cambridge, MA: MIT Press, 1996.
  18. Z. Michalewicz, Genetic algorithms + data structures = evolution programs (3rd ed.). London, UK: Springer-Verlag, 1996. ISBN 3-540-60676-9
  19. P. Vidnerová, “GAKeras,” https://github.com/PetraVidnerova/GAKeras, 2017.
  20. F. Pedregosa et al., “Scikit-learn: Machine learning in Python,” Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.
  21. J. Roch, “Minos,” https://github.com/guybedo/minos, 2017.