Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 15

Proceedings of the 2018 Federated Conference on Computer Science and Information Systems

Kestrel-based Search Algorithm (KSA) and Long Short Term Memory (LSTM) Network for feature selection in classification of high-dimensional bioinformatics datasets


DOI: http://dx.doi.org/10.15439/2018F52

Citation: Proceedings of the 2018 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 15, pages 1520


Abstract. Although deep learning methods have been applied to feature selection in classification problems, the accuracy with which current methods learn the classification parameters can vary from one time interval to the next, potentially leading to inaccurate classification of features. To address this challenge, this study proposes an approach that learns these parameters by using two aspects of Kestrel bird behaviour to adjust the learning rate until the optimal parameter value is found: random encircling from a hovering position, and learning through imitation of the well-adapted behaviour of other Kestrels. The proposed bio-inspired approach is integrated with a deep learning method, namely a recurrent neural network with a long short-term memory (LSTM) network. A benchmark dataset with continuous data attributes was chosen to test the proposed search algorithm. The results show that KSA is comparable to BAT, ACO and PSO, as the test statistics (Wilcoxon signed-rank test) show no statistically significant difference between the mean classification accuracies at a significance level of 0.05. However, when KSA is compared with WSA-MP, there is a statistically significant difference between the mean classification accuracies.
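To make the idea concrete, the sketch below is a minimal, hypothetical Python illustration of a KSA-style learning-rate search; it is not the authors' implementation. Each "kestrel" carries a candidate learning rate that is updated by (a) random encircling around its current position and (b) imitation of the best-adapted kestrel. The fitness function here is a stand-in for the validation accuracy of the LSTM classifier, and names such as encircle_radius and imitation_rate are assumptions introduced only for illustration.

```python
# Hypothetical sketch of a KSA-style learning-rate search (NOT the paper's exact formulation).
import random

def fitness(learning_rate):
    """Placeholder fitness: in the paper this role would be played by the
    validation accuracy of the LSTM trained with the candidate learning rate."""
    target = 0.01                      # assumed 'good' learning rate, for illustration only
    return -abs(learning_rate - target)

def ksa_learning_rate(n_kestrels=10, iterations=50,
                      lr_min=1e-4, lr_max=1e-1,
                      encircle_radius=0.01, imitation_rate=0.5):
    # Initialise a flock of candidate learning rates
    kestrels = [random.uniform(lr_min, lr_max) for _ in range(n_kestrels)]
    best = max(kestrels, key=fitness)
    for _ in range(iterations):
        for i, lr in enumerate(kestrels):
            # (a) random encircling: perturb the candidate around its hovering position
            candidate = lr + random.uniform(-encircle_radius, encircle_radius)
            # (b) imitation: move part of the way toward the best-adapted kestrel
            candidate += imitation_rate * (best - candidate)
            candidate = min(max(candidate, lr_min), lr_max)  # keep within bounds
            if fitness(candidate) > fitness(lr):             # accept only improvements
                kestrels[i] = candidate
        best = max(kestrels + [best], key=fitness)
    return best

if __name__ == "__main__":
    print("selected learning rate:", ksa_learning_rate())
```

The evaluation described in the abstract compares per-dataset classification accuracies with the Wilcoxon signed-rank test at a 0.05 significance level. A comparison of that kind can be run with SciPy as below; the accuracy values are made up for illustration.

```python
# Paired comparison of two methods' accuracies with the Wilcoxon signed-rank test.
from scipy.stats import wilcoxon

ksa_acc = [0.912, 0.894, 0.931, 0.905, 0.922, 0.883, 0.941, 0.902]  # illustrative values
pso_acc = [0.905, 0.889, 0.924, 0.911, 0.915, 0.887, 0.934, 0.896]  # illustrative values

stat, p = wilcoxon(ksa_acc, pso_acc)
print("p-value:", p, "-> significant at 0.05" if p < 0.05 else "-> not significant at 0.05")
```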

References

  1. Longbottom, C. and Bamforth, R. (2013), “Optimising the data warehouse: Dealing with large volumes of mixed data to give better business insights.” Quocirca.
  2. Dash, M. and Liu, H. (1997), “Feature selection for classification.” Intelligent Data Analysis, vol. 1, pp. 131–156.
  3. Kumar, V. and Minz, S. (2014), “Feature selection: A literature review.” Smart Computing Review, vol. 4, no. 3.
  4. Lin, C.-J. (2006), “Support vector machines: Status and challenges.” Available at: https://www.csie.ntu.edu.tw/~cjlin/talks/caltech.pdf
  5. Deng, L. and Yu, D. (2013), “Deep learning: Methods and applications.” Foundations and Trends in Signal Processing, vol. 7, nos. 3–4, pp. 197–387.
  6. Deng, L. (2013), “Three classes of deep learning architectures and their applications: A tutorial survey.” research.microsoft.com.
  7. Marcus, G. (2018), “Deep learning: A critical appraisal.” https://arxiv.org/abs/1801.00631
  8. Patel, A. B., Nguyen, T. and Baraniuk, R. G. (2015), “A probabilistic theory of deep learning.”
  9. Deng, L. (2012), “Three classes of deep learning architectures and their applications: A tutorial survey.”
  10. LeCun, Y., Bengio, Y. and Hinton, G. (2015), “Deep learning.” Nature, vol. 521.
  11. Li, J., Fong, S., Wong, R. K., Millham, R. and Wong, K. K. L. (2017), “Elitist binary wolf search algorithm for heuristic feature selection in high-dimensional bioinformatics datasets.”
  12. Agbehadji, I. E. (2011), “Solution to the travelling salesman problem, using omicron genetic algorithm. Case study: Tour of national health insurance schemes in the Brong Ahafo region of Ghana.” Master’s thesis, KNUST, Accra-Ghana.
  13. Dorigo, M. and Gambardella, L. M. (1997), “Ant colony system: A cooperative learning approach to the traveling salesman problem.” IEEE Trans. Evol. Comput., vol. 1, no. 1, pp. 53–66.
  14. Kennedy, J. and Eberhart, R. C. (1995), “Particle swarm optimization.” Proc. of IEEE International Conference on Neural Networks, Piscataway, NJ, pp. 1942–1948.
  15. Yang, X.-S. (2010), “A new metaheuristic bat-inspired algorithm.” Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), pp. 65–74.
  16. Tang, R., Fong, S., Yang, X.-S. and Deb, S. (2012), “Wolf search algorithm with ephemeral memory.”
  17. Agbehadji, I. E., Millham, R. and Fong, S. (2016), “Wolf search algorithm for numeric association rule mining.” 2016 IEEE International Conference on Cloud Computing and Big Data Analysis (ICCCBDA 2016), Chengdu, China.
  18. Varland, D. E. (1991), “Behavior and ecology of post-fledging American Kestrels.” Retrospective Theses and Dissertations, Paper 9784.
  19. Vlachos, C., Bakaloudis, D., Chatzinikos, E., Papadopoulos, T. and Tsalagas, D. (2003), “Aerial hunting behaviour of the lesser Kestrel Falco naumanni during the breeding season in Thessaly (Greece).”
  20. Kumar, R. (2015), “Grey wolf optimizer (GWO).”
  21. Spencer, R. L. (2002), “Introduction to Matlab.”
  22. Cui, X., Gao, J. and Potok, T. E. (2006), “A flocking based algorithm for document clustering analysis.”
  23. Blum, A. L. and Langley, P. (1997), “Selection of relevant features and examples in machine learning.” Artificial Intelligence, vol. 97, pp. 245–271.
  24. Mafarja, M. and Mirjalili, S. (2018), “Whale optimization approaches for wrapper feature selection.” Applied Soft Computing.
  25. Batres-Estrada, G. (2015), “Deep learning for multivariate financial time series.”