Citation: Proceedings of the 2017 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 11, pages 149–152 (2017)
Abstract. This paper introduces an ensemble model that solves the binary classification problem by combining basic logistic regression with two recent advanced paradigms: extreme gradient boosted decision trees (xgboost) and deep learning. To obtain the best result when integrating the sub-models, we introduce a method for splitting and selecting feature sets for sub-model training. In addition to the ensemble model, we propose a flexible, robust, and highly scalable new scheme for building a composite classifier that simultaneously applies multiple layers of model decomposition and output aggregation to maximally reduce both the bias and the variance (spread) components of the classification error. We demonstrate the power of our ensemble model on the problem of predicting the outcome of Hearthstone, a turn-based computer game, from game state information. The excellent predictive performance of our model is confirmed by its second place in the final ranking among 188 competing teams.
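The two ideas the abstract combines can be illustrated with a minimal sketch (not the authors' actual pipeline): each sub-model sees only its assigned slice of the feature vector, and the sub-models' probability outputs are aggregated into one binary prediction. The sub-models below are hypothetical stand-in callables; in the paper they are logistic regression, xgboost, and a deep network, and the feature slices and weights shown here are illustrative assumptions.

```python
def split_features(row, feature_slices):
    """Give each sub-model only its assigned subset of the feature vector."""
    return [[row[i] for i in idx] for idx in feature_slices]

def ensemble_predict(row, sub_models, feature_slices, weights):
    """Weighted average of sub-model probabilities, thresholded at 0.5."""
    views = split_features(row, feature_slices)
    probs = [model(view) for model, view in zip(sub_models, views)]
    score = sum(w * p for w, p in zip(weights, probs)) / sum(weights)
    return score, int(score >= 0.5)

# Stand-in sub-models: each maps its feature view to a win probability.
logreg  = lambda view: 0.8   # e.g. logistic regression on slice 0
booster = lambda view: 0.6   # e.g. xgboost on slice 1
deepnet = lambda view: 0.4   # e.g. deep network on slice 2

row = [0.1, 0.2, 0.3, 0.4]
slices = [[0, 1], [1, 2], [2, 3]]   # possibly overlapping feature subsets
score, label = ensemble_predict(
    row, [logreg, booster, deepnet], slices, weights=[1, 1, 1])
```

With equal weights the score is the plain mean of the three probabilities (here 0.6, so the predicted label is 1); unequal weights let the aggregation layer favor the stronger sub-model, which is the knob the composite scheme tunes at each aggregation layer.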
- Hearthstone, http://us.battle.net/hearthstone/en/
- AAIA’17 Data Mining Challenge: Helping AI to Play Hearthstone, https://knowledgepit.fedcsis.org/contest/view.php?id=120.
- D.R. Cox, "The regression analysis of binary sequences (with discussion)," J. Roy. Stat. Soc. B, vol. 20, pp. 215–242, 1958.
- C.R. Boyd, M.A. Tolson, and W.S. Copes, "Evaluating trauma care: The TRISS method. Trauma Score and the Injury Severity Score," The Journal of trauma, vol. 27, no. 4, pp. 370–378, 1987.
- F.E. Harrell, Regression Modeling Strategies, Springer-Verlag, ISBN 0-387-95232-2, 2001.
- M. Strano and B.M. Colosimo, "Logistic regression analysis for experimental determination of forming limit diagrams," International Journal of Machine Tools and Manufacture, vol. 46, no. 6, pp. 673–682, 2006.
- M.J.A. Berry, Data Mining Techniques for Marketing, Sales and Customer Support, Wiley, p. 10, 1997.
- L. Breiman, "Bagging predictors," Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
- L. Mason, J. Baxter, P.L. Bartlett, and M. Frean, "Boosting Algorithms as Gradient Descent," in S.A. Solla, T.K. Leen, and K.-R. Müller (eds.), Advances in Neural Information Processing Systems 12, MIT Press, pp. 512–518, 1999.
- J.H. Friedman, "Greedy function approximation: A gradient boosting machine," Ann. Statist., vol. 29, no. 5, pp. 1189–1232, 2001.
- XGBoost, https://github.com/dmlc/xgboost/.
- XGBoost: Machine Learning Challenge Winning Solutions, https://github.com/dmlc/xgboost/tree/master/demo#machine-learning-challenge-winning-solutions, retrieved 2016-08-01.
- L. Deng, “Three Classes of Deep Learning Architectures and Their Applications: A Tutorial Survey,” APSIPA Transactions on Signal and Information Processing, 2012.