Annals of Computer Science and Information Systems, Volume 15

Proceedings of the 2018 Federated Conference on Computer Science and Information Systems

Regression Networks for Robust Win-rates Predictions of AI Gaming Bots


DOI: http://dx.doi.org/10.15439/2018F364

Citation: Proceedings of the 2018 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 15, pages 181–184.


Abstract. Designing a robust and adaptable Artificial Intelligence (AI) opponent in a computer game would ensure that the game continues to challenge, immerse and excite the players at any stage. The outcomes of card-based games such as "Hearthstone: Heroes of Warcraft", aside from player skill, depend heavily on the initial composition of the player card decks. To evaluate this impact we have developed a new robust regression network in the context of the AAIA Data Mining Competition 2018, which predicts the average win-rates of specific combinations of bot player and card deck. Our network is composed of two levels: an entry level with an array of finely optimized, state-of-the-art regression models, including Extreme Learning Machines (ELM), Extreme Gradient Boosting decision trees (XGBoost), and Least Absolute Shrinkage and Selection Operator (LASSO) regression, trained via supervised learning on the labeled training dataset; and a single ELM at the second level, installed to learn to correct the predictions from the first level. The final solution achieved a root mean squared error (RMSE) of just 5.65% and took 2nd place in the AAIA'2018 competition. This paper also presents two other runner-up models with RMSEs of 5.7% and 5.86%, which took 4th and 6th place respectively.
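
As an illustration of the two-level scheme described in the abstract, the following Python sketch stacks XGBoost, LASSO, and a minimal ELM at the first level and fits a second-level ELM to correct their predictions. The ELMRegressor implementation, all hyperparameters, and the synthetic win-rate data are assumptions for demonstration only; they do not reproduce the authors' exact configuration or the competition dataset.

```python
# Hedged sketch of a two-level stacked regression network (not the authors' code).
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor


class ELMRegressor:
    """Minimal Extreme Learning Machine: a random, untrained hidden layer
    followed by a single least-squares fit of the output weights."""

    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        # Hidden-layer weights are drawn once at random and never updated.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        # Output weights come from one least-squares solve.
        self.beta, *_ = np.linalg.lstsq(self._hidden(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta


def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


# Synthetic stand-in for the competition data: each row describes a
# bot-player/card-deck combination; the target is an average win-rate in [0, 1].
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 30))
y = 1.0 / (1.0 + np.exp(-X[:, :5].sum(axis=1))) + rng.normal(scale=0.05, size=2000)

# Three-way split: level-1 training, level-2 (meta) training, final test.
X_l1, X_rest, y_l1, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_meta, X_test, y_meta, y_test = train_test_split(X_rest, y_rest, test_size=0.5,
                                                  random_state=0)

# Level 1: an array of independently trained regression models.
level1 = [
    XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05),
    Lasso(alpha=1e-3),
    ELMRegressor(n_hidden=300, seed=1),
]
for model in level1:
    model.fit(X_l1, y_l1)


def stack(X_):
    """Level-1 predictions side by side, used as level-2 input features."""
    return np.column_stack([m.predict(X_) for m in level1])


# Level 2: a single ELM learns to correct the level-1 predictions. It is
# fitted on a split the level-1 models never saw, so it observes their
# realistic out-of-sample errors rather than their training-set fit.
level2 = ELMRegressor(n_hidden=50, seed=2).fit(stack(X_meta), y_meta)

print(f"stacked RMSE on the test split: {rmse(y_test, level2.predict(stack(X_test))):.4f}")
```

Fitting the second level on a held-out split is the key design choice in such stacks: training it on the same data as the first level would let it learn to trust overfitted level-1 predictions.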
