Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 18

Proceedings of the 2019 Federated Conference on Computer Science and Information Systems

Greedy Incremental Support Vector Regression


DOI: http://dx.doi.org/10.15439/2019F364

Citation: Proceedings of the 2019 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 18, page 79


Abstract. Support Vector Regression (SVR) is a powerful supervised machine learning model, especially well suited to normalized or binarized data. However, its quadratic complexity in the number of training examples makes it impractical to train on large datasets, particularly high-dimensional ones with frequent retraining requirements. We propose a simple two-stage greedy selection of training data for SVR that maximizes validation set accuracy with a minimum number of training examples, and we illustrate the performance of this strategy in the context of the Clash Royale Challenge 2019, which concerned efficient prediction of decks' win rates. Hundreds of thousands of labelled data examples were reduced to just hundreds, on which an optimized SVR was trained to maximize the validation R2 score. The proposed model scored first place in the Clash Royale 2019 challenge, outperforming over a hundred competing teams from around the world.
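The abstract only sketches the selection procedure, so the snippet below gives a minimal illustration of how a two-stage greedy growth of a small SVR training set against a validation R2 score might look. It assumes scikit-learn's SVR and a held-out validation split; the function name greedy_svr_selection, the candidate-batch scheme, the batch sizes, and the SVR hyper-parameters are illustrative assumptions, not the authors' actual implementation.

```python
# A minimal sketch of greedy incremental training-set selection for SVR.
# Assumes scikit-learn; pool/validation splits, batch sizes, and SVR
# hyper-parameters are illustrative, not the paper's exact configuration.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score


def greedy_svr_selection(X_pool, y_pool, X_val, y_val,
                         n_select=600, batch=10, n_candidates=20, seed=0):
    """Greedily grow a small training subset that maximizes validation R^2."""
    rng = np.random.default_rng(seed)
    selected = list(rng.choice(len(X_pool), size=batch, replace=False))
    remaining = [i for i in range(len(X_pool)) if i not in selected]
    best_r2 = -np.inf

    while len(selected) < n_select and remaining:
        # Stage 1: propose several random candidate batches from the pool.
        candidates = [rng.choice(remaining, size=min(batch, len(remaining)),
                                 replace=False) for _ in range(n_candidates)]
        # Stage 2: keep the batch that most improves validation R^2.
        best_batch, best_batch_r2 = None, best_r2
        for cand in candidates:
            idx = selected + list(cand)
            model = SVR(kernel='rbf', C=1.0, epsilon=0.01)
            model.fit(X_pool[idx], y_pool[idx])
            r2 = r2_score(y_val, model.predict(X_val))
            if r2 > best_batch_r2:
                best_batch, best_batch_r2 = list(cand), r2
        if best_batch is None:   # no candidate batch helps -> stop early
            break
        selected += best_batch
        remaining = [i for i in remaining if i not in best_batch]
        best_r2 = best_batch_r2

    return selected, best_r2
```

Usage would follow the pattern selected, r2 = greedy_svr_selection(X_pool, y_pool, X_val, y_val), after which the final SVR is refit on X_pool[selected]. Because each candidate batch requires retraining only on the small selected subset, the quadratic cost of SVR stays bounded by the subset size rather than the full dataset.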
