Shallow, Deep, Ensemble models for Network Device Workload Forecasting
Cenru Liu
DOI: http://dx.doi.org/10.15439/2020F137
Citation: Proceedings of the 2020 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 21, pages 101–104 (2020)
Abstract. Reliable prediction of workload-related characteristics of monitored devices is important and helpful for infrastructure capacity management. This paper presents three machine learning models of different complexity (shallow, deep, and ensemble) for network device workload forecasting. The performance of these models has been compared using the data provided in the FedCSIS'20 Challenge. The R2 scores achieved by the cascade Support Vector Regression (SVR) based shallow model, the Long Short-Term Memory (LSTM) based deep model, and the hierarchical linear weighted ensemble model are 0.2506, 0.2831, and 0.3059, respectively, and this submission was ranked 3rd in the preliminary stage of the challenge.
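The abstract compares the three models by their R2 (coefficient of determination) scores and names a linear weighted ensemble as the best performer. As a minimal sketch of both ideas, assuming NumPy and illustrative (not the paper's) ensemble weights:

```python
import numpy as np

def r2_score(y_true, y_pred):
    # R^2 = 1 - SS_res / SS_tot, the metric used to rank the models.
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

def weighted_ensemble(preds, weights):
    # A linear weighted ensemble: combine per-model forecasts with fixed
    # weights. The weights here are hypothetical placeholders, not the
    # hierarchical weights learned in the paper.
    preds = np.asarray(preds, dtype=float)   # shape: (n_models, n_samples)
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * preds).sum(axis=0) / w.sum()
```

With equal weights the ensemble reduces to the mean of the individual model forecasts; unequal weights let a better-scoring model (e.g. the LSTM) dominate the combined prediction.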