Citation: Proceedings of the 2020 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 21, pages 427–430 (2020)
Abstract. Prediction models are widely applied in many fields. In this study we discuss the use of a recurrent neural network (RNN) as a predictor of future graduates' salaries. The model is based on a feature analysis that determines the input values of the predictor. We analyzed several model compositions and approaches, and selected the recurrent neural network as the most accurate. The presented results confirm this selection and show high precision.
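As a minimal sketch of the idea described above (not the authors' implementation), a recurrent predictor can consume a graduate's sequence of feature records and emit a salary estimate from the final hidden state. The feature set, dimensions, and weights below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: 4 input features per time step
# (e.g. grade average, internships, field code, region code), 8 hidden units.
n_in, n_hid = 4, 8

# Randomly initialized weights stand in for trained parameters.
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden -> hidden recurrence
w_hy = rng.normal(scale=0.1, size=n_hid)           # hidden -> salary output

def predict_salary(sequence):
    """Run an Elman-style RNN over a sequence of feature vectors and
    return a scalar salary estimate from the final hidden state."""
    h = np.zeros(n_hid)
    for x in sequence:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
    return float(w_hy @ h)

# One student's synthetic records over three semesters.
student = [np.array([4.0, 0.0, 1.0, 2.0]),
           np.array([4.2, 1.0, 1.0, 2.0]),
           np.array([4.5, 1.0, 1.0, 2.0])]
estimate = predict_salary(student)
print(estimate)
```

In practice the recurrence would be an LSTM or GRU cell trained by gradient descent on historical salary data; the fixed random weights here only demonstrate the data flow.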