Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 21

Proceedings of the 2020 Federated Conference on Computer Science and Information Systems

Future Graduate Salaries Prediction Model Based On Recurrent Neural Network


DOI: http://dx.doi.org/10.15439/2020F52

Citation: Proceedings of the 2020 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 21, pages 427–430 (2020)


Abstract. Prediction models are widely applied in many fields. In this study we discuss the use of a Recurrent Neural Network as a predictor of future graduates' salaries. The model is based on a feature analysis that determines the input values of the predictor. We analyzed several architectures and ideas and selected the Recurrent Neural Network as the most accurate. The presented results confirm this selection and show high precision.
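To make the approach concrete, below is a minimal, hypothetical sketch of such a predictor written in Python with Keras. It is not the authors' implementation: the synthetic data, the eight-feature input, the layer sizes, and the Nadam optimizer are illustrative assumptions; the abstract only states that a feature analysis supplies the input values of a Recurrent Neural Network.

    # Hypothetical sketch of an RNN-based graduate-salary regressor (illustration only).
    import numpy as np
    from tensorflow.keras import layers, models

    rng = np.random.default_rng(0)

    # Placeholder data: 1000 graduates, 8 numeric features (grades, internships, etc.).
    # Real feature names and values would come from the feature analysis step.
    X = rng.normal(size=(1000, 8)).astype("float32")
    y = (40000 + 5000 * X[:, 0] + 3000 * X[:, 1]
         + rng.normal(scale=2000, size=1000)).astype("float32")

    # Treat each feature vector as a length-8 sequence of scalars so the LSTM can scan it.
    X_seq = X.reshape((-1, 8, 1))

    model = models.Sequential([
        layers.Input(shape=(8, 1)),
        layers.LSTM(64),                      # recurrent layer
        layers.Dense(32, activation="relu"),
        layers.Dense(1),                      # predicted salary
    ])
    model.compile(optimizer="nadam", loss="mse", metrics=["mae"])
    model.fit(X_seq, y, validation_split=0.2, epochs=10, batch_size=32, verbose=0)

    print(model.predict(X_seq[:3], verbose=0).ravel())  # example predictions

Treating a static feature vector as a short sequence is one common way to feed tabular data into a recurrent layer; an alternative would be a sequence of per-year records for each graduate, which the abstract does not specify.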
