
Proceedings of the 17th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 30

Predicting the Costs of Forwarding Contracts Using XGBoost and a Deep Neural Network


DOI: http://dx.doi.org/10.15439/2022F295

Citation: Proceedings of the 17th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 30, pages 425–429 (2022)


Abstract. This article presents an application of an XGBoost and deep neural network ensemble as a solution for a task assigned at the FedCSIS 2022 Challenge: Predicting the Costs of Forwarding Contracts. We demonstrate that prediction quality can be improved by combining the two approaches. We present a neural network architecture based on three independent flows. We then discuss the influence of long short-term memory units on the risk of overfitting. Finally, we show that the static XGBoost model can complement a neural network that processes dynamic data.
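The ensembling step described in the abstract can be sketched as a weighted blend of the two models' per-contract cost predictions. This is a minimal illustration only: the prediction values, the `ensemble` helper, and the equal weight `w=0.5` are assumptions for the sketch; the paper's actual models are trained on the challenge data, and a blending weight would normally be tuned on a validation split.

```python
import numpy as np

# Hypothetical per-contract cost predictions from the two models
# (placeholder values standing in for trained-model outputs).
xgb_pred = np.array([120.0, 340.0, 95.0])   # static XGBoost model
nn_pred = np.array([110.0, 360.0, 100.0])   # neural network on dynamic data


def ensemble(xgb_pred, nn_pred, w=0.5):
    """Blend the two models' predictions with a weighted average.

    The weight w is an assumption here; in practice it would be
    selected on held-out validation data.
    """
    return w * xgb_pred + (1.0 - w) * nn_pred


blended = ensemble(xgb_pred, nn_pred)  # → array([115. , 350. ,  97.5])
```

With equal weights this reduces to a simple average; giving the static XGBoost model and the dynamic-data network different weights lets each compensate for the other's weaknesses, which is the complementarity the abstract highlights.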
