
Proceedings of the 17th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 30

Using gradient boosting trees to predict the costs of forwarding contracts


DOI: http://dx.doi.org/10.15439/2022F299

Citation: Proceedings of the 17th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds.). ACSIS, Vol. 30, pages 421–424 (2022)


Abstract. When selling goods abroad or bringing them into the country from foreign partners, the question of delivery arises. The division of delivery-related responsibilities between the manufacturer and the recipient varies from case to case, so it is often reasonable to use the services of a forwarding company. A forwarding contract is then concluded that specifies the details of the service, and the most important issue is setting its price. In this paper, we present the results obtained with the LightGBM method in the forwarding contracts pricing challenge held as part of the FedCSIS 2022 conference.
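To illustrate the kind of model the abstract refers to, the following is a minimal sketch of fitting a LightGBM regressor to tabular data and predicting prices. It uses synthetic stand-in features and illustrative hyperparameters, not the authors' actual features, preprocessing, or tuning from the challenge.

```python
# Minimal sketch (not the authors' pipeline): train a LightGBM regressor on
# tabular data and evaluate it with mean absolute error. All feature values
# and hyperparameters below are illustrative assumptions.
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Stand-in data: rows = contracts, columns = numeric features
# (e.g. route length, load weight); target = contract price.
X = rng.normal(size=(1000, 8))
y = 50 * X[:, 0] + 20 * X[:, 1] + rng.normal(scale=5, size=1000)

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = lgb.LGBMRegressor(
    n_estimators=500,    # number of boosting rounds (illustrative)
    learning_rate=0.05,
    num_leaves=31,
)
model.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric="l1",    # track mean absolute error on the validation split
)

print("validation MAE:", mean_absolute_error(y_valid, model.predict(X_valid)))
```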

References

  1. T. Hastie, R. Tibshirani, and J. H. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition, ser. Springer Series in Statistics. Springer, 2009. ISBN 9780387848570
  2. L. Breiman, “Arcing the edge,” Technical Report 486, Statistics Department, University of California, Berkeley, 1997.
  3. T. Chen and C. Guestrin, “XGBoost: A scalable tree boosting system,” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ser. KDD ’16. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2939672.2939785 pp. 785–794.
  4. G. Ke, Q. Meng, T. Finley, T. Wang, W. Chen, W. Ma, Q. Ye, and T.-Y. Liu, “LightGBM: A highly efficient gradient boosting decision tree,” in Advances in Neural Information Processing Systems, vol. 30. Curran Associates, Inc., 2017.
  5. A. V. Dorogush, V. Ershov, and A. Gulin, “CatBoost: Gradient boosting with categorical features support,” arXiv preprint arXiv:1810.11363, 2018. http://dx.doi.org/10.48550/ARXIV.1810.11363
  6. S. Makridakis, E. Spiliotis, and V. Assimakopoulos, “M5 accuracy competition: Results, findings, and conclusions,” International Journal of Forecasting, 2022. http://dx.doi.org/10.1016/j.ijforecast.2021.11.013
  7. A. Janusz, A. Jamiołkowski, and M. Okulewicz, “Predicting the costs of forwarding contracts: Analysis of data mining competition results,” in Proceedings of the 17th Conference on Computer Science and Intelligence Systems, FedCSIS 2022, Sofia, Bulgaria, September 4-7, 2022. IEEE, 2022.
  8. “FedCSIS 2022 Challenge: Predicting the costs of forwarding contracts,” https://knowledgepit.ml/fedcsis-2022-challenge/, accessed: 2022-06-20.
  9. C. Wang, Q. Wu, M. Weimer, and E. Zhu, “FLAML: A fast and lightweight AutoML library,” in Proceedings of Machine Learning and Systems (MLSys), 2021.
  10. Q. Wu, C. Wang, and S. Huang, “Frugal optimization for cost-related hyperparameters,” in Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 2021.
  11. C. Wang, Q. Wu, S. Huang, and A. Saied, “Economical hyperparameter optimization with blended search strategy,” in International Conference on Learning Representations (ICLR), 2021.
  12. F. T. Liu, K. M. Ting, and Z.-H. Zhou, “Isolation forest,” in 2008 Eighth IEEE International Conference on Data Mining, 2008. http://dx.doi.org/10.1109/ICDM.2008.17 pp. 413–422.
  13. M. M. Breunig, H.-P. Kriegel, R. T. Ng, and J. Sander, “LOF: Identifying density-based local outliers,” in Proceedings of the 2000 ACM SIGMOD International Conference on Management of Data, ser. SIGMOD ’00. New York, NY, USA: Association for Computing Machinery, 2000. http://dx.doi.org/10.1145/342009.335388 pp. 93–104.
  14. F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay, “Scikit-learn: Machine learning in Python,” Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.