
Proceedings of the 20th Conference on Computer Science and Intelligence Systems (FedCSIS)

Annals of Computer Science and Information Systems, Volume 43

Evaluating Effectiveness of Nonlinear Dimensionality Reduction in Hedge Funds’ Returns Forecasting


DOI: http://dx.doi.org/10.15439/2025F3970

Citation: Proceedings of the 20th Conference on Computer Science and Intelligence Systems (FedCSIS), M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 43, pages 411–416


Abstract. Hedge funds (HF) are actively managed investment vehicles that employ diverse and often complex strategies. Accurate return forecasting is essential for optimizing their performance and managing risk. This paper investigates the application of nonlinear feature extraction (FE) methods to forecasting HF strategy performance, building upon prior work in financial time series analysis. We evaluate the effect of Kernel Principal Component Analysis (KPCA), t-Distributed Stochastic Neighbor Embedding (t-SNE), Uniform Manifold Approximation and Projection (UMAP), and autoencoders on the predictive performance of machine learning models. The extracted features are fed into several forecasting models: a Support Vector Machine (SVM) with linear and nonlinear kernels, a Neural Network (NN), and Extreme Gradient Boosting (XGB), to predict the returns of five diverse HF investment strategies: Commodity Trading Advisors, Equity Long Short, Equity Market Neutral, Fixed Income Arbitrage, and Global Macro. The results demonstrate that nonlinear FE methods, particularly autoencoders and KPCA combined with an NN, significantly outperform other techniques. Our findings highlight the value of nonlinear transformations in enhancing predictive accuracy for HF return time series.
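To make the first stage of the pipeline the abstract describes concrete, the sketch below implements RBF-kernel KPCA from scratch in NumPy: a Gaussian kernel matrix is built over the return observations, centered in feature space, and eigendecomposed, and the leading components are returned as extracted features that a downstream forecaster (SVM, NN, or XGB) could consume. This is an illustrative sketch only; the synthetic return matrix, the kernel width `gamma`, and the component count are assumptions, not values or data from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Pairwise squared Euclidean distances mapped through the Gaussian kernel.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto its leading kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix in feature space: Kc = (I - 1/n) K (I - 1/n).
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # eigh returns eigenvalues in ascending order; take the largest ones.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors by sqrt(eigenvalue) to obtain the projections.
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

# Hypothetical data: 120 monthly observations of 10 return-related series.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.02, size=(120, 10))
features = kernel_pca(returns, n_components=3, gamma=0.5)
print(features.shape)  # (120, 3)
```

The `features` matrix would then replace the raw inputs when training the forecasting model; the other methods studied (t-SNE, UMAP, autoencoders) slot into the same position in the pipeline.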

References

  1. HFR Industry Reports. [Online]. Available: www.hfr.com/hfr-industry-reports/
  2. W. Wu, J. Chen, Z. Yang, and M. L. Tindall, “A cross-sectional machine learning approach for hedge fund return prediction and selection,” Manag. Sci., vol. 67, no. 7, pp. 4577–4601, 2021. https://doi.org/10.1287/mnsc.2020.3696
  3. M. Ashraf et al., “A survey on dimensionality reduction techniques for time-series data,” IEEE Access, vol. 11, pp. 42909–42923, 2023. https://doi.org/10.1109/ACCESS.2023.3269693
  4. R. Zaheer, M. K. Hanif, M. U. Sarwar, and R. Talib, “Evaluating the effectiveness of dimensionality reduction on machine learning algorithms in time series forecasting,” IEEE Access, 2025. https://doi.org/10.1109/ACCESS.2025.3551741
  5. F. Anowar, S. Sadaoui, and B. Selim, “Conceptual and empirical comparison of dimensionality reduction algorithms (PCA, KPCA, LDA, MDS, SVD, LLE, ISOMAP, LE, ICA, t-SNE),” Comput. Sci. Rev., vol. 40, 100378, 2021. https://doi.org/10.1016/j.cosrev.2021.100378
  6. IEEE Xplore. [Online]. Available: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10929010
  7. A. M. Lopes and J. A. T. Machado, “Dynamical analysis of the Dow Jones index using dimensionality reduction and visualization,” Entropy, vol. 23, no. 5, p. 600, 2021. https://doi.org/10.3390/e23050600
  8. Z. Wang, H. Chen, X. Yang, J. Wan, T. Li, and C. Luo, “Fuzzy rough dimensionality reduction: a feature set partition-based approach,” Inf. Sci., vol. 644, Art. no. 119266, 2023.
  9. D. Kumar, P. K. Sarangi, and R. Verma, “A systematic review of stock market prediction using machine learning and statistical techniques,” Mater. Today Proc., vol. 49, pp. 3187–3191, 2022. https://doi.org/10.1016/j.matpr.2020.11.399
  10. S. Gu, B. Kelly, and D. Xiu, “Empirical asset pricing via machine learning,” Rev. Financ. Stud., vol. 33, no. 5, pp. 2223–2273, 2020. https://doi.org/10.1093/rfs/hhaa009
  11. C. Lin, “Key financial indicators analysis and stock trend forecasting based on a wrapper feature selection method,” in 2024 19th Conf. on Computer Science and Intelligence Systems (FedCSIS), Belgrade, Serbia, Sep. 2024, pp. 755–759.
  12. A. Radosavcevic, A. Poledica, and I. Antovic, “Discovery of key factors in hedge funds investment strategies using optimal IBA-based logical polynomials,” in Proc. Int. Conf. Inf. Process. Manage. Uncertainty Knowl.-Based Syst., Cham, Switzerland: Springer, 2024, pp. 335–346. https://doi.org/10.1007/978-3-031-74000-8_28
  13. B. M. S. Hasan and A. M. Abdulazeez, “A review of principal component analysis algorithm for dimensionality reduction,” J. Soft Comput. Data Mining, vol. 2, no. 1, pp. 20–30, 2021. https://doi.org/10.30880/jscdm.2021.02.01.003
  14. L. J. Cao et al., “A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine,” Neurocomputing, vol. 55, no. 1–2, pp. 321–336, 2003. https://doi.org/10.1016/S0925-2312(03)00433-8
  15. R. Silva and P. Melo-Pinto, “t-SNE: A study on reducing the dimensionality of hyperspectral data for the regression problem of estimating oenological parameters,” Artif. Intell. Agric., vol. 7, pp. 58–68, 2023. https://doi.org/10.1016/j.aiia.2023.02.003
  16. M. W. Dorrity, L. M. Saunders, C. Queitsch, S. Fields, and C. Trapnell, “Dimensionality reduction by UMAP to visualize physical and genetic interactions,” Nat. Commun., vol. 11, no. 1, p. 1537, 2020. https://doi.org/10.1038/s41467-020-15351-4
  17. Y. Wang, H. Huang, C. Rudin, and Y. Shaposhnik, “Understanding how dimension reduction tools work: An empirical approach to deciphering t-SNE, UMAP, TriMAP, and PaCMAP for data visualization,” J. Mach. Learn. Res., vol. 22, no. 201, pp. 1–73, 2021.
  18. B. Ghojogh et al., “Feature selection and dimensionality reduction in pattern analysis: A literature review,” arXiv preprint arXiv:1905.02845, 2019. https://doi.org/10.48550/arXiv.1905.02845
  19. Q. Fournier and D. Aloise, “Empirical comparison between autoencoders and traditional dimensionality reduction methods,” in Proc. 2nd IEEE Int. Conf. Artif. Intell. Knowl. Eng. (AIKE), Jun. 2019, pp. 211–214. https://doi.org/10.1109/AIKE.2019.00044
  20. E. F. Fama and K. R. French, “Common risk factors in the returns on stocks and bonds,” J. Financ. Econ., vol. 33, no. 1, pp. 3–56, 1993. https://doi.org/10.1016/0304-405X(93)90023-5