Challenges in Evaluating OSS Quality: Results from SLR on Quality Evaluation Tools
Aslı Taşgetiren, Ayça Kolukısa Tarhan
DOI: http://dx.doi.org/10.15439/2025F2661
Citation: Proceedings of the 20th Conference on Computer Science and Intelligence Systems (FedCSIS), M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 43, pages 393–398 (2025)
Abstract. Open Source Software (OSS) quality evaluation is essential for ensuring the adoption and effective use of OSS projects across various domains. Most existing work on OSS quality evaluation has focused on developing models or frameworks rather than providing practical implementations. These models and frameworks can be challenging for inexperienced users to apply, which highlights the need for user-friendly tools. This paper presents preliminary findings from a Systematic Literature Review (SLR) that investigates the characteristics, limitations, and gaps of current Quality Evaluation Tools (QETs) for OSS. Our analysis reveals the diversity of quality models underlying these tools and the absence of standardization, which impedes the comparability and reliability of evaluation results. Based on the SLR findings, we outline key challenges and propose a vision for future research focusing on the development of standardized and user-friendly QETs. This work aims to inform the next steps in improving OSS quality evaluation practices.