Proceedings of the 19th Conference on Computer Science and Intelligence Systems (FedCSIS)

Annals of Computer Science and Information Systems, Volume 39

Toward a Framework for Determining Methods of Evaluation in Design Science Research

DOI: http://dx.doi.org/10.15439/2024F7208

Citation: Proceedings of the 19th Conference on Computer Science and Intelligence Systems (FedCSIS), M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 39, pages 231–236 (2024)

Abstract. Evaluation is a key phase of design science research, particularly in design-oriented information systems research, which analyzes and solves problems to create artifacts. Because the nature of these artifacts varies with the problem, they call for different methods of evaluation, and selecting a suitable method first requires identifying appropriate evaluation criteria. This paper pinpoints those criteria through a systematic literature review, focusing on the criteria used to evaluate artifacts, their frequency, their significance, and their connection to specific methods of evaluation. From the findings, the paper derives a framework for choosing the most suitable methods of evaluation based on the defined criteria, which can enhance the rigor and relevance of the evaluation phase in design science research.
