Position Papers of the 17th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 31

An Integrated Checklist for Architecture Design of Critical Software Systems

DOI: http://dx.doi.org/10.15439/2022F287

Citation: Position Papers of the 17th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 31, pages 133–140

Abstract. With the advancement of digitalization, critical information infrastructures, such as intelligent energy distribution, transportation, or healthcare, have opened themselves towards intelligent technological opportunities, including the automation of previously manual decision making. As a side effect, the digitalization of these infrastructures gives rise to new challenges, especially linked to the complexity of their architecture design, which must support the necessary software quality and safeguard the systems against attacks and other harm. To support software architects in the design of these critical software systems, well-structured architectural knowledge would be of great help in preventing the architects from missing crucial concerns that need to be reflected with built-in architectural mechanisms early during architecture design. Given the narrow scope of existing guidelines, which forces architects to browse and combine multiple sources, this paper proposes an integrated checklist covering the breadth of architectural concerns for the design of critical software systems, including the need for built-in mechanisms to prevent, detect, stop, recover from, and analyse intentional as well as unintentional threats to system dependability. Contrary to existing guidelines, which typically focus on runtime incident handling, our checklist is to be used during architecture design to ensure that the system either handles incidents automatically or includes the right mechanisms to support runtime incident handling.
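To illustrate the idea of a design-time checklist organized around the five concern phases named in the abstract (prevent, detect, stop, recover, analyse), the sketch below shows one hypothetical way such a checklist could be represented and audited in code. The phase names come from the abstract; the item texts and all class and method names are illustrative assumptions, not the paper's actual checklist.

```python
from dataclasses import dataclass, field

# The five phases are taken from the abstract; everything else here
# (class names, example items) is a hypothetical illustration.
PHASES = ("prevent", "detect", "stop", "recover", "analyse")

@dataclass
class ChecklistItem:
    phase: str            # one of PHASES
    concern: str          # architectural concern to verify at design time
    addressed: bool = False  # is a built-in mechanism planned for it?

@dataclass
class ArchitectureChecklist:
    items: list = field(default_factory=list)

    def add(self, phase: str, concern: str) -> None:
        if phase not in PHASES:
            raise ValueError(f"unknown phase: {phase}")
        self.items.append(ChecklistItem(phase, concern))

    def open_concerns(self) -> list:
        """Concerns not yet covered by a built-in architectural mechanism."""
        return [i for i in self.items if not i.addressed]

# Example usage with made-up items:
checklist = ArchitectureChecklist()
checklist.add("prevent", "input validation at trust boundaries")
checklist.add("detect", "anomaly detection on critical event logs")
checklist.add("recover", "redundant components with automatic failover")
print(len(checklist.open_concerns()))  # 3
```

A representation like this makes the design review auditable: any item left in `open_concerns()` marks a phase of incident handling for which the architecture has no built-in mechanism yet.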
