
Proceedings of the 18th Conference on Computer Science and Intelligence Systems

Annals of Computer Science and Information Systems, Volume 35

Inscrutability versus Privacy and Automation versus Labor in Human-Centered AI: Approaching Ethical Paradoxes and Directions for Research

DOI: http://dx.doi.org/10.15439/2023F7504

Citation: Proceedings of the 18th Conference on Computer Science and Intelligence Systems, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 35, pages 1101–1105 (2023)

Abstract. Starting from an analysis of ethical paradoxes that arise from the inscrutability of AI algorithms and from recent advances in the field, this paper emphasizes the urgency of dedicating research to the potential consequences for societal organization and interactions. Drawing on Critical Theory, which needs to be recombined with other socio-technical theories, new perspectives on future research are offered and discussed in light of privacy and the labor market, their mutual influence, and their limitations.
