Selected Papers from the 2019 conference of the Australian Institute of Computer Ethics (AiCE)

2020-06-08

Warren, M., Wahlstrom, K., Wigan, M., & Burmeister, O. K. (2020). Preface: Ethics in the Cyber Age and exploring emerging themes and relationships between ethics, governance and emerging technologies. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2889

Wigan, M. (2020). Rethinking IT Professional Ethics: Classical and Current Contexts. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2851

Abstract
Professional computer ethics has widened its scope over the last 20 years as a direct result of the massive growth in computer-mediated services by government and industry, and concerns over how data and interaction processes are recorded. These shifts are explored in conjunction with the parallel decline in community trust of government. The growing importance of a broader view and action framework for professional computer societies is delineated.

Robinson, B. (2020). Towards an Ontology and Ethics of Virtual Influencers. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2807

Abstract
In 2018, TIME magazine named Miquela Souza one of the 25 most influential people on the internet, despite the fact that she is not a person at all. Miquela is the first digitally created virtual influencer. This paper provides an initial analysis of some of the ontological and ethical issues associated with the rise of virtual influencers on social media platforms like Instagram. Through a focus on Miquela, it is argued that while these fabricated identities may cause uneasiness at first, there is nothing morally significant that distinguishes them from natural, ‘real life’ influencers. But, far from ‘business as usual’, the inability to separate ‘virtual’ and ‘real life’ influencers raises important questions about the ethical construction of identity, and how this may affect the ongoing preservation of social values like trust in online spaces. The paper draws on literature in personal identity and agency theory to establish the ontological claim that there is no meaningful difference between Miquela and other ‘real life’ influencers, which leads to a discussion of ethical issues including moral responsibility and motivation, and transparency. As of May 2020, this appears to be the first peer-reviewed article theorising about virtual influencers. There are significant opportunities for further research, both in terms of how we should conceptualise these identities and in terms of more empirically based social research into how to preserve social values like trust in online spaces.

Warren, M. (2020). Fake News Case Study during the Australian 2019 General Election. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2803

Abstract
Social media is used by all aspects of society, from citizens to businesses, but it is also now used by political parties. Political parties use social media to engage with voters as a method of attracting new voters or reinforcing the views of their current supporters. An important consideration is the ethical conduct of political parties and politicians in how they use social media. It is now recognized that social media can also have negative aspects, as seen in the rise of Fake News. These negative aspects of social media are often overlooked and have not been explored from a research perspective. This paper looks at the Australian 2019 General Election and discusses a major Fake News example that occurred during that election. The paper also describes the different types of social media data collected during the study, presents the analysis of the data collected, and discusses the research findings, including the ethical issues.

Wahlstrom, K., Ul-haq, A., & Burmeister, O. (2020). Privacy by design: a Holochain exploration. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2801

Abstract
Privacy is important because it supports freedom, dignity, autonomy, justice, and democracy, and therefore it is important that privacy is studied in ontologically robust ways. A form of privacy is implemented in the right to be forgotten, which is a human right established by the European Court of Justice. Blockchain and Holochain are examples of recently emerged technologies that were shaped by, and are now shaping, the social contexts in which economic transactions may occur. The right to be forgotten represents a compliance challenge for public and private implementations of blockchain technology. This paper describes a few of these challenges.

Wang, L. (2020). The Three Harms of Gendered Technology. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2799

Abstract
Marginalised groups experience both immediate and long-term detriment as a result of innovations in information systems. This paper explores three facets of technologically related gendered harm: physical, institutional, and psychological. These harms will be demonstrated by case studies. Firstly, technology can cause physical harm by denying women their bodily autonomy, demonstrated by the public availability of AI software that generates nude pictures of women, and smart home devices used in instances of domestic abuse. Secondly, technology can deny women institutional access, as increasingly widespread algorithms are shown to underperform on marginalised groups. Thirdly, anthropomorphised technology reflects and entrenches harmful stereotypes of women’s submissiveness, causing psychological harm. Reducing harm must go beyond ensuring a diversity of representation in STEM fields. We conclude that effective regulation should focus on the design features in technological innovations.

Wildenauer, M. (2020). The Shared Responsibility Model: Levers of Influence and Loci of Control to aid Regulation of Ethical Behaviour in Technology Platform Companies. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2797

Abstract
This exploratory paper provides social context for platform corporations and examples of their ethical transgressions, and then canvasses the role of various organizational actors in controlling the ethical behaviour of ‘platforms’, which may be more than usually problematic in this regard. From this survey, the conclusion is drawn that there may be no single actor that offers sufficient leverage to change organizational ethical behaviour. The paper then suggests the Shared Responsibility Model as a possible conceptual framework for a better understanding of the issue of ethical control, and recommends practical interventions that may assist in realizing ethical behaviour by platforms that more closely aligns with societal expectations. The paper offers a caution about the side-effects of interventions to improve ethical behaviour, before concluding by pointing out the implications of these findings for state-actor regulators and avenues for future research.

Kaluarachchi, C., Warren, M., & Jiang, F. (2020). Review: Responsible use of technology to combat Cyberbullying among adolescents. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2791

Abstract
Cyberbullying has become a major challenge for authorities, parents, guardians and schools in particular, especially in the digital era. This paper reviews available empirical research to examine issues such as the responsible use of technology amongst young people, and the responsibility of parents and schools to protect against Cyberbullying. The analysis revealed that the responsible use of technology encourages better practices in response to these new digital technologies. Parents and educators are key to Cyber ethics; therefore, teaching the responsible use of technology, whilst focusing on Cyber ethics at the start of young people’s exposure to technology use, may be an excellent strategy to reduce the growth and impact of Cyberbullying. The paper also reviews good practices for young people, school communities and parents to prevent and manage Cyberbullying and unethical behaviours online. These claims are examined using current literature to ensure a better understanding of the responsible use of technology and of Cyberbullying, in order to support young people in combating this emerging societal challenge.

Fernando, A., & Scholl, L. (2020). Towards Using Value Tensions to Reframe the Value of Data Beyond Market-based Online Social Norms. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2793

Abstract
Making sense of data, its value and impact is imperative for individuals, organisations and societies to function in the cyber age. The online interactions through which data flows present many benefits. However, the consumption of data and its value is problematic due to an overreliance on market norms as a substitute for values-based online social norms and practices, creating value tensions. Understanding the implications of data is further complicated due to the complex contextual nature of online interactions. These challenges are addressed through efforts from technology organisations and policy initiatives. Largely absent from these efforts is an understanding of the values needed to ground healthy online social interactions, and processes that nurture and afford the practice of these values in contextual community settings. Value tensions, as an ethics tool, can surface and clarify these interpersonal needs in understanding data and its impact. Communities may be appropriately placed to grapple with these value tensions given the contextual nature of interactions. This discussion paper presents a preliminary research agenda, raising questions about how to uncover value tensions and understand the values at stake in order to transform data practices, develop healthy online social norms, and reframe the value of data beyond market-based norms.

Poulsen, A., Fosch-Villaronga, E., & Burmeister, O. K. (2020). Cybersecurity, value sensing robots for LGBTIQ+ elderly, and the need for revised codes of conduct. Australasian Journal of Information Systems, 24. https://doi.org/10.3127/ajis.v24i0.2789

Abstract
Until now, each profession has developed its professional codes of conduct independently. However, the use of robots and artificial intelligence is blurring professional delineations: aged care nurses work with lifting robots, tablet computers, and intelligent diagnostic systems, and health information system designers work with clinical teams. While robots assist medical staff in extending the professional service they provide, it is not clear how professions adhere to and adapt to this new reality. In this article, we reflect on how the introduction of robots may shape codes of conduct, in particular with regard to cybersecurity. We do so by focusing on the use of social robots to help LGBTIQ+ elderly cope with loneliness and depression. Using robots in such a delicate domain of application changes how care is delivered, as alongside the caregiver there is now a cyber-physical health information system that can learn from experience and act autonomously. Our contribution stresses the importance of including cybersecurity considerations in codes of conduct for both robot developers and caregivers, as it is the human and not the machine that is responsible for ensuring the system’s security and the user’s safety.