New chapter examines ‘dark side’ of emotional AI in emerging technologies
Emotions play a key role in decision-making, but data protection and privacy law frameworks don’t adequately address their interface with artificial intelligence.

It’s one of the final frontiers in the world of artificial intelligence (AI), separating humans from machines: our emotions. However, some suggest that this gap is closing, with significant implications for the law and privacy.

‘Emotional AI’ refers to a technological subset that measures, simulates and reacts to human emotions. It’s also the focus of a chapter by Dr Damian Clifford, a Senior Lecturer at The Australian National University (ANU) College of Law, in the new book Future Law: Emerging Technology, Regulation and Ethics.

The chapter, titled ‘Citizen-consumers in a personalised Galaxy: Emotion influenced decision-making, a true path to the dark side?’, explores the emergence of technologies purportedly capable of detecting, classifying and responding to users’ emotional lives and thereby appearing to understand their audience.

“In particular, the chapter looks at the emergence of these technologies from a consumer-facing perspective, taking their claimed capacity somewhat at face value, so as to assess the capacity of the current legislative framework in the European Union to respond to the risks posed by such innovations,” said Dr Clifford.

“In doing so, the chapter analyses the role of data protection and privacy law (i.e. in particular, the EU General Data Protection Regulation (GDPR) and the ePrivacy Directive), but also the EU consumer law acquis (e.g. the Unfair Contract Terms and Unfair Commercial Practices directives). The chapter identifies weaknesses in the framework in terms of the effects emotionally-aware personalisation may have on the rational decision-making capacity of consumers in light of the mediating effects of technology.”

The chapter builds on Dr Clifford’s PhD research, which focused on the monetisation of emotion through developments in affective computing. His research also builds on “the wave of interest in AI at every level”, he added. In particular, it aims to better inform the policy debates and legal discussions regarding how existing frameworks actually apply.

“There have been widespread discussions regarding AI ethics and, connectedly, how we should regulate AI. The chapter illustrates through its narrowed scope that we already have a lot of law to apply to such developments and that we need to understand how this applies,” he explained.

Dr Clifford’s research in this field comes at an important time in Australia. He noted his chapter “pokes holes in some of the key legislative framework” when it is applied to the emotional AI case study.

“For instance, in data privacy legislation, such as the EU General Data Protection Regulation or the Privacy Act 1988 (Cth) here in Australia, there are certain categories of information that are afforded extra protection (i.e. so-called sensitive personal data/information) because of their sensitivity. Here, we can think of health data as an example,” he explained.

“However, things get complex when you consider information about someone’s emotions; for example, is such information to be understood as health data, given that insights into someone’s emotional state over time may reveal information about their mental health by inference? And if not, where should the lines be drawn? Is it linked to the purpose the information is used for, and what separates health care from pseudo-health care products or services?

“In addition, I think the paper is important as it analyses some of the core assumptions within the law relating to the assumed rationality/capacity of consumers to make informed choices. In doing so, the chapter questions the place of emotion in the law and how such developments may push the notion of the ‘average’ consumer further and further from the rational market actor paradigm imbued in the law,” he added.

Dr Clifford has helped develop the ANU Ninian Stephen Cyber Law program, an online professional development course delivered by the ANU College of Law. He is also a research fellow with Humanising Machine Intelligence, a multidisciplinary project funded by the ANU Grand Challenges scheme.

Dr Clifford is currently working on a book based on his PhD research due for release in 2022.

Learn more about law and technology research at the ANU College of Law.

Updated: 10 August 2015