

We have to talk about Emotional AI and crime

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

We have to talk about Emotional AI and crime. / Podoletz, Lena.
In: AI and Society, Vol. 38, 30.06.2023, p. 1067-1082.


Harvard

Podoletz, L 2023, 'We have to talk about Emotional AI and crime', AI and Society, vol. 38, pp. 1067-1082. https://doi.org/10.1007/s00146-022-01435-w

APA

Podoletz, L. (2023). We have to talk about Emotional AI and crime. AI and Society, 38, 1067-1082. https://doi.org/10.1007/s00146-022-01435-w

Vancouver

Podoletz L. We have to talk about Emotional AI and crime. AI and Society. 2023 Jun 30;38:1067-1082. Epub 2022 May 5. doi: 10.1007/s00146-022-01435-w

Author

Podoletz, Lena. / We have to talk about Emotional AI and crime. In: AI and Society. 2023; Vol. 38, pp. 1067-1082.

BibTeX

@article{f5973af22caf40a989c0dd50df885ea5,
  title = "We have to talk about Emotional AI and crime",
  abstract = "Emotional AI is an emerging technology used to make probabilistic predictions about the emotional states of people using data sources, such as facial (micro)-movements, body language, vocal tone or the choice of words. The performance of such systems is heavily debated and so are the underlying scientific methods that serve as the basis for many such technologies. In this article I will engage with this new technology, and with the debates and literature that surround it. Working at the intersection of criminology, policing, surveillance and the study of emotional AI this paper explores and offers a framework of understanding the various issues that these technologies present particularly to liberal democracies. I argue that these technologies should not be deployed within public spaces because there is only a very weak evidence-base as to their effectiveness in a policing and security context, and even more importantly represent a major intrusion to people{\textquoteright}s private lives and also represent a worrying extension of policing power because of the possibility that intentions and attitudes may be inferred. Further to this, the danger in the use of such invasive surveillance for the purpose of policing and crime prevention in urban spaces is that it potentially leads to a highly regulated and control-oriented society. I argue that emotion recognition has severe impacts on the right to the city by not only undertaking surveillance of existing situations but also making inferences and probabilistic predictions about future events as well as emotions and intentions.",
  keywords = "Automated affect recognition, Emotional AI, Crime, Policing, Crime prevention, Surveillance",
  author = "Lena Podoletz",
  year = "2023",
  month = jun,
  day = "30",
  doi = "10.1007/s00146-022-01435-w",
  language = "English",
  volume = "38",
  pages = "1067--1082",
  journal = "AI and Society",
  issn = "0951-5666",
  publisher = "Springer London",
}

RIS

TY  - JOUR
T1  - We have to talk about Emotional AI and crime
AU  - Podoletz, Lena
PY  - 2023/6/30
Y1  - 2023/6/30
N2  - Emotional AI is an emerging technology used to make probabilistic predictions about the emotional states of people using data sources, such as facial (micro)-movements, body language, vocal tone or the choice of words. The performance of such systems is heavily debated and so are the underlying scientific methods that serve as the basis for many such technologies. In this article I will engage with this new technology, and with the debates and literature that surround it. Working at the intersection of criminology, policing, surveillance and the study of emotional AI this paper explores and offers a framework of understanding the various issues that these technologies present particularly to liberal democracies. I argue that these technologies should not be deployed within public spaces because there is only a very weak evidence-base as to their effectiveness in a policing and security context, and even more importantly represent a major intrusion to people’s private lives and also represent a worrying extension of policing power because of the possibility that intentions and attitudes may be inferred. Further to this, the danger in the use of such invasive surveillance for the purpose of policing and crime prevention in urban spaces is that it potentially leads to a highly regulated and control-oriented society. I argue that emotion recognition has severe impacts on the right to the city by not only undertaking surveillance of existing situations but also making inferences and probabilistic predictions about future events as well as emotions and intentions.
AB  - Emotional AI is an emerging technology used to make probabilistic predictions about the emotional states of people using data sources, such as facial (micro)-movements, body language, vocal tone or the choice of words. The performance of such systems is heavily debated and so are the underlying scientific methods that serve as the basis for many such technologies. In this article I will engage with this new technology, and with the debates and literature that surround it. Working at the intersection of criminology, policing, surveillance and the study of emotional AI this paper explores and offers a framework of understanding the various issues that these technologies present particularly to liberal democracies. I argue that these technologies should not be deployed within public spaces because there is only a very weak evidence-base as to their effectiveness in a policing and security context, and even more importantly represent a major intrusion to people’s private lives and also represent a worrying extension of policing power because of the possibility that intentions and attitudes may be inferred. Further to this, the danger in the use of such invasive surveillance for the purpose of policing and crime prevention in urban spaces is that it potentially leads to a highly regulated and control-oriented society. I argue that emotion recognition has severe impacts on the right to the city by not only undertaking surveillance of existing situations but also making inferences and probabilistic predictions about future events as well as emotions and intentions.
KW  - Automated affect recognition
KW  - Emotional AI
KW  - Crime
KW  - Policing
KW  - Crime prevention
KW  - Surveillance
U2  - 10.1007/s00146-022-01435-w
DO  - 10.1007/s00146-022-01435-w
M3  - Journal article
VL  - 38
SP  - 1067
EP  - 1082
JO  - AI and Society
JF  - AI and Society
SN  - 0951-5666
ER  -