
Visual Digital Data, Ethical Challenges and Psychological Science

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Publication status: Published
Journal publication date: 30/01/2024
Journal: American Psychologist
Volume: 79
Issue number: 1
Pages (from-to): 109-122
Number of pages: 14
Original language: English

Abstract

Digital visual data affords psychologists exciting research possibilities. It makes it possible to observe real-life interactions in real time and to analyze this behavior in a fine-grained, systematic manner. However, because faces (and other personally identifying physical characteristics) are captured as part of these data sets, this kind of data is at the highest level of sensitivity by default. When this is combined with the possibility of automatic collection and processing, the sensitivity risks are compounded. Here we explore the ethical challenges facing psychologists who wish to take advantage of digital visual data. Specifically, we discuss ethical considerations around data acquisition, data analysis, data storage, and data sharing. We begin by considering the challenges of securing visual data from both public-space security systems and social media sources. We then explore the dangers of bias and discrimination in automatic data processing, as well as the dangers to human analysts. We set out the ethical requirements for secure data storage, the dangers of ‘function creep’, and the challenges posed by an individual’s right to withdraw from databases. Finally, we consider the tensions between sensitive visual data, which require extra protections, and the recent open science movement, which advocates data transparency and sharing. We conclude by offering a practical route map for tackling these complex ethical issues in the form of a Privacy and Data Protection Impact Assessment (PDPIA) template for researchers.