TY - JOUR
T1 - Reproducibility in lie detection research
T2 - A case study of the cue called complications
AU - Neequaye, David A.
PY - 2025/5/28
Y1 - 2025/5/28
N2 - Purpose: This review examined reproducibility in verbal lie detection research, wherein studies typically involve coding statements to identify deception cues. Such coding is prone to analytic flexibility that can invite false positives. I focused on the cue called complications as a case study. The variable emerged in the literature simultaneously with the availability of open science resources—providing a reasonable expectation that the relevant materials would be archived in accessible repositories if not in the publication. Methods: I reviewed 30 relevant publications to assess whether complications research is amenable to auditing. Results: The findings indicated sufficient consistency in the definitions of complications and little ambiguity regarding what the variable denotes. Additionally, numerical estimates indicated that the extant results in the literature might be replicable—but with a significant caveat. Such replicability entirely depends on acquiring the coding protocols and anonymized raw data of published studies. However, that critical information is not publicly available. I discuss the ramifications of this barrier to reproducibility: it prevents the auditing of published findings, which allows explaining null findings away with post hoc explanations that depend on inaccessible information. Conclusions: At a minimum, journal editors and reviewers must insist on the codebooks of coding protocols. Providing the corresponding anonymized raw data should also be a requirement unless specific obstructions like grant agreements prevent data sharing. The nature of verbal lie detection research necessitates this policy.
AB - Purpose: This review examined reproducibility in verbal lie detection research, wherein studies typically involve coding statements to identify deception cues. Such coding is prone to analytic flexibility that can invite false positives. I focused on the cue called complications as a case study. The variable emerged in the literature simultaneously with the availability of open science resources—providing a reasonable expectation that the relevant materials would be archived in accessible repositories if not in the publication. Methods: I reviewed 30 relevant publications to assess whether complications research is amenable to auditing. Results: The findings indicated sufficient consistency in the definitions of complications and little ambiguity regarding what the variable denotes. Additionally, numerical estimates indicated that the extant results in the literature might be replicable—but with a significant caveat. Such replicability entirely depends on acquiring the coding protocols and anonymized raw data of published studies. However, that critical information is not publicly available. I discuss the ramifications of this barrier to reproducibility: it prevents the auditing of published findings, which allows explaining null findings away with post hoc explanations that depend on inaccessible information. Conclusions: At a minimum, journal editors and reviewers must insist on the codebooks of coding protocols. Providing the corresponding anonymized raw data should also be a requirement unless specific obstructions like grant agreements prevent data sharing. The nature of verbal lie detection research necessitates this policy.
KW - veracity
KW - replication
KW - lie detection
KW - complications
KW - reproducibility
U2 - 10.1111/lcrp.12315
DO - 10.1111/lcrp.12315
M3 - Journal article
JO - Legal and Criminological Psychology
JF - Legal and Criminological Psychology
SN - 1355-3259
ER -