
Electronic data

  • 1-s2.0-S0028393216302822-main

    Rights statement: This is the author’s version of a work that was accepted for publication in Neuropsychologia. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neuropsychologia, 91, 2016 DOI: 10.1016/j.neuropsychologia.2016.07.038

    Accepted author manuscript, 1.1 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI:


Coherent emotional perception from body expressions and the voice

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published
Journal publication date: 10/2016
Journal: Neuropsychologia
Volume: 91
Number of pages: 10
Pages (from-to): 99-108
Publication status: Published
Early online date: 30/07/16
Original language: English

Abstract

Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others’ emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to modality congruency, the P2 was sensitive to emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects differed, showing facilitation or suppression depending on the preceding context. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing, whereby the N1 is involved in cross-modal processing, whereas the P2 is related to assessing a unified perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.
