
Electronic data

  • 1-s2.0-S0028393216302822-main

    Rights statement: This is the author’s version of a work that was accepted for publication in Neuropsychologia. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neuropsychologia, 91, 2016 DOI: 10.1016/j.neuropsychologia.2016.07.038

    Accepted author manuscript, 1.1 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI: 10.1016/j.neuropsychologia.2016.07.038


Coherent emotional perception from body expressions and the voice

Research output: Contribution to journal › Journal article

Published

Standard

Coherent emotional perception from body expressions and the voice. / Yeh, Pei-Wen; Geangu, Elena; Reid, Vincent Michael.

In: Neuropsychologia, Vol. 91, 10.2016, p. 99-108.

Research output: Contribution to journal › Journal article

Bibtex

@article{eae5e0e5e78244b0b117b54371fc59d2,
title = "Coherent emotional perception from body expressions and the voice",
abstract = "Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others{\textquoteright} emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with the auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to the modality congruency, the P2 was sensitive to the emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects was different in terms of facilitation or suppression based on the preceding contexts. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing whereby N1 is involved in cross-modal processing, whereas P2 is related to assessing a unifying perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.",
keywords = "EEG/ERP, audiovisual processing, body expressions, emotion, congruency, cross-modal prediction",
author = "Pei-Wen Yeh and Elena Geangu and Reid, {Vincent Michael}",
note = "This is the author{\textquoteright}s version of a work that was accepted for publication in Neuropsychologia. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neuropsychologia, 91, 2016 DOI: 10.1016/j.neuropsychologia.2016.07.038",
year = "2016",
month = oct,
doi = "10.1016/j.neuropsychologia.2016.07.038",
language = "English",
volume = "91",
pages = "99--108",
journal = "Neuropsychologia",
issn = "0028-3932",
publisher = "Elsevier Limited",

}

RIS

TY - JOUR

T1 - Coherent emotional perception from body expressions and the voice

AU - Yeh, Pei-Wen

AU - Geangu, Elena

AU - Reid, Vincent Michael

N1 - This is the author’s version of a work that was accepted for publication in Neuropsychologia. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neuropsychologia, 91, 2016 DOI: 10.1016/j.neuropsychologia.2016.07.038

PY - 2016/10

Y1 - 2016/10

N2 - Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others’ emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with the auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to the modality congruency, the P2 was sensitive to the emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects was different in terms of facilitation or suppression based on the preceding contexts. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing whereby N1 is involved in cross-modal processing, whereas P2 is related to assessing a unifying perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.

AB - Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others’ emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with the auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to the modality congruency, the P2 was sensitive to the emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects was different in terms of facilitation or suppression based on the preceding contexts. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing whereby N1 is involved in cross-modal processing, whereas P2 is related to assessing a unifying perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.

KW - EEG/ERP

KW - audiovisual processing

KW - body expressions

KW - emotion

KW - congruency

KW - cross-modal prediction

U2 - 10.1016/j.neuropsychologia.2016.07.038

DO - 10.1016/j.neuropsychologia.2016.07.038

M3 - Journal article

VL - 91

SP - 99

EP - 108

JO - Neuropsychologia

JF - Neuropsychologia

SN - 0028-3932

ER -