Electronic data

  • 2017Peiwenphd

    Final published version, 3 MB, PDF document

    Available under license: CC BY-NC-ND

Understanding the multimodal integration of emotional information during development

Research output: Thesis › Doctoral Thesis

Publication date: 2017
Number of pages: 187
Awarding institution: Lancaster University
Award date: 13/06/2014
Original language: English


In recent years, body expressions have been demonstrated to be effective visual cues for conveying emotional information. In daily life, we usually perceive others' emotions through multiple modalities, such as the face, voice and touch. It is therefore important to understand how we perceive emotional cues as a coherent percept rather than as separate percepts. Using behavioral measurements, previous studies have provided evidence that combining multiple emotional cues supports more accurate and rapid discrimination of emotional content. However, little research has focused on the integration of emotion perception from body expressions combined with information from other modalities, especially during development. The aim of this thesis was therefore to investigate developmental changes in the neural activity underlying the integration of emotion perception from body expressions and the voice. In Chapter 1, the literature on multisensory processing in infants and children was reviewed and the objectives of the thesis were described. In Chapter 2, processing under unisensory conditions (sounds or body expressions) was compared with audiovisual conditions (body expressions with sounds) in adults. In Chapter 3, the influence of the type of body presentation (dynamic or static) on emotion processing was examined in an audiovisual paradigm with adults. In Chapter 4, 6.5-month-old infants processed emotional information in a paradigm derived from Chapter 3. In Chapter 5, audiovisual emotion perception was examined in 5-6-year-old children. These studies showed separate processing for interactions between visual and auditory perceptual sources and for the assessment of combined emotional content across the three age groups. This series of studies also revealed maturational changes in the neural correlates of audiovisual emotion processing, in ERP components indexing perception and cognition.
A final chapter explores the implications of these findings for understanding audiovisual emotion processing from a developmental perspective.