Facial expression aftereffect and its association with autistic traits in obscured faces


Bibliographic Details
Main Author: Luo, Chengwen
Other Authors: Xu Hong
Format: Theses and Dissertations
Language: English
Published: 2017
Online Access:http://hdl.handle.net/10356/71113
Institution: Nanyang Technological University
Description
Summary: Objects we see in daily life are often occluded and rarely appear in full form. The present study investigates how people recognize emotions from partially occluded faces. Previous studies have shown that prolonged exposure to a complete emotional face biases the perception of emotion in subsequent faces, the facial expression aftereffect (FEA). However, less is known about the influence of an obscured emotional face on the perception of subsequent facial emotions. People with autism spectrum disorder (ASD) have been found to have a compromised ability to process facial emotions. The autistic traits typically exhibited by people with ASD are also present, to milder degrees, in neurotypical (NT) individuals; however, the association between autistic traits and the perception of facial emotion in NT individuals is understudied. The present thesis aims to bridge this gap by addressing the following questions: (1) Can the perception of facial emotion be biased by prolonged exposure to partially obscured emotional faces? (2) How are autistic traits associated with the perception of facial emotion from these faces? (3) What are the possible neural mechanisms of the association between autistic traits and FEAs? Three studies were carried out to address these questions.
Study 1 addressed the first question. The recognizability of the facial emotion was manipulated by partially occluding different areas of the faces. We then manipulated the dynamics, location, and size of the preceding face (the adaptor) to infer the contributions of local versus high-level information to the perception of facial emotion. Results indicate that in the static conditions, significant FEAs were observed only when participants could recognize the emotion conveyed by the adaptors (Experiment 1.1). In contrast, dynamic facial videos generated significant FEAs regardless of recognizability (Experiment 1.2). The FEAs also survived a locational shift of the adaptor, suggesting high-level adaptation of facial emotion and possible activation of both dorsal and subcortical pathways (Experiment 1.3).
The second study tested the association between autistic traits and FEAs. In Experiment 2.1, we occluded the adapting faces with aligned or misaligned bars and measured participants' autistic traits with the Autism-Spectrum Quotient (AQ). Both conditions produced significant FEAs, and FEA magnitude was negatively correlated with autistic traits. In Experiment 2.2, we manipulated the level of holistic perception by flickering the viewable facial parts with different degrees of synchronization. Results again showed significant FEAs in all conditions; however, the association between autistic traits and FEAs was abolished.
The third study further explored the association between AQ and FEAs for obscured emotional faces using simultaneous behavioral and EEG recordings. Results showed that N170 suppression was stronger when the preceding adaptors could be perceived more holistically. We observed a positive correlation between autistic traits and N170 suppression, suggesting that high autistic traits were associated with reduced holistic perception of obscured faces. The late positive potential (LPP), an indicator of emotion processing, was suppressed when the emotion of the test face was the same as that of the adaptor. The LPP suppression of the test faces was largest when the adaptors had the strongest completion cues. Autistic traits were negatively associated with the perceived emotional intensity of obscured faces with intermediate completion cues.
Taken together, the thesis has investigated the perception of facial emotion from partially occluded faces in neurotypical individuals with a range of autistic traits. Results show that prolonged exposure to obscured faces produces an FEA, although the magnitude of the FEA depends on the level of autistic traits. Higher autistic traits are associated with a smaller perceptual bias after adapting to static, partially obscured faces. Possible explanations are that people with high AQ have a compromised ability to integrate the viewable facial parts into a holistic construct and a reduced perception of emotional intensity. To the best of our knowledge, this is the first study to examine emotion adaptation in NT individuals with a range of autistic traits. In addition, we used both behavioral and physiological approaches, which provide a more comprehensive understanding of the perception of facial emotion under impoverished conditions. It also sheds light on possible training schemes that could help improve the quality of life of people with high autistic traits.