Seeing sounds around you: non-linguistic visual information and speech perception
Main Author:
Other Authors:
Format: Theses and Dissertations
Language: English
Published: 2016
Subjects:
Online Access: http://hdl.handle.net/10356/67032
Institution: Nanyang Technological University
Summary: The role of visual information in speech perception has not been adequately addressed in the current exemplar-based approach, despite evidence showing that speech perception is multimodal in nature. This study investigates the role of visual information in speech perception by examining the co-encoding of visual and auditory information. In a lexical decision paradigm, participants were repeatedly exposed to audio-visual targets consisting of sound tokens co-presented with non-linguistic visual cues. The exemplar-based approach predicts an interaction between the auditory and visual cues, since both modalities are processed concurrently. However, the visual cues did not appear to have any effect in the experiment, suggesting that there was little to no co-encoding of visual and auditory information. The lack of perceptual salience and the linguistic remoteness of the visual stimuli are postulated as likely factors behind the null effect of the visual cues. Taken together, these findings motivate the proposal of an attentional mechanism that filters information, as a refinement of the current model.