Seeing sounds around you : non-linguistic visual information and speech perception

Bibliographic Details
Main Author: Ho, Danyuan
Other Authors: James Sneed German
Format: Theses and Dissertations
Language: English
Published: 2016
Subjects:
Online Access: http://hdl.handle.net/10356/67032
Institution: Nanyang Technological University
Description
Summary: The role of visual information in speech perception has not been adequately addressed in the current exemplar-based approach, despite evidence that speech perception is multimodal in nature. This study investigates the role of visual information in speech perception by examining the co-encoding of visual and auditory information. Using a lexical decision paradigm, participants were repeatedly exposed to audio-visual targets consisting of sound tokens co-presented with non-linguistic visual cues. The exemplar-based approach predicts an interaction between the auditory and visual cues, owing to the concurrent processing of both modalities. However, the visual cues showed no detectable effect in the experiment, suggesting little to no co-encoding of visual and auditory information. The low perceptual salience and linguistic remoteness of the visual stimuli are postulated as likely factors behind this null effect. Taken together, the findings motivate an attentional mechanism that filters information, proposed as a refinement of the current model.