Please use this identifier to cite or link to this item: http://hdl.handle.net/10356/67032
Title: Seeing sounds around you: non-linguistic visual information and speech perception
Authors: Ho, Danyuan
Keywords: DRNTU::Humanities::Linguistics::Phonology
Issue Date: 2016
Abstract: The role of visual information in speech perception has not been adequately addressed in the current exemplar-based approach, despite evidence showing that speech perception is multimodal in nature. This study investigates the role of visual information in speech perception by examining the co-encoding of visual and auditory information. Using a lexical decision paradigm, participants were repeatedly exposed to audio-visual targets consisting of sound tokens co-presented with non-linguistic visual cues. The exemplar-based approach predicts an interaction between the auditory and visual cues due to the concurrent processing of both modalities. However, the visual cues did not appear to have any effect in the experiment, suggesting that there was little to no co-encoding of visual and auditory information. The lack of perceptual salience and the linguistic remoteness of the visual stimuli are postulated as likely factors behind the null effect of the visual cues. Taken together, these findings motivate an attentional mechanism that filters information as a refinement to the current model.
URI: http://hdl.handle.net/10356/67032
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: HSS Theses
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.