%A Bayard, Clémence
%A Colin, Cécile
%A Leybaert, Jacqueline
%D 2014
%J Frontiers in Psychology
%G English
%K Multimodal speech perception, Cued Speech, cochlear implant, Deafness, audio-visual speech integration
%R 10.3389/fpsyg.2014.00416
%8 2014-May-19
%9 Original Research
%+ Miss Clémence Bayard, clemence.bayard@ulb.ac.be
%! How is the McGurk effect modulated by Cued Speech?
%T How is the McGurk effect modulated by Cued Speech in deaf and hearing adults?
%U https://www.frontiersin.org/articles/10.3389/fpsyg.2014.00416
%V 5
%0 JOURNAL ARTICLE
%@ 1664-1078
%X Speech perception for both hearing and deaf people involves an integrative process between auditory and lip-reading information. In order to disambiguate information from the lips, manual cues from Cued Speech may be added. Cued Speech (CS) is a system of manual aids developed to help deaf people understand speech clearly and completely through vision alone (Cornett, 1967). Within this system, both labial and manual information, as lone input sources, remain ambiguous. Perceivers therefore have to combine both types of information in order to form one coherent percept. In this study, we examined how audio-visual (AV) integration is affected by the presence of manual cues and on which form of information (auditory, labial, or manual) CS receivers primarily rely. To address this issue, we designed a unique experiment using AV McGurk stimuli (audio /pa/ and lip-reading /ka/) that were produced with or without manual cues. The manual cue was congruent with either the auditory information, the lip information, or the expected fusion. Participants were asked to repeat the perceived syllable aloud. Their responses were then classified into four categories: audio (when the response was /pa/), lip-reading (when the response was /ka/), fusion (when the response was /ta/), and other (when the response was something other than /pa/, /ka/, or /ta/).
Data were collected from hearing-impaired individuals who were experts in CS (all of whom had either cochlear implants or binaural hearing aids; N = 8), hearing individuals who were experts in CS (N = 14), and hearing individuals who were completely naïve to CS (N = 15). Results confirmed that, like hearing people, deaf people can merge auditory and lip-reading information into a single unified percept. Without manual cues, McGurk stimuli induced the same percentage of fusion responses in both groups. Results also suggest that manual cues can modify AV integration and that their impact differs between hearing and deaf people.