
      Electrophysiological differences and similarities in audiovisual speech processing in CI users with unilateral and bilateral hearing loss


          Abstract

          Hearing with a cochlear implant (CI) is limited compared to natural hearing. Although CI users may develop compensatory strategies, it is currently unknown whether these extend from auditory to visual functions, and whether compensatory strategies vary between different CI user groups. To better understand the experience-dependent contributions to multisensory plasticity in audiovisual speech perception, the current event-related potential (ERP) study presented syllables in auditory, visual, and audiovisual conditions to CI users with unilateral or bilateral hearing loss, as well as to normal-hearing (NH) controls. Behavioural results revealed shorter audiovisual response times compared to unisensory conditions for all groups. Multisensory integration was confirmed by electrical neuroimaging, including topographic and ERP source analysis, showing a visual modulation of the auditory-cortex response at N1 and P2 latency. However, CI users with bilateral hearing loss showed a distinct pattern of N1 topography, indicating a stronger visual impact on auditory speech processing compared to CI users with unilateral hearing loss and NH listeners. Furthermore, both CI user groups showed a delayed auditory-cortex activation and an additional recruitment of the visual cortex, and a better lip-reading ability compared to NH listeners. In sum, these results extend previous findings by showing distinct multisensory processes not only between NH listeners and CI users in general, but even between CI users with unilateral and bilateral hearing loss. However, the comparably enhanced lip-reading ability and visual-cortex activation in both CI user groups suggest that these visual improvements are evident regardless of the hearing status of the contralateral ear.
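          The abstract does not state how the audiovisual response-time advantage was tested, but a common check for such redundancy gains is Miller's race-model inequality, P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t). The following minimal Python sketch illustrates the idea on simulated response times; the numbers, sample sizes, and the test itself are illustrative assumptions, not material from this study.

# Race-model inequality check on simulated response times (ms).
# All values below are illustrative placeholders, not study data.
import numpy as np

rng = np.random.default_rng(0)
rt_a = rng.normal(480, 60, 200)    # auditory-only RTs, simulated
rt_v = rng.normal(520, 70, 200)    # visual-only RTs, simulated
rt_av = rng.normal(430, 55, 200)   # audiovisual RTs, simulated

def ecdf(samples, t):
    """Empirical cumulative distribution, P(RT <= t)."""
    return np.searchsorted(np.sort(samples), t, side="right") / samples.size

# Evaluate the inequality on a percentile grid spanning the pooled RTs.
grid = np.percentile(np.concatenate([rt_a, rt_v, rt_av]), np.arange(5, 100, 5))
violation = ecdf(rt_av, grid) - np.minimum(ecdf(rt_a, grid) + ecdf(rt_v, grid), 1.0)
print("maximum race-model violation:", violation.max())

# Positive violations indicate that audiovisual responses are faster than any
# race between independent unisensory processes could produce, i.e. evidence
# for multisensory integration rather than mere statistical facilitation.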


          Highlights

          • Altered audiovisual speech processing after unilateral and bilateral hearing loss.

          • CI users with unilateral and bilateral hearing loss differ in their N1 topography.

          • Enhanced lip-reading and stronger visual-cortex activation in different CI groups.

          • Visual improvements with CI are independent of the contralateral ear's hearing status.
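
          The N1-topography difference between the two CI groups reported above rests on electrical neuroimaging, in which scalp maps are compared after factoring out response strength. A standard measure for this is global map dissimilarity (DISS) between average-referenced maps normalized by their global field power (GFP). The sketch below illustrates the measure on simulated 64-channel maps; the data, channel count, and group labels are assumptions for illustration and do not reproduce the study's pipeline.

# Global map dissimilarity (DISS) between two scalp maps, the kind of
# strength-independent comparison used in topographic ERP analyses.
# The channel values below are simulated placeholders, not study data.
import numpy as np

def gfp(scalp_map):
    """Global field power: spatial standard deviation of the average-referenced map."""
    m = scalp_map - scalp_map.mean()
    return np.sqrt(np.mean(m ** 2))

def dissimilarity(map_a, map_b):
    """DISS: RMS difference of GFP-normalized, average-referenced maps (0 = identical, 2 = inverted)."""
    a = (map_a - map_a.mean()) / gfp(map_a)
    b = (map_b - map_b.mean()) / gfp(map_b)
    return np.sqrt(np.mean((a - b) ** 2))

rng = np.random.default_rng(1)
n1_unilateral = rng.normal(size=64)                            # hypothetical group-average N1 map
n1_bilateral = n1_unilateral + rng.normal(scale=0.5, size=64)  # hypothetical second group map
print("DISS:", dissimilarity(n1_unilateral, n1_bilateral))

# In a TANOVA-style analysis, this statistic would be compared against a null
# distribution obtained by repeatedly permuting participants between groups.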


                Author and article information

                Journal
                Current Research in Neurobiology (Curr Res Neurobiol)
                Publisher: Elsevier
                ISSN: 2665-945X
                Published: 08 November 2022
                Volume: 3
                Affiliations
                [a] University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany
                [b] Jean-Uhrmacher-Institute for Clinical ENT Research, University of Cologne, Germany
                [c] The Sense Innovation and Research Center, Lausanne and Sion, Switzerland
                [d] The LINE (The Laboratory for Investigative Neurophysiology), Department of Radiology, Lausanne University Hospital and University of Lausanne, Lausanne, Switzerland
                [e] CIBM Center for Biomedical Imaging of Lausanne and Geneva, Lausanne, Switzerland
                [f] Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA
                Author notes
                Corresponding author at: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Paediatric Audiology, Cochlear Implant Center, Kerpener Strasse 62, 50937 Cologne, Germany. natalie.layer@uk-koeln.de
                Article
                PII: S2665-945X(22)00032-8
                Article number: 100059
                DOI: 10.1016/j.crneur.2022.100059
                PMCID: PMC9672392
                PMID: 36405629
                © 2022 The Authors

                This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

                Categories
                Research Article

                Keywords
                cochlear implant, single-sided deafness, bilateral hearing loss, event-related potential, cortical plasticity, multisensory integration, audiovisual speech perception
