
      Feeling the Beat: Bouncing Synchronization to Vibrotactile Music in Hearing and Early Deaf People

      research-article


          Abstract

          The ability to dance relies on the ability to synchronize movements to a perceived musical beat. Typically, beat synchronization is studied with auditory stimuli. However, in many typical social dancing situations, music can also be perceived as vibrations when objects that generate sounds also generate vibrations. This vibrotactile musical perception is of particular relevance for deaf people, who rely on non-auditory sensory information for dancing. In the present study, we investigated beat synchronization to vibrotactile electronic dance music in hearing and deaf people. We tested seven deaf and 14 hearing individuals on their ability to bounce in time with the tempo of vibrotactile stimuli (no sound) delivered through a vibrating platform. The corresponding auditory stimuli (no vibrations) were used in an additional condition in the hearing group. We collected movement data using a camera-based motion capture system and subjected it to a phase-locking analysis to assess synchronization quality. The vast majority of participants were able to precisely time their bounces to the vibrations, with no difference in performance between the two groups. In addition, we found higher performance for the auditory condition compared to the vibrotactile condition in the hearing group. Our results thus show that accurate tactile-motor synchronization in a dance-like context occurs regardless of auditory experience, though auditory-motor synchronization is of superior quality.
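
For readers unfamiliar with phase-locking analysis, the sketch below shows one common way to quantify synchronization quality from movement onsets using circular statistics. It is not the authors' actual pipeline: the simulated bounce times, the 120 BPM tempo, and the function name phase_locking are illustrative assumptions.

```python
import numpy as np

def phase_locking(bounce_times, beat_period):
    """Mean resultant vector length (0-1) of bounce phases relative to the beat.

    Illustrative helper, not the authors' pipeline:
    bounce_times -- bounce onset times in seconds
    beat_period  -- inter-beat interval of the stimulus in seconds
    """
    # Map each onset to a phase angle within the beat cycle (0 to 2*pi).
    phases = 2 * np.pi * (np.asarray(bounce_times) % beat_period) / beat_period
    # Length of the mean unit vector: 1 = perfect phase locking, 0 = none.
    return np.abs(np.mean(np.exp(1j * phases)))

# Hypothetical data: a 120 BPM stimulus (0.5 s beat period) and bounce onsets
# jittered around each beat by roughly 30 ms.
rng = np.random.default_rng(0)
beats = np.arange(0, 30, 0.5)
bounces = beats + rng.normal(0.0, 0.03, beats.size)
print(round(phase_locking(bounces, 0.5), 3))  # close to 1 -> tight synchronization
```

A vector length near 1 means bounces consistently land at the same phase of the beat cycle, while values near 0 indicate no systematic relationship; statistical significance of such phase concentration is typically assessed with a circular test such as the Rayleigh test.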


Most cited references: 28

          Sensorimotor synchronization: a review of recent research (2006-2012).

          Sensorimotor synchronization (SMS) is the coordination of rhythmic movement with an external rhythm, ranging from finger tapping in time with a metronome to musical ensemble performance. An earlier review (Repp, 2005) covered tapping studies; two additional reviews (Repp, 2006a, b) focused on music performance and on rate limits of SMS, respectively. The present article supplements and extends these earlier reviews by surveying more recent research in what appears to be a burgeoning field. The article comprises four parts, dealing with (1) conventional tapping studies, (2) other forms of moving in synchrony with external rhythms (including dance and nonhuman animals' synchronization abilities), (3) interpersonal synchronization (including musical ensemble performance), and (4) the neuroscience of SMS. It is evident that much new knowledge about SMS has been acquired in the last 7 years.

            Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf.

            When the brain is deprived of input from one sensory modality, it often compensates with supranormal performance in one or more of the intact sensory systems. In the absence of acoustic input, it has been proposed that cross-modal reorganization of deaf auditory cortex may provide the neural substrate mediating compensatory visual function. We tested this hypothesis using a battery of visual psychophysical tasks and found that congenitally deaf cats, compared with hearing cats, have superior localization in the peripheral field and lower visual movement detection thresholds. In the deaf cats, reversible deactivation of posterior auditory cortex selectively eliminated superior visual localization abilities, whereas deactivation of the dorsal auditory cortex eliminated superior visual motion detection. Our results indicate that enhanced visual performance in the deaf is caused by cross-modal reorganization of deaf auditory cortex and it is possible to localize individual visual functions in discrete portions of reorganized auditory cortex.

              Auditory dominance in temporal processing: new evidence from synchronization with simultaneous visual and auditory sequences.

              Evidence that audition dominates vision in temporal processing has come from perceptual judgment tasks. This study shows that this auditory dominance extends to the largely subconscious processes involved in sensorimotor coordination. Participants tapped their finger in synchrony with auditory and visual sequences containing an event onset shift (EOS), expected to elicit an involuntary phase correction response (PCR), and also tried to detect the EOS. Sequences were presented in unimodal and bimodal conditions, including one in which auditory and visual EOSs of opposite sign coincided. Unimodal results showed greater variability of taps, smaller PCRs, and poorer EOS detection in vision than in audition. In bimodal conditions, variability of taps was similar to that for unimodal auditory sequences, and PCRs depended more on auditory than on visual information, even though attention was always focused on the visual sequences.

                Author and article information

Journal
Frontiers in Neuroscience (Front. Neurosci.)
Frontiers Media S.A.
ISSN: 1662-4548 (print), 1662-453X (electronic)
12 September 2017
Volume 11, Article 507
Affiliations
[1] Faculty of Psychology, University of Montreal, Montreal, QC, Canada
[2] International Laboratory for Brain, Music, and Sound, Montreal, QC, Canada
[3] Centre for Interdisciplinary Research on Music, Media, and Technology, Montreal, QC, Canada
[4] Centre for Research on Brain, Language, and Music, Montreal, QC, Canada
[5] Montreal Neurological Institute, McGill University, Montreal, QC, Canada
[6] Input Devices and Music Interaction Lab, McGill University, Montreal, QC, Canada
                Author notes

                Edited by: Daniela Sammler, Max Planck Institute for Human Cognitive and Brain Sciences (MPG), Germany

                Reviewed by: Manuel Varlet, Western Sydney University, Australia; Jessica Phillips-Silver, Georgetown University Medical Center, United States

*Correspondence: Pauline Tranchant, pauline.tranchant@umontreal.ca

                This article was submitted to Auditory Cognitive Neuroscience, a section of the journal Frontiers in Neuroscience

                †Present Address: Martha M. Shiell, Department of Cognitive Neuroscience, Maastricht University, Maastricht, Netherlands

                ‡These authors have contributed equally to this work.

Article
DOI: 10.3389/fnins.2017.00507
PMCID: PMC5601036
PMID: 28955193
                Copyright © 2017 Tranchant, Shiell, Giordano, Nadeau, Peretz and Zatorre.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 19 May 2017
Accepted: 28 August 2017
                Page count
                Figures: 2, Tables: 0, Equations: 0, References: 34, Pages: 8, Words: 6800
                Funding
Funded by: Centre for Interdisciplinary Research in Music Media and Technology (DOI: 10.13039/100009222)
Funded by: Canadian Institutes of Health Research (DOI: 10.13039/501100000024)
                Categories
                Neuroscience
                Original Research

Neurosciences
Keywords: dancing, beat synchronization, vibrotactile, deafness, sensorimotor integration
