
      Two Distinct Types of Eye-Head Coupling in Freely Moving Mice

      research-article


          Summary

Animals actively interact with their environment to gather sensory information. There is conflicting evidence about how mice use vision to sample their environment. During head restraint, mice make rapid eye movements coupled between the eyes, similar to conjugate saccadic eye movements in humans. However, when mice are free to move their heads, eye movements are more complex and often non-conjugate, with the eyes moving in opposite directions. We combined head and eye tracking in freely moving mice and found that both observations can be explained by two types of eye-head coupling, both associated with vestibular mechanisms. The first type comprised non-conjugate eye movements that compensated for changes in head tilt to maintain a similar visual field relative to the horizontal ground plane. The second type comprised conjugate eye movements coupled to head yaw rotation, producing a “saccade and fixate” gaze pattern: during head-initiated saccades, the eyes moved together in the direction of the head movement, whereas during subsequent fixation they moved opposite to the head to compensate for its rotation. This saccade and fixate pattern resembles that of humans, who use eye movements (with or without head movement) to rapidly shift gaze, but in mice it relies on combined head and eye movements. Both couplings were maintained during social interactions and visually guided object tracking. Even in head-restrained mice, eye movements were invariably associated with attempted head motion. Our results reveal that mice combine head and eye movements to sample their environment and highlight similarities and differences between eye movements in mice and humans.
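The summary above describes gaze in terms of two components (head orientation plus eye-in-head position) and two kinds of eye movement (conjugate versus non-conjugate). Below is a minimal, illustrative Python sketch of that decomposition applied to hypothetical head-yaw and eye-position traces. It is not the authors' analysis code; the function name, sampling rate, and saccade-velocity threshold are assumptions chosen only to make the idea concrete.

```python
import numpy as np

def decompose_horizontal_gaze(head_yaw_deg, left_eye_deg, right_eye_deg,
                              fs=60.0, saccade_threshold_deg_s=150.0):
    """Illustrative gaze decomposition (not the authors' analysis code).

    Inputs are 1-D arrays of angles in degrees, sampled at fs Hz:
    head yaw in the world, and each eye's horizontal position in the head.
    """
    head_yaw_deg = np.asarray(head_yaw_deg, float)
    left_eye_deg = np.asarray(left_eye_deg, float)
    right_eye_deg = np.asarray(right_eye_deg, float)

    # Horizontal gaze = head orientation + eye-in-head position (per eye).
    gaze_left = head_yaw_deg + left_eye_deg
    gaze_right = head_yaw_deg + right_eye_deg

    # Eye and head angular velocities (deg/s).
    v_left = np.gradient(left_eye_deg) * fs
    v_right = np.gradient(right_eye_deg) * fs
    v_head = np.gradient(head_yaw_deg) * fs

    # Conjugate: both eyes move in the same direction.
    # Non-conjugate (e.g. tilt compensation): eyes move in opposite directions.
    conjugate = np.sign(v_left) == np.sign(v_right)

    # "Saccade and fixate": fast conjugate eye movements in the direction of
    # head rotation, separated by slower counter-rotation that stabilizes gaze.
    saccadic = (conjugate
                & (np.abs(v_left) > saccade_threshold_deg_s)
                & (np.sign(v_left) == np.sign(v_head)))

    return gaze_left, gaze_right, conjugate, saccadic
```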

          Highlights

          • Head and eye tracking in freely moving mice reveals two types of eye-head coupling

          • Eye coupling to head tilt aligns gaze to the horizontal plane

          • Eye coupling to head yaw rotation produces a “saccade and fixate” gaze pattern

          • Eye movements in head-restrained mice are related to attempted head rotation

          Abstract

Meyer et al. track the head and eyes in freely moving mice and find two distinct types of eye-head coupling. Eye coupling to head tilt aligns gaze to the horizontal plane, while eye coupling to head yaw rotation produces a “saccade and fixate” gaze pattern. Even in head-restrained mice, eye movements are linked to attempted head rotation.


Most cited references (72)


          DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

          Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
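As a concrete illustration of the workflow this abstract describes (label a few hundred frames, fine-tune a pretrained network by transfer learning, then track body parts in new videos), here is a minimal sketch using DeepLabCut's documented top-level functions. The project name, experimenter, and video path are placeholders, and exact arguments can differ between DeepLabCut versions.

```python
# Sketch of the standard DeepLabCut workflow; names and paths below are
# placeholders, not taken from the study.
import deeplabcut

videos = ["/data/mouse_session1.mp4"]  # hypothetical example video

# 1. Create a project and get the path to its config.yaml.
config_path = deeplabcut.create_new_project("eye-head-tracking", "experimenter", videos)

# 2. Extract and manually label a small set of frames (~200 is often enough).
deeplabcut.extract_frames(config_path)
deeplabcut.label_frames(config_path)  # opens the labeling GUI

# 3. Build the training set and fine-tune a pretrained network (transfer learning).
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)

# 4. Run the trained network on new videos to get markerless pose estimates.
deeplabcut.analyze_videos(config_path, videos)
deeplabcut.create_labeled_video(config_path, videos)
```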

            Python for Scientific Computing


              Rapid innate defensive responses of mice to looming visual stimuli.

              Much of brain science is concerned with understanding the neural circuits that underlie specific behaviors. While the mouse has become a favorite experimental subject, the behaviors of this species are still poorly explored. For example, the mouse retina, like that of other mammals, contains ∼20 different circuits that compute distinct features of the visual scene [1, 2]. By comparison, only a handful of innate visual behaviors are known in this species--the pupil reflex [3], phototaxis [4], the optomotor response [5], and the cliff response [6]--two of which are simple reflexes that require little visual processing. We explored the behavior of mice under a visual display that simulates an approaching object, which causes defensive reactions in some other species [7, 8]. We show that mice respond to this stimulus either by initiating escape within a second or by freezing for an extended period. The probability of these defensive behaviors is strongly dependent on the parameters of the visual stimulus. Directed experiments identify candidate retinal circuits underlying the behavior and lead the way into detailed study of these neural pathways. This response is a new addition to the repertoire of innate defensive behaviors in the mouse that allows the detection and avoidance of aerial predators.
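For illustration only, the sketch below draws a simple looming stimulus of the kind described here: a dark disk that expands as if an object were approaching at constant speed. It is not the stimulus code used in the study; the timing, sizes, and expansion profile are arbitrary assumptions.

```python
# Illustrative looming-stimulus sketch: a dark disk whose angular size grows
# roughly like 1 / time-to-collision, approximating an approaching object.
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
from matplotlib.patches import Circle

fig, ax = plt.subplots(figsize=(5, 5))
ax.set_facecolor("0.5")          # gray background
ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.set_aspect("equal")
ax.set_xticks([])
ax.set_yticks([])

disk = Circle((0, 0), radius=0.02, color="black")
ax.add_patch(disk)

def update(frame):
    # Shrinking time-to-collision makes the disk expand ever faster.
    t_to_collision = max(1.0 - frame / 60.0, 0.05)
    disk.set_radius(min(0.02 / t_to_collision, 1.0))
    return (disk,)

anim = FuncAnimation(fig, update, frames=60, interval=16, blit=True)
plt.show()
```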

                Author and article information

Journal
Current Biology (Curr. Biol.)
Publisher: Cell Press
ISSN: 0960-9822 (print); 1879-0445 (electronic)
Published: 08 June 2020
Volume 30, Issue 11, pages 2116-2130.e6
                Affiliations
[1] Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen 6525, the Netherlands
[2] Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London (UCL), London W1T 4JG, UK
[3] Department of Cell and Developmental Biology, UCL, London WC1E 6BT, UK
[4] Department of Psychology, University of Cambridge, Cambridge CB2 3EB, UK
                Author notes
[∗] Corresponding author: a1.meyer@donders.ru.nl
[∗∗] Corresponding author: jp816@cam.ac.uk
[5] Lead Contact

                Article
PII: S0960-9822(20)30556-X
DOI: 10.1016/j.cub.2020.04.042
PMCID: PMC7284311
PMID: 32413309
Record ID: b76c6695-b7d2-46ec-9689-53b8a8f5054c
                © 2020 The Author(s)

                This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

History
Received: 25 February 2020
Revised: 9 April 2020
Accepted: 20 April 2020
                Categories
                Article

Life sciences
Keywords: eye movement, head movement, gaze, pupil, vision, oculomotor system, vestibular system, natural behavior
