      • Record: found
      • Abstract: found
      • Article: found
      Is Open Access

      Early maternal mirroring predicts infant motor system activation during facial expression observation

      research-article


          Abstract

          Processing facial expressions is an essential component of social interaction, especially for preverbal infants. In human adults and monkeys, this process involves the motor system, with a neural matching mechanism believed to couple self- and other-generated facial gestures. Here, we used electroencephalography to demonstrate recruitment of the human motor system during observation and execution of facial expressions in nine-month-old infants, implicating this system in facial expression processing from a very young age. Notably, examination of early video-recorded mother-infant interactions supported the common, but as yet untested, hypothesis that maternal mirroring of infant facial gestures is central to the development of a neural matching mechanism for these gestures. Specifically, the extent to which mothers mirrored infant facial expressions at two months postpartum predicted infant motor system activity during observation of the same expressions at nine months. This suggests that maternal mirroring strengthens mappings between visual and motor representations of facial gestures, which increases infant neural sensitivity to particularly relevant cues in the early social environment.
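As an illustrative aside only (this is not the authors' analysis pipeline; the sampling rate, band edges, and synthetic signal below are all assumptions), motor-system engagement in infant EEG studies is commonly indexed by power in a sensorimotor "mu"-style band, which can be estimated from a single epoch with Welch's method:

```python
# Illustrative sketch, NOT the authors' pipeline: estimating power in an
# assumed infant mu-style band (roughly 6-9 Hz) from one synthetic EEG epoch.
import numpy as np
from scipy.signal import welch

fs = 250                        # Hz, assumed sampling rate
t = np.arange(0, 2, 1 / fs)     # one 2-second epoch
rng = np.random.default_rng(0)
# Synthetic "EEG": a 7.5 Hz oscillation buried in noise
epoch = np.sin(2 * np.pi * 7.5 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(epoch, fs=fs, nperseg=fs)   # ~1 Hz frequency resolution
mu_mask = (freqs >= 6) & (freqs <= 9)
mu_power = psd[mu_mask].mean()
print(f"mean mu-band power: {mu_power:.3f}")
```

In a real analysis this estimate would be compared across conditions (e.g. observation versus baseline), since it is the relative change in band power, not its absolute value, that indexes motor system activation.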


          Most cited references: 63


          EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis

              We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
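The EEGLAB abstract mentions bootstrap statistics based on data resampling. As a language-agnostic illustration (EEGLAB itself is MATLAB; the Python sketch and synthetic trial values below are assumptions, not EEGLAB code), a percentile bootstrap confidence interval for a mean across trials can be computed like this:

```python
# Minimal sketch of the bootstrap resampling idea the EEGLAB abstract refers
# to, written in Python rather than MATLAB. The per-trial amplitudes here are
# synthetic, purely for illustration.
import numpy as np

rng = np.random.default_rng(42)
trial_amplitudes = rng.normal(loc=2.0, scale=1.0, size=40)  # fake trial values

n_boot = 2000
boot_means = np.array([
    # Resample trials WITH replacement and recompute the mean each time
    rng.choice(trial_amplitudes, size=trial_amplitudes.size, replace=True).mean()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])   # percentile 95% CI
print(f"mean = {trial_amplitudes.mean():.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

The appeal of resampling here is that it makes no distributional assumption about the trial values, which suits noisy single-trial EEG measures.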

            Recognizing emotion from facial expressions: psychological and neurological mechanisms.

            Recognizing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures. Studies using evoked potentials, lesions, and functional imaging have begun to elucidate some of the mechanisms. Early perceptual processing of faces draws on cortices in occipital and temporal lobes that construct detailed representations from the configuration of facial features. Subsequent recognition requires a set of structures, including amygdala and orbitofrontal cortex, that links perceptual representations of the face to the generation of knowledge about the emotion signaled, a complex set of mechanisms using multiple strategies. Although recent studies have provided a wealth of detail regarding these mechanisms in the adult human brain, investigations are also being extended to nonhuman primates, to infants, and to patients with psychiatric disorders.

              Generating Stimuli for Neuroscience Using PsychoPy

              PsychoPy is a software library written in Python, using OpenGL to generate very precise visual stimuli on standard personal computers. It is designed to allow the construction of as wide a variety of neuroscience experiments as possible, with the least effort. By writing scripts in standard Python syntax users can generate an enormous variety of visual and auditory stimuli and can interact with a wide range of external hardware (enabling its use in fMRI, EEG, MEG etc.). The structure of scripts is simple and intuitive. As a result, new experiments can be written very quickly, and trying to understand a previously written script is easy, even with minimal code comments. PsychoPy can also generate movies and image sequences to be used in demos or simulated neuroscience experiments. This paper describes the range of tools and stimuli that it provides and the environment in which experiments are conducted.

                Author and article information

                Contributors
                holly.rayson@reading.ac.uk
                Journal
                Sci Rep (Scientific Reports)
                Nature Publishing Group UK (London)
                ISSN: 2045-2322
                Published: 15 September 2017
                Volume: 7
                Article number: 11738
                Affiliations
                [1] ISNI 0000 0004 0457 9566, GRID grid.9435.b, School of Psychology and Clinical Language Sciences, University of Reading, Reading, United Kingdom
                [2] ISNI 0000000121901201, GRID grid.83440.3b, Sobell Department of Motor Neuroscience and Movement Disorders, University College London, London, United Kingdom
                [3] GRID grid.465537.6, Institut des Sciences Cognitives Marc Jeannerod, CNRS/Université Claude Bernard, Lyon, France
                [4] ISNI 0000 0001 2214 904X, GRID grid.11956.3a, Department of Psychology, Stellenbosch University, Stellenbosch, South Africa
                [5] ISNI 0000 0004 1937 1151, GRID grid.7836.a, Department of Psychology, University of Cape Town, Cape Town, South Africa
                Article
                Publisher ID: 12097
                DOI: 10.1038/s41598-017-12097-w
                PMCID: 5601467
                PMID: 28916786
                UUID: c0c67d9f-7df6-4c9a-a079-f8c4652cbaa6
                © The Author(s) 2017

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 19 May 2017
                Accepted: 4 September 2017
                Categories
                Article

                Uncategorized
