
      Frames of reference in small-scale spatial tasks in wild bumblebees

      research-article
      Scientific Reports
      Nature Publishing Group UK
      Animal behaviour, Experimental evolution


          Abstract

          Spatial cognitive abilities are fundamental to foraging animal species. In particular, being able to encode the location of an object in relation to another object (i.e., spatial relationships) is critical for successful foraging. Whether egocentric (i.e., viewer-dependent) or allocentric (i.e., dependent on the external environment or cues) representations underlie these behaviours is still a highly debated question in vertebrates and invertebrates. Previous research shows that bees encode spatial information largely using egocentric information. However, no research has investigated this question in the context of relational similarity. To test this, a spatial matching task previously used with humans and great apes was adapted for use with wild-caught bumblebees. In a series of experiments, bees first experienced a rewarded object and then had to spontaneously find (Experiment 1), or learn to find (Experiments 2 and 3), a second one based on the location of the first. The results showed that bumblebees predominantly exhibited an allocentric strategy in all three experiments. These findings suggest that egocentric representations alone might not be evolutionarily ancestral and clearly indicate similarities between vertebrates and invertebrates when encoding spatial information.

          Most cited references (40)


          Generalized linear mixed models: a practical guide for ecology and evolution.

          How should ecologists and evolutionary biologists analyze nonnormal data that involve random effects? Nonnormal data such as counts or proportions often defy classical statistical procedures. Generalized linear mixed models (GLMMs) provide a more flexible approach for analyzing nonnormal data when random effects are present. The explosion of research on GLMMs in the last decade has generated considerable uncertainty for practitioners in ecology and evolution. Despite the availability of accurate techniques for estimating GLMM parameters in simple cases, complex GLMMs are challenging to fit and statistical inference such as hypothesis testing remains difficult. We review the use (and misuse) of GLMMs in ecology and evolution, discuss estimation and inference and summarize 'best-practice' data analysis procedures for scientists facing this challenge.

            Spatial memory: how egocentric and allocentric combine.

            Recent experiments indicate the need for revision of a model of spatial memory consisting of viewpoint-specific representations, egocentric spatial updating and a geometric module for reorientation. Instead, it appears that both egocentric and allocentric representations exist in parallel, and combine to support behavior according to the task. Current research indicates complementary roles for these representations, with increasing dependence on allocentric representations with the amount of movement between presentation and retrieval, the number of objects remembered, and the size, familiarity and intrinsic structure of the environment. Identifying the neuronal mechanisms and functional roles of each type of representation, and of their interactions, promises to provide a framework for investigation of the organization of human memory more generally.

              Spatial cognition and the brain.

              Recent advances in the understanding of spatial cognition are reviewed, focusing on memory for locations in large-scale space and on those advances inspired by single-unit recording and lesion studies in animals. Spatial memory appears to be supported by multiple parallel representations, including egocentric and allocentric representations, and those updated to accommodate self-motion. The effects of these representations can be dissociated behaviorally, developmentally, and in terms of their neural bases. It is now becoming possible to construct a mechanistic neural-level model of at least some aspects of spatial memory and imagery, with the hippocampus and medial temporal lobe providing allocentric environmental representations, the parietal lobe egocentric representations, and the retrosplenial cortex and parieto-occipital sulcus allowing both types of representation to interact. Insights from this model include a common mechanism for the construction of spatial scenes in the service of both imagery and episodic retrieval and a role for the remainder of Papez's circuit in orienting the viewpoint used. In addition, it appears that hippocampal and striatal systems process different aspects of environmental layout (boundaries and local landmarks, respectively) and do so using different learning rules (incidental learning and associative reinforcement, respectively).

                Author and article information

                Contributors
                gema.martin-ordas@stir.ac.uk, martingema@uniovi.es

                Journal
                Sci Rep (Scientific Reports)
                Nature Publishing Group UK (London)
                ISSN: 2045-2322
                Published: 15 December 2022
                Volume: 12
                Article number: 21683

                Affiliations
                [1] Department of Psychology, University of Oviedo, Oviedo, Spain
                [2] Division of Psychology, University of Stirling, Stirling, UK

                Article
                DOI: 10.1038/s41598-022-26282-z
                PMCID: 9755249
                PMID: 36522430
                © The Author(s) 2022

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 25 October 2022
                Accepted: 13 December 2022
                Funding
                Funded by: Maria Zambrano (Award ID: MU-21-UP2021-030 11081303)
                Funded by: The Royal Society (Award ID: RGS\R2\222260)
                Categories
                Article
                Keywords
                animal behaviour, experimental evolution
