
      A standardized citation metrics author database annotated for scientific field


          Abstract

          Citation metrics are widely used and misused. We have created a publicly available database of 100,000 top scientists that provides standardized information on citations, h-index, coauthorship-adjusted hm-index, citations to papers in different authorship positions, and a composite indicator. Separate data are shown for career-long and single-year impact. Metrics with and without self-citations and ratio of citations to citing papers are given. Scientists are classified into 22 scientific fields and 176 subfields. Field- and subfield-specific percentiles are also provided for all scientists who have published at least five papers. Career-long data are updated to end of 2017 and to end of 2018 for comparison.


          Citation metrics are widely used and misused. This Community Page article presents a publicly available database that provides standardized information on multiple citation indicators and a composite thereof, annotating each author according to his/her main scientific field(s).


Most cited references (5)


          Assessing scientists for hiring, promotion, and tenure

          Assessment of researchers is necessary for decisions of hiring, promotion, and tenure. A burgeoning number of scientific leaders believe the current system of faculty incentives and rewards is misaligned with the needs of society and disconnected from the evidence about the causes of the reproducibility crisis and suboptimal quality of the scientific publication record. To address this issue, particularly for the clinical and life sciences, we convened a 22-member expert panel workshop in Washington, DC, in January 2017. Twenty-two academic leaders, funders, and scientists participated in the meeting. As background for the meeting, we completed a selective literature review of 22 key documents critiquing the current incentive system. From each document, we extracted how the authors perceived the problems of assessing science and scientists, the unintended consequences of maintaining the status quo for assessing scientists, and details of their proposed solutions. The resulting table was used as a seed for participant discussion. This resulted in six principles for assessing scientists and associated research and policy implications. We hope the content of this paper will serve as a basis for establishing best practices and redesigning the current approaches to assessing scientists by the many players involved in that process.

            Meta-research: Evaluation and Improvement of Research Methods and Practices

As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work is already done in this growing field, but efforts to date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide.

              The inconsistency of the h-index


                Author and article information

Journal
PLoS Biology (PLoS Biol)
Public Library of Science (San Francisco, CA, USA)
ISSN: 1544-9173 (print); 1545-7885 (electronic)
Published: 12 August 2019 (issue: August 2019)
Volume 17, Issue 8, e3000384
                Affiliations
[1] Departments of Medicine, Health Research and Policy, Biomedical Data Science, and Statistics, and Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
[2] Research Intelligence, Elsevier B.V., Amsterdam, the Netherlands
[3] SciTech Strategies, Inc., Wayne, Pennsylvania, United States of America
[4] SciTech Strategies, Inc., Albuquerque, New Mexico, United States of America
                Author notes

                The authors have declared that no competing interests exist. JPAI is a member of the editorial board of PLoS Biology. Jeroen Baas is an Elsevier employee. Elsevier runs Scopus, which is the source of this data, and also runs Mendeley Data where the database is now stored.

                Author information
                http://orcid.org/0000-0003-3118-6859
                http://orcid.org/0000-0001-8005-4153
                http://orcid.org/0000-0001-7814-8951
Article
Manuscript ID: PBIOLOGY-D-19-01244
DOI: 10.1371/journal.pbio.3000384
PMCID: PMC6699798
PMID: 31404057
© 2019 Ioannidis et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                Page count
                Figures: 0, Tables: 1, Pages: 6
                Funding
                The Meta-Research Innovation Center at Stanford (METRICS) has been funded by the Laura and John Arnold Foundation (funding to JPAI). The work of JPAI is also funded by an unrestricted gift from Sue and Bob O’Donnell. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Categories
Community Page
Science Policy > Science and Technology Workforce > Careers in Research > Scientists
People and Places > Population Groupings > Professions > Scientists
Research and Analysis Methods > Research Assessment > Citation Analysis
Research and Analysis Methods > Research Assessment > Bibliometrics
Research and Analysis Methods > Scientific Publishing
Medicine and Health Sciences > Clinical Medicine
Biology and Life Sciences > Immunology
Medicine and Health Sciences > Immunology
Engineering and Technology > Telecommunications
Social Sciences > Sociology > Communications
Custom metadata
vor-update-to-uncorrected-proof: 2019-08-19

Life sciences
