
      Negative results are disappearing from most disciplines and countries

      Scientometrics
      Springer Nature


Most cited references (47)


          Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias

Background
The increased use of meta-analysis in systematic reviews of healthcare interventions has highlighted several types of bias that can arise during the completion of a randomised controlled trial. Study publication bias has been recognised as a potential threat to the validity of meta-analysis and can make the readily available evidence unreliable for decision making. Until recently, outcome reporting bias has received less attention.

Methodology/Principal Findings
We review and summarise the evidence from a series of cohort studies that have assessed study publication bias and outcome reporting bias in randomised controlled trials. Sixteen studies were eligible, of which only two followed the cohort all the way through from protocol approval to information regarding publication of outcomes. Eleven of the studies investigated study publication bias and five investigated outcome reporting bias. Three studies found that statistically significant outcomes had higher odds of being fully reported than non-significant outcomes (range of odds ratios: 2.2 to 4.7). In comparing trial publications to protocols, we found that 40–62% of studies had at least one primary outcome that was changed, introduced, or omitted. We decided not to undertake meta-analysis because of the differences between studies.

Conclusions
Recent work provides direct empirical evidence for the existence of study publication bias and outcome reporting bias. There is strong evidence of an association between significant results and publication: studies that report positive or significant results are more likely to be published, and outcomes that are statistically significant have higher odds of being fully reported. Publications have been found to be inconsistent with their protocols. Researchers need to be aware of both types of bias, and efforts should be concentrated on improving the reporting of trials.

            The scientific impact of nations.

            David King (2004)

              The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index

The growth rate of scientific publication has been studied from 1907 to 2007 using available data from a number of literature databases, including Science Citation Index (SCI) and Social Sciences Citation Index (SSCI). Traditional scientific publishing, that is, publication in peer-reviewed journals, is still increasing, although there are big differences between fields. There are no indications that the growth rate has decreased in the last 50 years. At the same time, publication through new channels, for example conference proceedings, open archives and home pages, is growing fast.

The growth rate for SCI up to 2007 is smaller than for comparable databases, which means that SCI covered a decreasing part of the traditional scientific literature. There are also clear indications that coverage by SCI is especially low in some of the scientific areas with the highest growth rates, including computer science and the engineering sciences. The role of conference proceedings, open access archives and publications published on the net is increasing, especially in scientific fields with high growth rates, but this has only partially been reflected in the databases.

The new publication channels challenge the use of the big databases in measurements of scientific productivity or output and of the growth rate of science. Because of the declining coverage and this challenge, it is problematic that SCI has been, and continues to be, used as the dominant source for science indicators based on publication and citation numbers. The limited data available for the social sciences show that the growth rate in SSCI was remarkably low and indicate that coverage by SSCI was declining over time. National Science Indicators from Thomson Reuters is based solely on SCI, SSCI and the Arts and Humanities Citation Index (AHCI); the declining coverage of the citation databases therefore calls the use of this source into question.

Author and article information

Journal: Scientometrics
Publisher: Springer Nature
ISSN: 0138-9130 (print), 1588-2861 (electronic)
Published: March 2012 (online September 11 2011)
Volume: 90, Issue: 3, Pages: 891-904
DOI: 10.1007/s11192-011-0494-7
Record ID: 90a3f6fa-b700-49cf-89a3-7d987ae6e330
Copyright: © 2011
