
The debate over open access is missing the point

Learned Publishing (Wiley)


Most cited references (5)


Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015

Being able to replicate scientific findings is crucial for scientific progress [1-15]. We replicate 21 systematically selected experimental studies in the social sciences published in Nature and Science between 2010 and 2015 [16-36]. The replications follow analysis plans reviewed by the original authors and pre-registered prior to the replications. The replications are high-powered, with sample sizes on average about five times higher than in the original studies. We find a significant effect in the same direction as the original study for 13 (62%) studies, and the effect size of the replications is on average about 50% of the original effect size. Replicability varies between 12 (57%) and 14 (67%) studies for complementary replicability indicators. Consistent with these results, the estimated true-positive rate is 67% in a Bayesian analysis. The relative effect size of true positives is estimated to be 71%, suggesting that both false positives and inflated effect sizes of true positives contribute to imperfect reproducibility. Furthermore, we find that peer beliefs of replicability are strongly related to replicability, suggesting that the research community could predict which results would replicate and that failures to replicate were not the result of chance alone.

Open access: The true cost of science publishing


What errors do peer reviewers detect, and does training improve their ability to detect them?

Objective: To analyse data from a trial and report the frequencies with which major and minor errors are detected at a general medical journal, the types of errors missed, and the impact of training on error detection.
Design: 607 peer reviewers at the BMJ were randomized to two intervention groups receiving different types of training (face-to-face training or a self-taught package) and a control group. Each reviewer was sent the same three test papers over the study period, each of which had nine major and five minor methodological errors inserted.
Participants: BMJ peer reviewers.
Main outcome measures: The quality of review, assessed using a validated instrument, and the number and type of errors detected before and after training.
Results: The number of major errors detected varied over the three papers, and the interventions had small effects. At baseline (Paper 1), reviewers found an average of 2.58 of the nine major errors, with no notable difference between the groups. The mean number of errors reported was similar for the second and third papers: 2.71 and 3.0, respectively. Biased randomization was the error detected most frequently in all three papers; over 60% of the reviewers who rejected the papers identified this error. Reviewers who did not reject the papers found fewer errors, and the proportion finding biased randomization was less than 40% for each paper.
Conclusions: Editors should not assume that reviewers will detect most major errors, particularly those concerned with the context of study. Short training packages have only a slight impact on improving error detection.

Author and article information

Journal: Learned Publishing (Wiley)
ISSN (print): 0953-1513
ISSN (electronic): 1741-4857
Publication history: November 23 2018; January 03 2019
Issue date: April 2019
Volume: 32
Issue: 2
Pages: 188-190

Affiliations:
[1] Vice President of Research, Laura and John Arnold Foundation, Houston, TX, USA

DOI: 10.1002/leap.1223
© 2019

License: http://onlinelibrary.wiley.com/termsAndConditions#vor
Text and data mining license: http://doi.wiley.com/10.1002/tdm_license_1.1
