
      The Hong Kong Principles for assessing researchers: Fostering research integrity


          Abstract

          For knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous, and transparent at all stages of design, execution, and reporting. Assessment of researchers still rarely includes considerations related to trustworthiness, rigor, and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded for behaviors that strengthen research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle, we provide a rationale for its inclusion and provide examples where these principles are already being adopted.



          Most cited references (23)


          Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015

          Being able to replicate scientific findings is crucial for scientific progress [1–15]. We replicate 21 systematically selected experimental studies in the social sciences published in Nature and Science between 2010 and 2015 [16–36]. The replications follow analysis plans reviewed by the original authors and pre-registered prior to the replications. The replications are high powered, with sample sizes on average about five times higher than in the original studies. We find a significant effect in the same direction as the original study for 13 (62%) studies, and the effect size of the replications is on average about 50% of the original effect size. Replicability varies between 12 (57%) and 14 (67%) studies for complementary replicability indicators. Consistent with these results, the estimated true-positive rate is 67% in a Bayesian analysis. The relative effect size of true positives is estimated to be 71%, suggesting that both false positives and inflated effect sizes of true positives contribute to imperfect reproducibility. Furthermore, we find that peer beliefs of replicability are strongly related to replicability, suggesting that the research community could predict which results would replicate and that failures to replicate were not the result of chance alone.

            Consolidated standards of reporting trials (CONSORT) and the completeness of reporting of randomised controlled trials (RCTs) published in medical journals.

            An overwhelming body of evidence indicating that the completeness of reporting of randomised controlled trials (RCTs) is not optimal has accrued over time. In the mid-1990s, in response to these concerns, an international group of clinical trialists, statisticians, epidemiologists, and biomedical journal editors developed the CONsolidated Standards Of Reporting Trials (CONSORT) Statement. The CONSORT Statement, most recently updated in March 2010, is an evidence-based minimum set of recommendations, including a checklist and flow diagram, for reporting RCTs and is intended to facilitate the complete and transparent reporting of trials and aid their critical appraisal and interpretation. In 2006, a systematic review of eight studies evaluating the "effectiveness of CONSORT in improving reporting quality in journals" was published. This review updates that earlier systematic review, assessing whether journal endorsement of the 1996 and 2001 CONSORT checklists influences the completeness of reporting of RCTs published in medical journals.

            We conducted electronic searches, known item searching, and reference list scans to identify reports of evaluations assessing the completeness of reporting of RCTs. The electronic search strategy was developed in MEDLINE and tailored to EMBASE. We searched the Cochrane Methodology Register and the Cochrane Database of Systematic Reviews using the Wiley interface, and the Science Citation Index, Social Science Citation Index, and Arts and Humanities Citation Index through the ISI Web of Knowledge interface. We conducted all searches to identify reports published between January 2005 and March 2010, inclusive.

            In addition to studies identified in the original systematic review on this topic, comparative studies evaluating the completeness of reporting of RCTs in any of the following comparison groups were eligible for inclusion in this review: 1) completeness of reporting of RCTs published in journals that have and have not endorsed the CONSORT Statement; 2) completeness of reporting of RCTs published in CONSORT-endorsing journals before and after endorsement; or 3) completeness of reporting of RCTs before and after the publication of the CONSORT Statement (1996 or 2001). We used a broad definition of CONSORT endorsement that includes any of the following: (a) a requirement or recommendation in the journal's 'Instructions to Authors' to follow CONSORT guidelines; (b) a journal editorial statement endorsing the CONSORT Statement; or (c) an editorial requirement for authors to submit a CONSORT checklist and/or flow diagram with their manuscript. We contacted authors of evaluations reporting data that could be included in any comparison group(s) but not presented as such in the published report, and asked them to provide additional data in order to determine the eligibility of their evaluation. Evaluations were not excluded due to language of publication or validity assessment.

            We completed screening and data extraction using standardised electronic forms, where conflicts, reasons for exclusion, and level of agreement were all automatically and centrally managed in web-based management software, DistillerSR®. One of two authors extracted general characteristics of included evaluations, and all data were verified by a second author. Data describing completeness of reporting were extracted by one author using a pre-specified form; a 10% random sample of evaluations was verified by a second author. Any discrepancies were discussed by both authors; we made no modifications to the extracted data. Validity assessments of included evaluations were conducted by one author and independently verified by one of three authors. We resolved all conflicts by consensus. For each comparison we collected data on 27 outcomes: 22 items of the CONSORT 2001 checklist, plus four items relating to the reporting of blinding and one item of aggregate CONSORT scores. Where reported, we extracted and qualitatively synthesised data on the methodological quality of RCTs, by scale or score.

            Fifty-three publications reporting 50 evaluations were included. The total number of RCTs assessed within evaluations was 16,604 (median per evaluation 123 (interquartile range (IQR) 77 to 226)), published in a median of six (IQR 3 to 26) journals. Characteristics of the included RCT populations were variable, resulting in heterogeneity between included evaluations. Validity assessments of included studies resulted in largely unclear judgements. The included evaluations are not RCTs, and less than 8% (4/53) of the evaluations reported adjusting for potential confounding factors. Twenty-five of 27 outcomes assessing completeness of reporting in RCTs appeared to favour CONSORT-endorsing journals over non-endorsers, of which five were statistically significant. 'Allocation concealment' resulted in the largest effect, with risk ratio (RR) 1.81 (99% confidence interval (CI) 1.25 to 2.61), suggesting that 81% more RCTs published in CONSORT-endorsing journals adequately describe allocation concealment compared to those published in non-endorsing journals. Allocation concealment was reported adequately in 45% (393/876) of RCTs in CONSORT-endorsing journals and in 22% (329/1520) of RCTs in non-endorsing journals. Other outcomes with significant results include: scientific rationale and background in the 'Introduction' (RR 1.07, 99% CI 1.01 to 1.14); 'sample size' (RR 1.61, 99% CI 1.13 to 2.29); method used for 'sequence generation' (RR 1.59, 99% CI 1.38 to 1.84); and an aggregate score over reported CONSORT items, 'total sum score' (standardised mean difference (SMD) 0.68, 99% CI 0.38 to 0.98).

            Evidence has accumulated to suggest that the reporting of RCTs remains sub-optimal. The findings of this review are similar to those of the original review and demonstrate that, despite the general inadequacies of reporting of RCTs, journal endorsement of the CONSORT Statement may beneficially influence the completeness of reporting of trials published in medical journals. Future prospective studies are needed to explore the influence of the CONSORT Statement dependent on the extent of editorial policies to ensure adherence to CONSORT guidance.

              Open science challenges, benefits and tips in early career and beyond

              The movement towards open science is a consequence of seemingly pervasive failures to replicate previous research. This transition comes with great benefits but also significant challenges that are likely to affect those who carry out the research, usually early career researchers (ECRs). Here, we describe key benefits, including reputational gains, increased chances of publication, and a broader increase in the reliability of research. The increased chances of publication are supported by exploratory analyses indicating that null findings are substantially more likely to be published via open registered reports than via more conventional methods. These benefits are balanced by challenges that we have encountered, involving increased costs in terms of flexibility and time, and issues with the current incentive structure, all of which seem to affect ECRs acutely. Although there are major obstacles to the early adoption of open science, open science practices should both benefit ECRs and improve the quality of research. We review three benefits and three challenges and provide suggestions, from the perspective of ECRs, for moving towards open science practices, which we believe scientists and institutions at all levels would do well to consider.

                Author and article information

                Journal
                PLoS Biology
                Public Library of Science (San Francisco, CA, USA)
                ISSN: 1544-9173 (print); 1545-7885 (electronic)
                Published: 16 July 2020
                Volume: 18
                Issue: 7
                Article number: e3000737
                Affiliations
                [1] Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
                [2] School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada
                [3] Department of Epidemiology and Biostatistics, Amsterdam University Medical Centers, location VUmc, Amsterdam, the Netherlands
                [4] Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, the Netherlands
                [5] The Lancet, London Wall Office, London, United Kingdom
                [6] Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Queensland, Australia
                [7] School of Biomedical Sciences, LKS Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
                [8] Queensland University of Technology (QUT), Brisbane, Australia
                [9] Wellcome Trust, London, United Kingdom
                [10] Austrian Agency for Research Integrity, Vienna, Austria
                [11] Berlin Institute of Health, QUEST Center for Transforming Biomedical Research, Berlin, Germany
                Author notes

                I have read the journal's policy and the authors of this manuscript have the following competing interests: A-MC works for Wellcome. Through this, the organization and she are engaged in a lot of advocacy work to promote a more positive research culture. The guidance asks about advocacy work so I include this for completeness. VB was involved in the creation of a Research Integrity course at QUT. QUT licenses this course to other institutions and provides a proportion of any income to the creators. VB is employed by QUT and the Australasian Open Access Strategy Group. She sits on and is paid for work on the NHMRC’s Research Quality Steering Committee. She is an unpaid advisor to a variety of open access and scholarly communication initiatives, including DORA.

                Author information
                http://orcid.org/0000-0003-2434-4206
                http://orcid.org/0000-0002-2659-5482
                http://orcid.org/0000-0001-7564-073X
                http://orcid.org/0000-0003-1179-7839
                http://orcid.org/0000-0002-2358-2440
                http://orcid.org/0000-0003-2632-1745
                http://orcid.org/0000-0003-0755-6119
                Article
                Manuscript ID: PBIOLOGY-D-20-00253
                DOI: 10.1371/journal.pbio.3000737
                PMCID: PMC7365391
                PMID: 32673304
                © 2020 Moher et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                Page count
                Figures: 1, Tables: 0, Pages: 14
                Funding
                PG is funded by an Australian National Health and Medical Research Council (NHMRC) Fellowship (APP1155009). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Essay
                Science Policy / Research Integrity
                Research and Analysis Methods / Research Assessment
                Social Sciences / Economics / Labor Economics / Employment / Careers
                Research and Analysis Methods / Research Design
                Social Sciences / Sociology / Social Research
                Research and Analysis Methods / Research Assessment / Research Validity
                Research and Analysis Methods / Research Assessment / Peer Review
                Computer and Information Sciences / Data Management

                Life sciences
