
      Usability and acceptability of four systematic review automation software packages: a mixed method design

      research-article


          Abstract

          Aim

New software packages help to improve the efficiency of conducting a systematic review by automating key steps in the review process. The aim of this study was to gather qualitative data on the usability and acceptability of four systematic review automation software packages (Covidence, SRA-Helper for EndNote, Rayyan and RobotAnalyst) for the citation screening step of a systematic review.

          Methods

We recruited three volunteer systematic reviewers and asked them to use the allocated software packages during citation screening. They then completed a 12-item online questionnaire tailored to capture data on the software packages they had used.

          Findings

          All four software packages were reported to be easy or very easy to learn and use. SRA-Helper for EndNote was most favoured by participants for screening citations and Covidence for resolving conflicts. Overall, participants reported that SRA-Helper for EndNote would be their software package of choice, primarily due to its efficiency.

          Conclusion

This study identified a number of considerations that systematic reviewers can use as a basis for deciding which software to use when performing the citation screening and dispute resolution steps of a systematic review.

          Electronic supplementary material

          The online version of this article (10.1186/s13643-019-1069-6) contains supplementary material, which is available to authorized users.


Most cited references (1)


          Automated screening of research studies for systematic reviews using study characteristics

Background

Screening candidate studies for inclusion in a systematic review is time-consuming when conducted manually. Automation tools could reduce the human effort devoted to screening. Existing methods use supervised machine learning, training classifiers to identify relevant words in the abstracts of candidate articles that have previously been labelled by a human reviewer for inclusion or exclusion. Such classifiers typically reduce the number of abstracts requiring manual screening by about 50%.

Methods

We extracted four key characteristics of observational studies (population, exposure, confounders and outcomes) from the text of titles and abstracts for all articles retrieved using search strategies from systematic reviews. Our screening method excluded studies if they did not meet a predefined set of characteristics. The method was evaluated using three systematic reviews, and screening results were compared to the actual inclusion list of the reviews.

Results

The best screening threshold rule identified studies that mentioned both exposure (E) and outcome (O) in the study abstract. This screening rule excluded 93.7% of retrieved studies with a recall of 98%.

Conclusions

Filtering studies for inclusion in a systematic review based on the detection of key study characteristics in abstracts significantly outperformed standard approaches to automated screening and appears worthy of further development and evaluation.
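The exposure-and-outcome threshold rule reported in the results above is, at its core, a decision rule applied to titles and abstracts: keep a record for manual screening only if both an exposure and an outcome are mentioned. The sketch below illustrates that decision rule only and is not the authors' implementation; the function names, term lists and example records are hypothetical, and the published method's extraction of study characteristics is more involved than the simple substring matching shown here.

    # Minimal sketch (not the authors' code) of an "E AND O" screening rule:
    # a record proceeds to manual screening only if its title/abstract mentions
    # at least one exposure term AND at least one outcome term.
    # Term lists and example records below are purely illustrative.

    def mentions_any(text: str, terms: list[str]) -> bool:
        """True if any of the given terms appears in the text (case-insensitive)."""
        lowered = text.lower()
        return any(term.lower() in lowered for term in terms)

    def passes_eo_rule(abstract: str, exposure_terms: list[str], outcome_terms: list[str]) -> bool:
        """Apply the E-AND-O threshold: exclude unless both are mentioned."""
        return mentions_any(abstract, exposure_terms) and mentions_any(abstract, outcome_terms)

    if __name__ == "__main__":
        exposure_terms = ["screen time", "sedentary behaviour"]   # hypothetical E terms
        outcome_terms = ["obesity", "body mass index", "bmi"]     # hypothetical O terms
        records = [
            "Screen time and body mass index in adolescents: a cohort study.",
            "Dietary patterns in adults: a cross-sectional survey.",
        ]
        for abstract in records:
            decision = "screen manually" if passes_eo_rule(abstract, exposure_terms, outcome_terms) else "exclude"
            print(f"{decision}: {abstract}")

Under such a rule, only records mentioning both an exposure and an outcome reach a human screener; everything else is excluded automatically, which is what drives the large reduction in manual workload described above.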

            Author and article information

            Contributors
            gcleo@bond.edu.au
            ascott@bond.edu.au
            farhana.islam@student.bond.edu.au
            blair.julien@student.bond.edu.au
            ebeller@bond.edu.au
Journal
Systematic Reviews (Syst Rev)
Publisher: BioMed Central (London)
ISSN: 2046-4053
Published: 20 June 2019
Volume 8, Article 145
Affiliations
[1] Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Australia (ISNI 0000 0004 0405 3820; GRID grid.1033.1)
[2] Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia (ISNI 0000 0004 0405 3820; GRID grid.1033.1)
Author information
ORCID: http://orcid.org/0000-0002-0902-4928
Article
DOI: 10.1186/s13643-019-1069-6
PMC: 6587262
PMID: 31221212
            © The Author(s). 2019

Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

History
Received: 7 August 2018
Accepted: 10 June 2019
Funding
Funded by: National Health and Medical Research Council (FundRef: http://dx.doi.org/10.13039/501100000925)
Award ID: APP1044904
            Categories
            Research

Subject: Public health
Keywords: automation, qualitative report, acceptability, usability, software packages, systematic review accelerator
