
      Experiences Conducting Systematic Reviews from Novices’ Perspective



      14th International Conference on Evaluation and Assessment in Software Engineering (EASE)

      Evaluation and Assessment in Software Engineering

      12 - 13 April 2010

      Keywords: Systematic review, empirical software engineering, experience report



            Background: A systematic review (SR) is a sound methodology for collecting evidence on a research topic of interest and establishing the context of future research. Unlike ordinary or even expert literature reviews, SRs are systematic, thus increasing confidence in the findings drawn from previously published literature. SRs can be carried out by both experienced and novice researchers; however, while expert researchers' experiences with conducting SRs are important for improving the SR body of knowledge, we believe that novice researchers' experiences are equally important for establishing what distinct problems they face while carrying out SRs. With prior knowledge of these issues, novice researchers can better plan their SRs and seek guidance from expert researchers. Aim: The aim of this paper is therefore to report on experiences conducting SRs from the perspective of novice researchers. The paper reports first-hand experiences of novices conducting SRs and compares them with the experiences of an expert as well as with the experiences reported in the previous literature. Method: An instrument was created and used to gather the experiences of conducting SRs from three PhD students and their supervisor. The instrument covered all the SR steps; it was individually filled out by each of the participating subjects and its data was later aggregated. Results: The results show that the problems faced by novices in terms of the time taken to conduct the review and in defining the research questions, inclusion/exclusion criteria, data extraction forms, and data synthesis forms are not faced by expert researchers. Moreover, the problems faced by novices in defining quality criteria are different in nature from those faced by expert researchers. Conclusions: It has been observed that while numerous problems are faced by both novices and experts, many others are specific to novices, and several of these can be solved with the help of domain and SR experts.


            Author and article information

            April 2010
            Pages: 1-10
            Department of Computer Science, The University of Auckland, Level 5, 38 Princes Street, Auckland 1142, New Zealand
            © Mehwish Riaz et al. Published by BCS Learning and Development Ltd. 14th International Conference on Evaluation and Assessment in Software Engineering (EASE), Keele University, UK

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            14th International Conference on Evaluation and Assessment in Software Engineering (EASE)
            Keele University, UK
            12 - 13 April 2010
            Electronic Workshops in Computing (eWiC)
            Evaluation and Assessment in Software Engineering
            Product Information: ISSN 1477-9358, BCS Learning & Development
            Self URI (journal page): https://ewic.bcs.org/

