Open Access

      An assessment of published evaluations of requirements management tools

Published proceedings article
Austen Rainer et al. (affiliation markers: a, b, a)
13th International Conference on Evaluation and Assessment in Software Engineering (EASE)
20-21 April 2009
Keywords: Evaluation, Requirements Management Tool, Literature Review, Systematic Mapping, Scoping study

            Abstract

Context: The traditional literature review is a low-cost, relatively quick, but potentially ineffective method for evaluating tools. Practitioners appear to place greater emphasis on the practical constraints of an evaluation (e.g. that it is low cost and quick) and on the efficacy of the technology for the company, rather than on generic scientific results. By contrast, academia appears to place greater emphasis on theory confirmation, rigour and validity; academic literature reviews focus on literature published in peer-reviewed journals and conferences, and tend not to consider the trade and ‘grey’ literature.

Objectives: To assess the quality and quantity of published evaluations of requirements management tools (RMTs) reported in the academic, ‘grey’ and trade literatures.

Method: Three independent literature reviews were conducted to identify published evaluations of RMTs. The three reviews were conducted by three different types of reviewer: a practitioner in a company, an experienced researcher, and 19 final-year undergraduate students. The researcher and the students followed a version of Evidence Based Software Engineering to undertake their literature reviews; the practitioner undertook an ad hoc literature review. Publications were then screened to select higher-quality evaluations, which were then analysed to identify the RMTs evaluated.

Results: The three literature reviews found a total of 28 evaluations referring to 14 RMTs, of which 6 evaluations were duplicates, giving 22 unique evaluations. Evaluations were identified from approximately 2000 to 2007, with an average of about 3 evaluations published per year.

Conclusions/implications: Given the number of commercial RMTs on the market (>40) and the few evaluations published per year, there are surprisingly few higher-quality evaluations. There is a noticeable bias toward evaluating the market-leading RMTs. Given the rate of change in the IT industry, there may be a need to re-evaluate RMTs every two years or less. Overall, there appears to be a poor ‘base’ of up-to-date published evaluations of RMTs available for use in literature reviews. Literature reviews would appear to be useful for short-listing RMTs for subsequent in-company evaluation, and for benchmarking, but care should be taken to include non-market-leading RMTs in the short-listing.
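As a quick check on the arithmetic in the Results, the sketch below (illustrative Python; the counts come from the abstract, but the eight-year window and the rounding to "about 3 per year" are our reading of "approximately 2000 to 2007") reproduces the deduplication and the per-year publication rate.

    # Counts reported in the abstract's Results section.
    total_found = 28     # evaluations found across the three reviews
    duplicates = 6       # evaluations found by more than one review
    unique = total_found - duplicates   # 22 unique evaluations
    assert unique == 22

    # The abstract dates the evaluations to approximately 2000-2007,
    # i.e. an eight-year window (an assumption about the exact span),
    # which gives roughly 3 evaluations per year.
    years = 2007 - 2000 + 1
    print(f"{unique} unique evaluations / {years} years = {unique / years:.2f} per year")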

            Author and article information

Contributors
Conference: April 2009, pp. 1-10
Affiliations
[a] School of Computer Science, University of Hertfordshire, College Lane Campus, Hatfield, Hertfordshire, AL10 9AB, UK
[b] School of Information Systems, Computing and Mathematics, Brunel University, Uxbridge, Middlesex, UB8 3PH, UK
Article
DOI: 10.14236/ewic/EASE2009.12
            © Austen Rainer et al. Published by BCS Learning and Development Ltd. 13th International Conference on Evaluation and Assessment in Software Engineering (EASE), Durham University, UK

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

13th International Conference on Evaluation and Assessment in Software Engineering (EASE 13)
Durham University, UK, 20-21 April 2009
Electronic Workshops in Computing (eWiC)
ISSN: 1477-9358, BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/EASE2009.12
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
