      Impact of Experience and Team Size on the Quality of Scenarios for Architecture Evaluation

      proceedings-article
      12th International Conference on Evaluation and Assessment in Software Engineering (EASE)
      Evaluation and Assessment in Software Engineering (EASE)
      26 - 27 June 2008
      Architecture Evaluation, Quality Attributes, Scenarios, Empirical Studies, Performance Measurement

            Abstract

            Software and systems architecture is a success-critical issue in software projects. Changing non-functional quality requirements, e.g., performance, modifiability, and maintainability, can have a strong impact on the software architecture and can result in high rework effort when changes occur. Architecture reviews help evaluate architectural designs with scenarios in early stages of product development. Quality-sensitive scenarios represent a set of software requirements (including non-functional quality attributes). In this paper we empirically investigate the impact of potentially important factors on the number and quality of scenarios elicited in an architecture evaluation workshop: (a) scoring schemes for scenario quality, (b) workshop participant experience, and (c) team size for workshop group work. We report data analysis results from an empirical study in which 24 reviewers at different experience levels identified over 100 scenarios. The main findings are: (a) the results of different scoring approaches (frequency-based and expert scoring) agree very well regarding critical scenarios, (b) the scenario elicitation method was more important than individual experience, and (c) adding a new person to a team of size 3 or more increases scenario coverage by less than 10%.
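            The abstract contrasts a frequency-based scoring scheme with expert scoring and reports diminishing coverage gains from adding reviewers. As a rough illustration only, the Python sketch below shows how a frequency-based score and the marginal coverage contributed by each additional reviewer could be computed; the scenario data, thresholds, and names are invented for illustration and are not taken from the paper.

# Minimal sketch (not the paper's method): frequency-based scoring of elicited
# scenarios and marginal coverage gained by each additional reviewer.
# All scenario data below is invented for illustration.
from collections import Counter

# Each reviewer proposes a set of scenario identifiers.
reviewer_scenarios = {
    "r1": {"S1", "S2", "S3"},
    "r2": {"S2", "S3", "S4"},
    "r3": {"S1", "S3", "S5"},
    "r4": {"S3", "S6"},
}

# Frequency-based scoring: a scenario's score is the number of reviewers who
# independently proposed it; high-frequency scenarios are treated as critical.
frequency = Counter(s for scenarios in reviewer_scenarios.values() for s in scenarios)
critical = [s for s, count in frequency.most_common() if count >= 2]
print("Frequency scores:", dict(frequency))
print("Critical scenarios (proposed by >= 2 reviewers):", critical)

# Marginal coverage: how many new scenarios the nth reviewer adds to the pooled set.
covered = set()
for reviewer, scenarios in reviewer_scenarios.items():
    new = scenarios - covered
    covered |= scenarios
    gain = len(new) / len(covered) * 100
    print(f"{reviewer}: +{len(new)} new scenario(s), {gain:.0f}% of pooled total")

            Running the sketch on this invented data shows the pooled scenario set growing quickly for the first reviewers and only marginally for later ones, which mirrors the kind of diminishing-returns effect the abstract reports for teams of size 3 or more.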

            Content

            Author and article information

            Contributors
            Conference
            June 2008
            Pages 1-10
            Affiliations
            [1] Vienna University of Technology, Austria
            [2] Lero, University of Limerick, Ireland
            Article
            DOI: 10.14236/ewic/EASE2008.1
            © Stefan Biffl et al. Published by BCS Learning and Development Ltd. 12th International Conference on Evaluation and Assessment in Software Engineering (EASE)

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            12th International Conference on Evaluation and Assessment in Software Engineering (EASE)
            University of Bari, Italy
            26 - 27 June 2008
            Electronic Workshops in Computing (eWiC)
            Product

            ISSN: 1477-9358, BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/EASE2008.1
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Architecture Evaluation, Empirical Studies, Quality Attributes, Performance Measurement, Scenarios
