
      13th International Conference on Evaluation and Assessment in Software Engineering (EASE) - Index

13th International Conference on Evaluation and Assessment in Software Engineering (EASE)
Evaluation and Assessment in Software Engineering
20 - 21 April 2009

            Abstract

            The International Conference on Evaluation & Assessment in Software Engineering (EASE) offers a forum for discussion and exchange of experiences among researchers who present their research results and discuss issues related to empirical and evaluation studies. The conference aims to provide a workshop-like atmosphere in which papers can be presented and then time is allowed for constructive discussion of their results and processes.

            This year's edition has been organised by the Department of Computer Science (http://www.dur.ac.uk/computer.science/) and hosted by Grey College, both at Durham University, UK (http://www.dur.ac.uk/grey.college/).

            The event includes two keynote speeches given by internationally leading researchers: Prof. Magne Jørgensen, Research Scientist at Simula Research Laboratory, Norway, and Prof. John McDermid, Department of Computer Science, The University of York, UK.

            The International Conference on Evaluation and Assessment in Software Engineering provides a forum for discussion and exchange of experiences among researchers who present their research results and discuss issues related to empirical and evaluation studies. Papers are presented in a workshop-like atmosphere, with time allowed for constructive discussion of their results and processes. The 13th edition (EASE 2009) has been organised by the Department of Computer Science and hosted by Grey College, both at Durham University, UK.

            This year's papers have addressed topics including systematic literature reviews, effort estimation, requirements engineering, case study planning and execution, and empirical software engineering.

            We received submissions from countries all over the world. Each paper was reviewed by four members of the Program Committee, and 12 full papers and 4 short papers were accepted for presentation at the Conference and inclusion in the Proceedings.

            In addition to the research papers, we have two keynote speeches, one on each day of the conference: the first by Prof. Magne Jørgensen, Research Scientist at Simula Research Laboratory, Norway; the second by Prof. John McDermid, Department of Computer Science, The University of York, UK.

            We want to thank all of those who have contributed to the setting up and running of this year's conference: the authors for submitting their papers, the Program Committee members for their valuable work in reviewing and selecting the papers and in promoting the conference, and the organising committee, together with all the people who helped in arranging the conference. We would also like to thank all the organisations that sponsored the event, and the Department of Computer Science and Grey College, Durham University, for organising and hosting the event.

            Finally, we thank all the attendees of the conference for making it a successful event, and hope you find the program interesting and enjoy your stay in Durham.

            David Budgen, General Chair

            Mark Turner & Mahmood Niazi, Program Chairs


            Papers:

            Session 1: Industry Related Studies

            M Gatrell, S Counsell and T Hall Empirical Support for Two Refactoring Studies Using Commercial C# Software http://dx.doi.org/10.14236/ewic/EASE2009.1

            Emilia Mendes, Chris Lokan Investigating the Use of Chronological Splitting to Compare Software Cross-company and Single-company Effort Predictions: A Replicated Study http://dx.doi.org/10.14236/ewic/EASE2009.2

            Jörg Leuser, Nicolas Porta Empirical Validation of a Requirements Engineering Process Guide http://dx.doi.org/10.14236/ewic/EASE2009.3

            Session 2: Methodology

            Mats Skoglund and Per Runeson Reference-based search strategies in systematic reviews http://dx.doi.org/10.14236/ewic/EASE2009.4

            Emilia Mendes, Carmel Pollino, Nile Mosley Building an Expert-based Web Effort Estimation Model using Bayesian Networks http://dx.doi.org/10.14236/ewic/EASE2009.5

            Asma Mubarak, Steve Counsell, Robert M Hierons Does an 80:20 rule apply to Java coupling? http://dx.doi.org/10.14236/ewic/EASE2009.6

            Session 3: Quality

            Barbara A Kitchenham, O Pearl Brereton, David Budgen, Zhi Li An Evaluation of Quality Checklist Proposals - A participant-observer case study http://dx.doi.org/10.14236/ewic/EASE2009.7

            David Budgen, Cheng Zhang Preliminary Reporting Guidelines for Experience Papers http://dx.doi.org/10.14236/ewic/EASE2009.8

            Chris Thomson, Mike Holcombe Factors Explaining External Quality in 54 Case Studies of Software Development Projects http://dx.doi.org/10.14236/ewic/EASE2009.9

            Session 4: SLRs 1

            Briony J Oates, Graham Capper Using systematic reviews and evidence-based software engineering with masters students http://dx.doi.org/10.14236/ewic/EASE2009.10

            Wiebe Hordijk, Maria Laura Ponisio, Roel Wieringa Harmfulness of Code Duplication - A Structured Review of the Evidence http://dx.doi.org/10.14236/ewic/EASE2009.11

            Austen Rainer, Sarah Beecham, Cei Sanderson An assessment of published evaluations of requirements management tools http://dx.doi.org/10.14236/ewic/EASE2009.12

            Session 5: SLRs 2

            Carolyn Mair, Miriam Martincova, Martin Shepperd A Literature Review of Expert Problem Solving using Analogy http://dx.doi.org/10.14236/ewic/EASE2009.13

            Lianping Chen, Muhammad Ali Babar, Ciaran Cawley A Status Report on the Evaluation of Variability Management Approaches http://dx.doi.org/10.14236/ewic/EASE2009.14

            Barbara A Kitchenham, Andrew J Burn, Zhi Li A Quality Checklist for Technology-Centred Testing Studies http://dx.doi.org/10.14236/ewic/EASE2009.15

            Chris Thomson, Marian Gheorghe Using Process Mining Metrics to Measure Noisy Process Fidelity http://dx.doi.org/10.14236/ewic/EASE2009.16

            Author and article information

            Publication date: April 2009
            Article DOI: 10.14236/ewic/EASE2009.0
            Copyright © 2009

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Conference: 13th International Conference on Evaluation and Assessment in Software Engineering (EASE 2009)
            Venue: Durham University, UK
            Dates: 20 - 21 April 2009
            Series: Electronic Workshops in Computing (eWiC) - Evaluation and Assessment in Software Engineering
            Publisher: BCS Learning & Development (ISSN 1477-9358)
            Journal page: https://ewic.bcs.org/
            Categories: Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
