      Investigating the Use of Chronological Splitting to Compare Software Cross-company and Single-company Effort Predictions: A Replicated Study



      13th International Conference on Evaluation and Assessment in Software Engineering (EASE)


      20 - 21 April 2009

      Keywords: chronological split, effort estimation, software projects, cross-company estimation models, single-company estimation models, regression-based estimation models


          CONTEXT: Three previous studies have investigated the use of chronological splitting to compare cross-company to single-company effort predictions, all of which used the ISBSG dataset release 10. There is therefore a need for these studies to be replicated using different datasets, so that the patterns previously observed can be compared and contrasted, and a better understanding of the use of chronological splitting can be reached.

          OBJECTIVE: The aim of this study is to replicate [17] using the same chronological splitting, but applied to a different dataset - the Finnish dataset.

          METHOD: Chronological splitting was compared with two forms of cross-validation. The chronological splitting used was the project-by-project chronological split, in which the validation set contains a single project, and a regression model is built from scratch using as its training set the projects completed before that validation project's start date. We used 201 single-company projects and 593 cross-company projects from the Finnish dataset.
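          The project-by-project chronological split described above can be sketched as follows. This is a minimal illustration, not the authors' code: the project records, field names, and sample values are hypothetical, and the regression step itself is omitted.

          ```python
          from datetime import date

          # Hypothetical project records: start/completion dates, size, and effort.
          projects = [
              {"start": date(2007, 1, 10), "end": date(2007, 6, 1),  "size": 120, "effort": 900},
              {"start": date(2007, 3, 5),  "end": date(2007, 9, 15), "size": 300, "effort": 2100},
              {"start": date(2008, 2, 1),  "end": date(2008, 8, 20), "size": 80,  "effort": 650},
          ]

          def chronological_splits(projects):
              """Yield (training set, validation project) pairs: the validation set
              is a single project, and the training set holds only the projects
              completed before that project's start date."""
              for p in projects:
                  train = [q for q in projects if q["end"] < p["start"]]
                  if train:  # a regression model needs at least some past projects
                      yield train, p

          for train, val in chronological_splits(projects):
              print(f"{len(train)} past project(s) available for the project of size {val['size']}")
          ```

          Note that early projects get no prediction at all (no completed projects precede them), which is one practical cost of this design compared with cross-validation.
          
          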

          RESULTS: Single-company models gave significantly better predictions than cross-company models. Chronological splitting provided significantly worse accuracy than leave-one-out and leave-two-out cross-validation when based on single-company data, and similar accuracy when based on cross-company data.
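          For contrast, leave-one-out cross-validation can be sketched as below (illustrative only; leave-two-out simply holds out pairs of projects instead). Unlike the chronological split, the training set for a held-out project may contain projects finished after it, which cannot happen when estimating a real new project.

          ```python
          # Minimal leave-one-out sketch over hypothetical project labels,
          # listed in chronological order.
          def leave_one_out(items):
              """Yield (training set, held-out item) pairs, one per item."""
              for i in range(len(items)):
                  yield items[:i] + items[i + 1:], items[i]

          projects = ["P1", "P2", "P3", "P4"]  # hypothetical
          for train, held_out in leave_one_out(projects):
              # e.g. P1 is "predicted" from P2-P4, including later projects
              print(held_out, "trained on", train)
          ```
          
          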

          CONCLUSIONS: Results did not seem promising when using project-by-project splitting; however, in a real scenario, companies that use their own data can only apply some form of chronological splitting when obtaining effort estimates for their new projects. We therefore urge the use of chronological splitting in effort estimation studies, so that more realistic results can be provided to inform industry.

          Most cited references (12)

          Cross versus Within-Company Cost Estimation Studies: A Systematic Review

            What accuracy statistics really measure

              An assessment and comparison of common software cost estimation modeling techniques


                Author and article information

                April 2009
                Pages: 1-10
                [1 ]Computer Science Department, The University of Auckland, Private Bag 92019, Auckland, New Zealand
                [2 ]School of IT&EE, UNSW@ADFA, Canberra ACT 2600, Australia
                © Emilia Mendes et al. Published by BCS Learning and Development Ltd. 13th International Conference on Evaluation and Assessment in Software Engineering (EASE), Durham University, UK

                This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

                Electronic Workshops in Computing (eWiC)
                Product Information: ISSN 1477-9358, BCS Learning & Development
                Self URI (journal page): https://ewic.bcs.org/

