10th International Conference on Evaluation and Assessment in Software Engineering (EASE)
10 - 11 April 2006
OBJECTIVE – The aim is to assess the impact of noise on predictive accuracy by comparing noise-handling techniques.
METHOD – We describe the process of cleaning a large software management dataset initially comprising more than 10,000 projects. Data quality is assessed mainly through feedback from the data provider and manual inspection of the data. Three noise-correction methods (polishing, noise elimination and robust algorithms) are compared in terms of their predictive accuracy. Noise detection was performed using a regression tree model.
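The noise-handling step described above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: a minimal one-level regression tree (a stump) predicts effort from project size, instances with large residuals are flagged as noisy, and each flagged instance is then either eliminated or polished (its target value replaced by the model's prediction). All function names, data values and the residual threshold are invented for illustration; the paper's actual tree learner and threshold choice may differ.

```python
# Hypothetical sketch of regression-tree-based noise detection with
# elimination vs. polishing. Not the paper's code; data are invented.

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree: pick the split on x that minimises
    the summed squared error of the two leaf means."""
    best = None
    for split in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= split]
        right = [y for x, y in zip(xs, ys) if x > split]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, split, lm, rm)
    _, split, lm, rm = best
    return lambda x: lm if x <= split else rm

def flag_noise(xs, ys, predict, threshold):
    """Indices of instances whose absolute residual exceeds the threshold."""
    return [i for i, (x, y) in enumerate(zip(xs, ys))
            if abs(y - predict(x)) > threshold]

def eliminate(xs, ys, noisy):
    """Noise elimination: drop the flagged instances entirely."""
    keep = [i for i in range(len(xs)) if i not in noisy]
    return [xs[i] for i in keep], [ys[i] for i in keep]

def polish(xs, ys, noisy, predict):
    """Polishing: replace flagged target values with model predictions."""
    return xs, [predict(x) if i in noisy else y
                for i, (x, y) in enumerate(zip(xs, ys))]

# Toy project data: (size, effort) pairs with one implausible effort value.
sizes = [10, 12, 15, 40, 45, 50]
effort = [100, 110, 5000, 400, 420, 440]  # 5000 looks like noise

model = fit_stump(sizes, effort)
noisy = flag_noise(sizes, effort, model, threshold=2000)
print("flagged indices:", noisy)
print("after elimination:", eliminate(sizes, effort, noisy))
print("after polishing:", polish(sizes, effort, noisy, model))
```

In practice a deeper regression tree would be learned per attribute, but the residual-thresholding idea is the same: the model's disagreement with an instance is what marks it as suspect.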
RESULTS – The three noise-correction methods were compared, and differences in their accuracy were noted.
CONCLUSIONS – The results demonstrate that polishing improves classification accuracy compared to the noise elimination and robust algorithm approaches.