Proceedings of the 28th International BCS Human Computer Interaction Conference (HCI 2014)
9–12 September 2014
This paper examines combinations of complementary evaluation methods as a strategy for efficient usability problem discovery. A data set from an earlier study, in which three evaluation methods were applied to two virtual environment applications, is re-analyzed. Results of a mixed-effects logistic regression suggest that usability testing and inspection discover largely disjoint sets of problems. A resampling analysis shows that mixing inspection and usability testing sessions in equal parts finds 20% more problems for the same number of sessions.
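The resampling logic behind the abstract's claim can be illustrated with a small simulation. The sketch below is not the paper's analysis; it uses entirely synthetic data under the assumption (suggested by the regression result) that the two methods' discoverable problem sets only partially overlap, and compares the expected number of unique problems found by a pure-method sample of sessions against an equal-parts mix. All pool sizes, hit probabilities, and session counts are invented for illustration.

```python
import random

def simulate_sessions(problem_ids, hit_prob, n_sessions, rng):
    # Each session independently discovers each problem in its
    # method's pool with probability hit_prob (synthetic assumption).
    return [{p for p in problem_ids if rng.random() < hit_prob}
            for _ in range(n_sessions)]

def mean_unique_problems(sessions_a, sessions_b, k_a, k_b, reps, rng):
    # Resample k_a sessions of one method and k_b of the other,
    # count distinct problems found, and average over repetitions.
    totals = []
    for _ in range(reps):
        sample = rng.sample(sessions_a, k_a) + rng.sample(sessions_b, k_b)
        found = set()
        for s in sample:
            found |= s
        totals.append(len(found))
    return sum(totals) / reps

rng = random.Random(42)
# Hypothetical, partially overlapping problem pools: testing can reach
# problems 0-59, inspection problems 40-99 (overlap 40-59).
testing_pool = list(range(0, 60))
inspection_pool = list(range(40, 100))

testing = simulate_sessions(testing_pool, 0.3, 200, rng)
inspection = simulate_sessions(inspection_pool, 0.3, 200, rng)

pure = mean_unique_problems(testing, inspection, 10, 0, 500, rng)
mixed = mean_unique_problems(testing, inspection, 5, 5, 500, rng)
print(f"pure testing: {pure:.1f} problems, equal mix: {mixed:.1f} problems")
```

With disjoint portions in each pool, the equal mix reliably finds more unique problems than ten sessions of a single method, mirroring the direction (though not the exact magnitude) of the reported 20% gain.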