Open Access

      Usability of a Patient Education and Motivation Tool Using Heuristic Evaluation

      Ashish Joshi, MD, MPH,1 Mohit Arora, MS,1 Liwei Dai, PhD,2 Kathleen Price, MS, RN,1 Lisa Vizer, MS,1 Andrew Sears, PhD1


      Journal of Medical Internet Research

      Editor: Gunther Eysenbach

      Keywords: computers, health, education, usability, heuristic


          Abstract

          Background

          Computer-mediated educational applications can provide a self-paced, interactive environment to deliver educational content to individuals about their health condition. These programs have been used to deliver health-related information about a variety of topics, including breast cancer screening, asthma management, and injury prevention. We have designed the Patient Education and Motivation Tool (PEMT), an interactive computer-based educational program based on behavioral, cognitive, and humanistic learning theories. The tool is designed to educate users and has three key components: screening, learning, and evaluation.

          Objective

          The objective of this tutorial is to illustrate a heuristic evaluation using a computer-based patient education program (PEMT) as a case study. The aims were to improve the usability of PEMT through heuristic evaluation of the interface; to report the results of these usability evaluations; to make changes based on the findings of the usability experts; and to describe the benefits and limitations of applying usability evaluations to PEMT.

          Methods

          PEMT was evaluated by three usability experts using Nielsen’s usability heuristics while reviewing the interface to produce a list of heuristic violations with severity ratings. The violations were sorted by heuristic and ordered from most to least severe within each heuristic.
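The grouping-and-sorting step described above can be sketched in a few lines of Python. The violation records and severity values here are invented for illustration; only the 0–4 severity scale comes from the study:

```python
from collections import defaultdict

# Hypothetical violation records: (heuristic, description, severity).
# Severity scale from the study: 0 = no problem ... 4 = catastrophic problem.
violations = [
    ("Consistency and standards", "Button labels vary across screens", 2),
    ("Visibility of system status", "No progress indicator during load", 3),
    ("Consistency and standards", "Inconsistent font sizes", 1),
    ("Help and documentation", "No help available on quiz screen", 4),
]

# Sort the violations by heuristic, then order each group from most
# to least severe, as the evaluators did.
by_heuristic = defaultdict(list)
for heuristic, desc, severity in violations:
    by_heuristic[heuristic].append((severity, desc))

for heuristic, items in by_heuristic.items():
    items.sort(reverse=True)  # most severe first within each heuristic
    print(heuristic)
    for severity, desc in items:
        print(f"  [{severity}] {desc}")
```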

          Results

          A total of 127 violations were identified, with a median severity of 3 (on a scale of 0 to 4, where 0 = no problem and 4 = catastrophic problem). Results showed 13 violations for visibility of system status (median severity = 2), 38 violations for match between system and real world (median severity = 2), 6 violations for user control and freedom (median severity = 3), 34 violations for consistency and standards (median severity = 2), 11 violations for error prevention (median severity = 3), 1 violation for recognition rather than recall (median severity = 3), 7 violations for flexibility and efficiency of use (median severity = 2), 9 violations for aesthetic and minimalist design (median severity = 2), 4 violations for help users recognize, diagnose, and recover from errors (median severity = 3), and 4 violations for help and documentation (median severity = 4).
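The per-heuristic medians reported above can be computed directly from the severity ratings. In this sketch the rating lists are hypothetical, chosen only to be consistent with two of the reported figures (6 user-control violations with median 3; 4 help-and-documentation violations with median 4):

```python
from statistics import median

# Hypothetical 0-4 severity ratings per heuristic.
severities = {
    "User control and freedom": [3, 3, 2, 4, 3, 1],
    "Help and documentation": [4, 4, 3, 4],
}

# Report violation count and median severity per heuristic,
# mirroring the structure of the published results.
for heuristic, ratings in severities.items():
    print(f"{heuristic}: {len(ratings)} violations, "
          f"median severity = {median(ratings)}")
```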

          Conclusion

          We describe the heuristic evaluation method employed to assess the usability of PEMT, a method that uncovers heuristic violations in the interface design quickly and efficiently. Bringing together usability experts and health professionals to evaluate a computer-mediated patient education program can help identify problems in a timely manner, which makes this method particularly well suited to the iterative design process when developing other computer-mediated health education programs. Heuristic evaluation provided a means to assess the user interface of PEMT.

          Related collections

          Most cited references 23


          Cognitive and usability engineering methods for the evaluation of clinical information systems.

          Increasingly healthcare policy and decision makers are demanding evidence to justify investments in health information systems. This demand requires an adequate evaluation of these systems. A wide variety of approaches and methodologies have been applied in assessing the impact of information systems in health care, ranging from controlled clinical trials to use of questionnaires and interviews with users. In this paper we describe methodological approaches which we have applied and refined for the past 10 years for the evaluation of health information systems. The approaches are strongly rooted in theories and methods from cognitive science and the emerging field of usability engineering. The focus is on assessing human computer interaction and in particular, the usability of computer systems in both laboratory and naturalistic settings. The methods described can be a part of the formative evaluation of systems during their iterative development, and can also complement more traditional assessment methods used in summative system evaluation of completed systems. The paper provides a review of the general area of systems evaluation with the motivation and rationale for methodological approaches underlying usability engineering and cognitive task analysis as applied to health information systems. This is followed by a detailed description of the methods we have applied in a variety of settings in conducting usability testing and usability inspection of systems such as computer-based patient records. Emerging trends in the evaluation of complex information systems are discussed.

            A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

            Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but it requires evaluators with strong skills and usability experience to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. A major drawback of the cognitive walkthrough is the level of detail required in the task and user background descriptions for an adequate application of the latest version of the technique. The think aloud is a very direct method to gain deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise both in cognitive ergonomics and in the computer system application domain. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any applied in isolation. Innovative mobile and automated solutions to support end-user testing have emerged, making combined approaches of laboratory, field, and remote usability evaluations of new health care applications more feasible.

              Designing the User Interface: Strategies for Effective Human-Computer Interaction


                Author and article information

                Contributors
                Journal
                J Med Internet Res
                JMIR
                Journal of Medical Internet Research
                Editor: Gunther Eysenbach (Centre for Global eHealth Innovation, Toronto, Canada)
                ISSN: 1438-8871
                Oct-Dec 2009
                Published: 6 November 2009
                Volume 11, Issue 4
                Affiliations
                1 Department of Information Systems, University of Maryland, Baltimore County, Baltimore, MD, USA
                2 Usability Engineering Group, Xerox Corporation, Baltimore, MD, USA
                Article
                v11i4e47
                DOI: 10.2196/jmir.1244
                PMCID: PMC2802560
                PMID: 19897458
                © Ashish Joshi, Mohit Arora, Liwei Dai, Kathleen Price, Lisa Vizer, Andrew Sears. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 06.11.2009.  

                This is an open-access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

                Categories
                Tutorial

                Medicine

