      Distributed situation awareness in dynamic systems: theoretical development and application of an ergonomics methodology

          Abstract

The purpose of this paper is to propose foundations for a theory of situation awareness based on the analysis of interactions between agents (i.e. both human and non-human) in subsystems. This approach may promote a better understanding of technology-mediated interaction in systems and aid the formulation of hypotheses and predictions concerning distributed situation awareness. It is proposed that agents within a system each hold their own situation awareness, which may be very different from (although compatible with) that of other agents. It is argued that we should not always hope for, or indeed want, sharing of this awareness, as different system agents have different purposes. This view marks situation awareness as a dynamic and collaborative process that binds agents together on tasks on a moment-by-moment basis. Implications of this viewpoint for the development of a new theory of, and accompanying methodology for, distributed situation awareness are offered.
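
To make the core idea concrete, here is a minimal, illustrative Python sketch (not the authors' methodology; the Agent class, the compatibility rule, and the element names are all invented for illustration) of agents that each hold their own situation awareness, different from yet compatible with that of other agents:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """A system agent (human or non-human) holding its own situation awareness."""
    name: str
    awareness: dict = field(default_factory=dict)  # system element -> current belief

def compatible(a: Agent, b: Agent) -> bool:
    """SA is 'compatible' when the agents agree on every element both track.

    Elements only one agent attends to are fine: as the abstract argues,
    full sharing of awareness is neither expected nor always desirable,
    because different agents have different purposes.
    """
    shared = a.awareness.keys() & b.awareness.keys()
    return all(a.awareness[k] == b.awareness[k] for k in shared)

# A pilot and an autopilot attend to different elements, yet their
# awareness is compatible on the one element they both track.
pilot = Agent("pilot", {"altitude": "descending", "weather": "stormy"})
autopilot = Agent("autopilot", {"altitude": "descending", "mode": "approach"})
print(compatible(pilot, autopilot))  # True: different SA, no conflict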

Most cited references (8)

          Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation.

          B Muir, N Moray (1996)
Two experiments are reported which examined operators' trust in and use of the automation in a simulated supervisory process control task. Tests of the integrated model of human trust in machines proposed by Muir (1994) showed that models of interpersonal trust capture some important aspects of the nature and dynamics of human-machine trust. Results showed that operators' subjective ratings of trust in the automation were based mainly upon their perception of its competence. Trust was significantly reduced by any sign of incompetence in the automation, even one which had no effect on overall system performance. Operators' trust changed very little with experience, with a few notable exceptions. Distrust in one function of an automatic component spread to reduce trust in another function of the same component, but did not generalize to another, independent automatic component in the same system, or to other systems. There was a high positive correlation between operators' trust in and use of the automation; operators used automation they trusted and rejected automation they distrusted, preferring to do the control task manually. There was an inverse relationship between trust and monitoring of the automation. These results suggest that operators' subjective ratings of trust, and the properties of the automation which determine that trust, can be used to predict and optimize the dynamic allocation of functions in automated systems.
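
As a rough illustration only (this is not Muir and Moray's model; the component names, penalty values, and use threshold are invented), the dynamics the abstract reports can be sketched in a few lines: trust tracks perceived competence, one fault lowers trust in all functions of the faulty component but not in independent components, and use follows trust:

```python
class Component:
    """An automatic component whose functions each carry a trust rating."""

    def __init__(self, name, functions):
        self.name = name
        self.trust = {f: 1.0 for f in functions}  # start fully trusted

    def observe_fault(self, function, penalty=0.6):
        """Any sign of incompetence reduces trust; distrust in one function
        spreads (more weakly) to the component's other functions but, by
        construction, not to other, independent components."""
        for f in self.trust:
            self.trust[f] -= penalty if f == function else penalty / 2

    def operator_uses(self, function, threshold=0.5):
        """Operators use automation they trust and reject automation they
        distrust, preferring manual control."""
        return self.trust[function] >= threshold

pump = Component("feedwater pump", ["flow control", "low-level alarm"])
heater = Component("heater", ["temperature control"])

pump.observe_fault("flow control")
print(pump.operator_uses("flow control"))           # False: trust fell to ~0.4
print(pump.operator_uses("low-level alarm"))        # True: partial spread, ~0.7
print(heater.operator_uses("temperature control"))  # True: independent component unaffected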

            Trust in automation: Part I. Theoretical issues in the study of trust and human intervention in automated systems

              Situation Awareness Is Adaptive, Externally Directed Consciousness

                Author and article information

Journal: Ergonomics (Informa UK Limited)
ISSN: 0014-0139 (print); 1366-5847 (electronic)
Published: October 10 2006; online February 20 2007
Volume 49, Issue 12-13, pp. 1288-1311
Affiliations
[1] Human Factors Integration – Defence Technology Centre, BITlab, School of Engineering and Design, Brunel University, Uxbridge, Middlesex, UB8 3PH, UK
[2] Human Factors Group, School of Engineering, Cranfield University, Cranfield, Bedfordshire, MK43 0AL, UK
[3] Department of Electrical and Electronic Engineering, University of Birmingham, Edgbaston, Birmingham, B15 2TT, UK
[4] SEA, SEA House, PO Box 800, Fishponds, Bristol, BS16 1SU, UK
[5] Lockheed Martin UK Ltd – Integrated Systems, Building 7000, Langstone Technology Park, Havant, Hants, PO9 1SW, UK
[6] MBDA Missile Systems, Golf Course Lane, Filton, Bristol, BS34 7QW, UK
Article
DOI: 10.1080/00140130600612762
PMID: 17008257
                © 2006