
      A Paradigm Shift in Interactive Computing: Deriving Multimodal Design Principles from Behavioral and Neurological Foundations


Most cited references (61)


          Humans integrate visual and haptic information in a statistically optimal fashion.

          When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
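The maximum-likelihood integration described in this abstract reduces to inverse-variance weighting of the two estimates. A minimal sketch in Python, assuming independent Gaussian noise on each cue; the variance values are hypothetical, chosen only to illustrate visual dominance when visual variance is lower:

```python
# Maximum-likelihood (inverse-variance) integration of two noisy cues.
# For independent Gaussian estimates, the optimal weight on each cue is
# proportional to its reliability (1 / variance), and the combined
# variance is never larger than that of either cue alone.

def integrate_cues(est_v, var_v, est_h, var_h):
    """Combine visual and haptic estimates by inverse-variance weighting."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # weight on vision
    w_h = 1 - w_v                                # weight on haptics
    combined_est = w_v * est_v + w_h * est_h
    combined_var = 1 / (1 / var_v + 1 / var_h)
    return combined_est, combined_var

# Vision is the more reliable cue here, so it dominates the percept:
est, var = integrate_cues(est_v=10.0, var_v=1.0, est_h=12.0, var_h=4.0)
# est ≈ 10.4 (pulled toward the visual estimate), var ≈ 0.8
```

With these illustrative numbers vision gets weight 0.8 and haptics 0.2, mirroring the paper's claim that visual dominance arises whenever visual variance is lower than haptic variance.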

            The episodic buffer: a new component of working memory?

            In 1974, Baddeley and Hitch proposed a three-component model of working memory. Over the years, this has been successful in giving an integrated account not only of data from normal adults, but also neuropsychological, developmental and neuroimaging data. There are, however, a number of phenomena that are not readily captured by the original model. These are outlined here and a fourth component to the model, the episodic buffer, is proposed. It comprises a limited capacity system that provides temporary storage of information held in a multimodal code, which is capable of binding information from the subsidiary systems, and from long-term memory, into a unitary episodic representation. Conscious awareness is assumed to be the principal mode of retrieval from the buffer. The revised model differs from the old principally in focussing attention on the processes of integrating information, rather than on the isolation of the subsystems. In doing so, it provides a better basis for tackling the more complex aspects of executive control in working memory.

              Working Memory


                Author and article information

Journal: International Journal of Human-Computer Interaction
Publisher: Informa UK Limited
ISSN: 1044-7318 (print); 1532-7590 (electronic)
Published: June 2004
Volume: 17
Issue: 2
Pages: 229-257
DOI: 10.1207/s15327590ijhc1702_7
© 2004
