      Non-Speech Sound in Human-Computer Interaction: A Review and Design Guidelines

      Journal of Educational Computing Research
      Baywood Publishing Company, Inc.


Most cited references (26)


          Localization using nonindividualized head-related transfer functions.

          A recent development in human-computer interfaces is the virtual acoustic display, a device that synthesizes three-dimensional, spatial auditory information over headphones using digital filters constructed from head-related transfer functions (HRTFs). The utility of such a display depends on the accuracy with which listeners can localize virtual sound sources. A previous study [F. L. Wightman and D. J. Kistler, J. Acoust. Soc. Am. 85, 868-878 (1989)] observed accurate localization by listeners for free-field sources and for virtual sources generated from the subjects' own HRTFs. In practice, measurement of the HRTFs of each potential user of a spatial auditory display may not be feasible. Thus, a critical research question is whether listeners can obtain adequate localization cues from stimuli based on nonindividualized transforms. Here, inexperienced listeners judged the apparent direction (azimuth and elevation) of wideband noisebursts presented in the free-field or over headphones; headphone stimuli were synthesized using HRTFs from a representative subject of Wightman and Kistler. When confusions were resolved, localization of virtual sources was quite accurate and comparable to the free-field sources for 12 of the 16 subjects. Of the remaining subjects, 2 showed poor elevation accuracy in both stimulus conditions, and 2 showed degraded elevation accuracy with virtual sources. Many of the listeners also showed high rates of front-back and up-down confusions that increased significantly for virtual sources compared to the free-field stimuli. These data suggest that while the interaural cues to horizontal location are robust, the spectral cues considered important for resolving location along a particular cone-of-confusion are distorted by a synthesis process that uses nonindividualized HRTFs.
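The synthesis step described in this abstract, filtering a source through direction-specific HRTFs to produce a headphone signal, can be illustrated with a minimal Python sketch. This is not the apparatus used in the study: the impulse responses below are random stand-ins for measured HRIRs (the time-domain form of HRTFs), and all names and parameters are hypothetical.

    import numpy as np
    from scipy.signal import fftconvolve

    def binaural_synthesis(mono, hrir_left, hrir_right):
        """Render a mono source over headphones by convolving it with the
        left- and right-ear head-related impulse responses measured for
        one source direction."""
        left = fftconvolve(mono, hrir_left, mode="full")
        right = fftconvolve(mono, hrir_right, mode="full")
        return np.stack([left, right], axis=-1)  # (samples, 2) headphone signal

    if __name__ == "__main__":
        fs = 44100
        rng = np.random.default_rng(0)
        # 250 ms wideband noise burst, similar in spirit to the stimuli described above.
        noise_burst = rng.standard_normal(fs // 4)
        # Stand-ins for a "nonindividualized" HRIR pair (e.g., measured on another
        # listener); in practice these would come from a measurement set.
        hrir_l = rng.standard_normal(256) * np.exp(-np.arange(256) / 64.0)
        hrir_r = rng.standard_normal(256) * np.exp(-np.arange(256) / 64.0)
        stereo = binaural_synthesis(noise_burst, hrir_l, hrir_r)
        print(stereo.shape)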

            Auditory Perception of Temporal Order

            Ira Hirsh (1959)

              Localization in Virtual Acoustic Displays


                Author and article information

Journal: Journal of Educational Computing Research
Publisher: Baywood Publishing Company, Inc.
ISSN: 0735-6331, 1541-4140
Published: October 1994 (issue); November 28 2016 (online)
Volume: 11
Issue: 3
Pages: 211-233
Affiliations: [1] University of Washington
DOI: 10.2190/MKD9-W05T-YJ9Y-81NM
© 1994

License: http://journals.sagepub.com/page/policies/text-and-data-mining-license

