DOI: 10.1145/1978942.1979354

On the audio representation of radial direction

Published: 07 May 2011

Abstract

We present and evaluate an approach towards eyes-free auditory display of spatial information that considers radial direction as a fundamental type of value primitive. There are many benefits to being able to sonify radial directions, such as indicating the heading towards a point of interest in a direct and dynamic manner, rendering a path or shape outline by sonifying a continual sequence of tangent directions as the path is traced, and providing direct feedback of the direction of motion of the user in a physical space or a pointer in a virtual space. We propose a concrete mapping of vowel-like sounds to radial directions as one potential method to enable sonification of such information. We conducted a longitudinal study with five sighted and two blind participants to evaluate the learnability and effectiveness of this method. Results suggest that our directional sound mapping can be learned within a few hours and be used to aurally perceive spatial information such as shape outlines and path contours.
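
The abstract describes the mapping only at a high level; the concrete vowel-to-direction assignment appears in the full paper, not on this page. As a loose, illustrative sketch of the idea (not the authors' published mapping), the Python fragment below interpolates two formant frequencies between four hypothetical anchor vowels placed at 90-degree intervals and applies the result to the "sequence of tangent directions" use case the abstract mentions for tracing a shape outline. The anchor vowels, formant values, and function names are all assumptions made for illustration.

```python
# Illustrative sketch only -- NOT the mapping published in the paper.
# Assumption: radial direction (degrees counter-clockwise from the positive
# x-axis) is sonified by interpolating the first two formant frequencies
# (F1, F2) between four anchor vowels placed at 90-degree intervals.
import math

# Rough average formant values (Hz); the choice of vowels and their
# placement around the circle is hypothetical.
ANCHORS = {
    0:   (730, 1090),   # /a/  ("ah")
    90:  (270, 2290),   # /i/  ("ee")
    180: (660, 1720),   # /ae/ ("a" as in "cat")
    270: (300, 870),    # /u/  ("oo")
}

def formants_for_direction(angle_deg: float) -> tuple[float, float]:
    """Linearly interpolate (F1, F2) between the two nearest anchor vowels."""
    angle = angle_deg % 360.0
    lo = int(angle // 90) * 90          # anchor at or below the angle
    hi = (lo + 90) % 360                # next anchor, wrapping past 270
    t = (angle - lo) / 90.0
    f1 = (1 - t) * ANCHORS[lo][0] + t * ANCHORS[hi][0]
    f2 = (1 - t) * ANCHORS[lo][1] + t * ANCHORS[hi][1]
    return f1, f2

def trace_outline(points):
    """One use case from the abstract: render a path outline as a sequence of
    tangent directions, each converted to a vowel-like timbre."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        heading = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360.0
        yield heading, formants_for_direction(heading)

if __name__ == "__main__":
    square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
    for heading, (f1, f2) in trace_outline(square):
        print(f"heading {heading:5.1f} deg -> F1 {f1:4.0f} Hz, F2 {f2:4.0f} Hz")
```

Feeding the resulting (F1, F2) trajectory to a formant synthesizer would produce the kind of vowel-like sweep the abstract describes; the study itself used its own mapping and synthesis method, which this sketch does not reproduce.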


        Published In

        CHI '11: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
        May 2011
        3530 pages
        ISBN: 9781450302289
        DOI: 10.1145/1978942


        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Author Tags

        1. audio display
        2. navigation
        3. non-speech
        4. non-verbal
        5. radial direction
        6. sonification
        7. visual impairment

        Qualifiers

        • Research-article

        Conference

        CHI '11

        Acceptance Rates

        CHI '11 Paper Acceptance Rate: 410 of 1,532 submissions, 27%
        Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



        Cited By

        • (2022) Applying Sonification to Sketching in the Air With Mobile AR Devices. IEEE Transactions on Human-Machine Systems 52(6), 1352-1363. DOI: 10.1109/THMS.2022.3186592. Online publication date: Dec-2022.
        • (2022) Finding Objects Faster in Dense Environments Using a Projection Augmented Robotic Arm. Human-Computer Interaction – INTERACT 2015, 221-238. DOI: 10.1007/978-3-319-22698-9_15. Online publication date: 10-Mar-2022.
        • (2019) Scientific Documents. Web Accessibility, 397-415. DOI: 10.1007/978-1-4471-7440-0_22. Online publication date: 4-Jun-2019.
        • (2016) Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras. ACM Transactions on Accessible Computing 9(1), 1-38. DOI: 10.1145/2914793. Online publication date: 21-Oct-2016.
        • (2016) Using Audio Cues to Support Motion Gesture Interaction on Mobile Devices. ACM Transactions on Applied Perception 13(3), 1-19. DOI: 10.1145/2897516. Online publication date: 28-May-2016.
        • (2015) Audio-Based Feedback Techniques for Teaching Touchscreen Gestures. ACM Transactions on Accessible Computing 7(3), 1-29. DOI: 10.1145/2764917. Online publication date: 14-Nov-2015.
        • (2015) The Design and Preliminary Evaluation of a Finger-Mounted Camera and Feedback System to Enable Reading of Printed Text for the Blind. Computer Vision - ECCV 2014 Workshops, 615-631. DOI: 10.1007/978-3-319-16199-0_43. Online publication date: 20-Mar-2015.
        • (2014) A design space of guidance techniques for large and dense physical environments. Proceedings of the 26th Conference on l'Interaction Homme-Machine, 9-17. DOI: 10.1145/2670444.2670455. Online publication date: 28-Oct-2014.
        • (2013) A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities. PLoS ONE 8(12), e82491. DOI: 10.1371/journal.pone.0082491. Online publication date: 17-Dec-2013.
        • (2013) Follow that sound. Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, 1-8. DOI: 10.1145/2513383.2513455. Online publication date: 21-Oct-2013.
