DOI: 10.1145/3528575.3551449

Eye-Phonon: Wearable Sonification System based on Smartphone that Colors and Deepens the Daily Conversations for Person with Visual Impairment

Published: 28 September 2022

Abstract

For people with visual impairment, communication with others is a significant barrier. In this study, we propose Eye-Phonon, a smartphone-based sonification system that aims to enhance conversation for people with visual impairment by expressing the surrounding visual information as sound. The purpose of this study was to identify the communication barriers that people with visual impairment face, to assess whether there is a need for such a system, and to gather their impressions of our prototype and their suggestions for improving it. Semi-structured interviews provided feedback on the daily communication barriers that people with visual impairment encounter and on the functions they require. We found that Eye-Phonon provides the real-time and wearable features that people with visual impairment need, and that its sonification method is acceptable to them.
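
The abstract describes converting the surrounding visual information into sound in real time on a smartphone. As a rough illustration of the general parameter-mapping sonification idea (not the authors' actual Eye-Phonon pipeline, which the abstract does not detail), the sketch below maps the dominant hue of a camera frame, stubbed here with a synthetic image, onto the pitch of a short sine tone. The function names, pitch range, and hue-to-frequency mapping are all assumptions made for illustration only.

# Minimal parameter-mapping sonification sketch (illustrative assumption,
# not the Eye-Phonon implementation): dominant hue of a frame -> tone pitch.
import colorsys
import math
import wave

import numpy as np

SAMPLE_RATE = 16_000   # audio sample rate in Hz (assumed)
TONE_SECONDS = 0.5     # length of each generated audio cue (assumed)

def dominant_hue(frame_rgb):
    """Return the mean hue (0..1) of an RGB frame with 0..255 values."""
    r, g, b = frame_rgb.reshape(-1, 3).mean(axis=0) / 255.0
    hue, _lightness, _saturation = colorsys.rgb_to_hls(r, g, b)
    return hue

def hue_to_frequency(hue, lo=220.0, hi=880.0):
    """Map hue linearly onto an assumed two-octave pitch range (A3 to A5)."""
    return lo + hue * (hi - lo)

def write_tone(path, freq):
    """Write a mono 16-bit PCM sine tone at the given frequency."""
    n = int(SAMPLE_RATE * TONE_SECONDS)
    t = np.arange(n) / SAMPLE_RATE
    samples = (0.4 * 32767 * np.sin(2 * math.pi * freq * t)).astype(np.int16)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)            # mono
        wav.setsampwidth(2)            # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(samples.tobytes())

if __name__ == "__main__":
    # Synthetic "camera frame": a solid reddish patch standing in for live capture.
    frame = np.zeros((120, 160, 3), dtype=np.uint8)
    frame[...] = (200, 60, 40)
    freq = hue_to_frequency(dominant_hue(frame))
    write_tone("cue.wav", freq)
    print(f"dominant hue mapped to a {freq:.1f} Hz tone -> cue.wav")

In a wearable, real-time setting such as the one the abstract describes, the frame would instead come from the smartphone camera and the tone would be streamed to a headset rather than written to a file; those details are outside the scope of this sketch.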



      Published In

      MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services
      September 2022
      124 pages
      ISBN:9781450393416
      DOI:10.1145/3528575
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Assistive Technology
      2. People with Vision Impairment
      3. Sensibility Information
      4. Sonification
      5. Verbal Communication

      Qualifiers

      • Extended-abstract
      • Research
      • Refereed limited

      Conference

      MobileHCI '22

      Acceptance Rates

      Overall Acceptance Rate 202 of 906 submissions, 22%

