Research article
DOI: 10.1145/3517428.3544821

VRBubble: Enhancing Peripheral Awareness of Avatars for People with Visual Impairments in Social Virtual Reality

Published: 22 October 2022

Abstract

    Social Virtual Reality (VR) is increasingly used for remote socialization and collaboration. However, current social VR applications are not accessible to people with visual impairments (PVI) because they center on visual experiences. We aim to make social VR accessible by enhancing PVI’s peripheral awareness of surrounding avatar dynamics. We designed VRBubble, an audio-based VR technique that provides surrounding avatar information based on social distance. Following Hall’s proxemic theory, VRBubble divides the social space into three bubbles (Intimate, Conversation, and Social Bubble), generating spatial audio feedback that distinguishes avatars in different bubbles and conveys suitable avatar information. We provide three audio alternatives: earcons, verbal notifications, and real-world sound effects. PVI can select and combine their preferred feedback alternatives for different avatars, bubbles, and social contexts. We evaluated VRBubble against an audio-beacon baseline with 12 PVI in a navigation context and a conversation context. VRBubble significantly enhanced participants’ avatar awareness during navigation and enabled avatar identification in both contexts; however, it became more distracting in crowded environments.



        Published In

        ASSETS '22: Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility
        October 2022, 902 pages
        ISBN: 9781450392587
        DOI: 10.1145/3517428

        Publisher

        Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. audio feedback
        2. proxemics
        3. social virtual reality
        4. visual impairments

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        ASSETS '22

        Acceptance Rates

        ASSETS '22 paper acceptance rate: 35 of 132 submissions (27%)
        Overall acceptance rate: 436 of 1,556 submissions (28%)


        Article Metrics

        • Downloads (last 12 months): 275
        • Downloads (last 6 weeks): 39
        Reflects downloads up to 11 Aug 2024

        Cited By
        • (2024) SoundShift: Exploring Sound Manipulations for Accessible Mixed-Reality Awareness. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 116–132. https://doi.org/10.1145/3643834.3661556
        • (2024) SocialCueSwitch: Towards Customizable Accessibility by Representing Social Cues in Multiple Senses. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 1–7. https://doi.org/10.1145/3613905.3651109
        • (2024) Springboard, Roadblock or “Crutch”?: How Transgender Users Leverage Voice Changers for Gender Presentation in Social Virtual Reality. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 354–364. https://doi.org/10.1109/VR58804.2024.00057
        • (2023) A Preliminary Interview: Understanding XR Developers' Needs towards Open-Source Accessibility Support. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 493–496. https://doi.org/10.1109/VRW58643.2023.00107
        • (2023) The Design Space of the Auditory Representation of Objects and Their Behaviours in Virtual Reality for Blind People. IEEE Transactions on Visualization and Computer Graphics 29(5), 2763–2773. https://doi.org/10.1109/TVCG.2023.3247094
        • (2023) What And How Together: A Taxonomy On 30 Years Of Collaborative Human-Centered XR Tasks. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 322–335. https://doi.org/10.1109/ISMAR59233.2023.00047
        • (2023) Making Avatar Gaze Accessible for Blind and Low Vision People in Virtual Reality: Preliminary Insights. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 701–705. https://doi.org/10.1109/ISMAR-Adjunct60411.2023.00150
        • (2023) Edge Cloud Collaboration Intelligent Assistive Cane for Visually Impaired People. 2023 3rd International Conference on Smart Data Intelligence (ICSMDI), 135–139. https://doi.org/10.1109/ICSMDI57622.2023.00031
        • (2023) Proposed Avatar Model for Sustainable Smart Tourism Growth: Systematic Literature Review. 2023 10th International Conference on ICT for Smart Society (ICISS), 1–7. https://doi.org/10.1109/ICISS59129.2023.10291456
        • (2023) A scientometric analysis of research on people with visual impairments in the field of HCI design: mapping the intellectual structure and evolution. Universal Access in the Information Society. https://doi.org/10.1007/s10209-023-01067-x
