DOI: 10.1145/3334480.3383095

PneuFetch: Supporting Blind and Visually Impaired People to Fetch Nearby Objects via Light Haptic Cues

Published: 25 April 2020

Abstract

We present PneuFetch, a light haptic cue-based wearable device that supports blind and visually impaired (BVI) people in fetching nearby objects in an unfamiliar environment. In our design, we generate friendly, non-intrusive, and gentle presses and drags to deliver direction and distance cues on a BVI user's wrist and forearm. As a proof of concept, we discuss our PneuFetch wearable prototype, contrast it with past work, and describe a preliminary user study.
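The abstract encodes direction and distance as presses and drags on the wrist and forearm. The sketch below is a minimal illustration of one way such an encoding could work; it is not the authors' implementation. It assumes a hypothetical four-chamber band, a quadrant mapping from bearing to chamber, and a linear distance-to-duration rule, none of which come from the paper.

```python
# A minimal, hypothetical sketch: encode a target's bearing and distance
# as press/drag cues on a four-chamber pneumatic wristband. The Cue type,
# chamber layout, and timing constants are all invented for illustration.

from dataclasses import dataclass
import time


@dataclass
class Cue:
    chamber: int       # which air chamber to inflate (0-3 around the wrist)
    pattern: str       # "press" for nearby targets, "drag" for farther ones
    duration_s: float  # longer actuation could signal a larger distance


def encode(bearing_deg: float, distance_m: float) -> Cue:
    """Map bearing to one of four quadrant chambers, and distance to a
    capped, linearly scaled actuation time (assumed, not measured)."""
    chamber = int(((bearing_deg % 360) + 45) // 90) % 4
    duration_s = min(2.0, 0.5 + 0.3 * distance_m)
    pattern = "press" if distance_m < 0.5 else "drag"
    return Cue(chamber, pattern, duration_s)


def actuate(cue: Cue) -> None:
    """Stand-in for valve control: a real device would open and close
    solenoid valves here; this stub just prints and waits."""
    print(f"chamber {cue.chamber}: {cue.pattern} for {cue.duration_s:.1f} s")
    time.sleep(cue.duration_s)


if __name__ == "__main__":
    # Target roughly one meter away, to the user's right.
    actuate(encode(bearing_deg=90.0, distance_m=1.2))
```

Any real mapping from cue parameters to pneumatic actuation would need to be tuned with BVI users, as the preliminary study described in the abstract suggests.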




      Published In

      CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
      April 2020
      4474 pages
      ISBN:9781450368193
      DOI:10.1145/3334480


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. accessibility
      2. blind and visually impaired people
      3. fabrication
      4. object fetching
      5. pneumatic system
      6. touch

      Qualifiers

      • Abstract

      Conference

      CHI '20

      Acceptance Rates

      Overall Acceptance Rate 6,164 of 23,696 submissions, 26%



      Cited By

• (2024) A Mobile Lens: Voice-Assisted Smartphone Solutions for the Sightless to Assist Indoor Object Identification. EAI Endorsed Transactions on Internet of Things 10. https://doi.org/10.4108/eetiot.6450. Online publication date: 28-Jun-2024
• (2024) WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1-32. https://doi.org/10.1145/3659592. Online publication date: 15-May-2024
• (2024) MindShift: Leveraging Large Language Models for Mental-States-Based Problematic Smartphone Use Intervention. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-24. https://doi.org/10.1145/3613904.3642790. Online publication date: 11-May-2024
• (2023) Understanding Challenges and Opportunities in Body Movement Education of People who are Blind or have Low Vision. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-19. https://doi.org/10.1145/3597638.3608409. Online publication date: 22-Oct-2023
• (2023) Understanding Curators' Practices and Challenge of Making Exhibitions More Accessible for People with Visual Impairments. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-18. https://doi.org/10.1145/3597638.3608384. Online publication date: 22-Oct-2023
• (2023) A Miniature Direct-Drive Hydraulic Actuator for Wearable Haptic Devices based on Ferrofluid Magnetohydrodynamic Levitation. 2023 IEEE World Haptics Conference (WHC), 293-298. https://doi.org/10.1109/WHC56415.2023.10224414. Online publication date: 10-Jul-2023
• (2022) EmotiTactor: Exploring How Designers Approach Emotional Robotic Touch. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 1330-1344. https://doi.org/10.1145/3532106.3533487. Online publication date: 13-Jun-2022
• (2022) DragTapVib: An On-Skin Electromagnetic Drag, Tap, and Vibration Actuator for Wearable Computing. Proceedings of the Augmented Humans International Conference 2022, 203-211. https://doi.org/10.1145/3519391.3519395. Online publication date: 13-Mar-2022
• (2022) Demonstrating Interaction: The Case of Assistive Technology. ACM Transactions on Computer-Human Interaction 29(5), 1-37. https://doi.org/10.1145/3514236. Online publication date: 20-Oct-2022
• (2022) "I Shake The Package To Check If It's Mine". Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-15. https://doi.org/10.1145/3491102.3502063. Online publication date: 29-Apr-2022
