FingerSight: A Vibrotactile Wearable Ring for Assistance With Locating and Reaching Objects in Peripersonal Space

Published: 01 April 2020

Abstract

This paper describes a prototype guidance system, "FingerSight," to help people without vision locate and reach objects in peripersonal space. It consists of four evenly spaced tactors embedded in a ring worn on the index finger, with a small camera mounted on top. Computer-vision analysis of the camera image controls vibrotactile feedback, guiding the user's hand toward nearby targets. Two experiments tested the functionality of the prototype. The first found that participants could discriminate among five vibrotactile stimulus sites (each of the four tactors individually and all four simultaneously) with a mean accuracy of 88.8% after initial training. In the second experiment, blindfolded participants were instructed to move the hand wearing the device to one of four locations within arm's reach while their hand trajectories were tracked. The tactors were controlled using two different strategies: (1) repeatedly signal the axis with the largest error, or (2) signal both axes in alternation. Participants produced essentially straight-line trajectories toward the target under both strategies, but the temporal parameters (rate of approach, duration) showed an advantage for alternating corrections between the two axes.
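Read as a control loop, the two tactor-control strategies reduce to a simple decision rule over the horizontal and vertical components of the target offset in the camera image. The Python sketch below illustrates that rule; the up/down/left/right tactor layout, the pixel tolerance, and the all-tactors "on target" cue are assumptions made for illustration, not details taken from the paper.

```python
from enum import Enum

class Tactor(Enum):
    """One cue per tactor on the ring, plus an assumed all-tactors cue."""
    UP = 0
    DOWN = 1
    LEFT = 2
    RIGHT = 3
    ALL = 4  # assumed signal for "hand is on target"

ON_TARGET_PX = 10  # assumed pixel tolerance around the target


def largest_error_strategy(dx: float, dy: float) -> Tactor:
    """Strategy 1: always signal the axis with the largest remaining error.

    dx, dy are the target position minus the hand position in image
    coordinates (pixels), recomputed from the camera on every cycle.
    """
    if max(abs(dx), abs(dy)) < ON_TARGET_PX:
        return Tactor.ALL
    if abs(dx) >= abs(dy):
        return Tactor.RIGHT if dx > 0 else Tactor.LEFT
    return Tactor.UP if dy > 0 else Tactor.DOWN


def alternating_strategy(dx: float, dy: float, step: int) -> Tactor:
    """Strategy 2: correct the horizontal and vertical axes in alternation."""
    if max(abs(dx), abs(dy)) < ON_TARGET_PX:
        return Tactor.ALL
    if step % 2 == 0:
        return Tactor.RIGHT if dx > 0 else Tactor.LEFT
    return Tactor.UP if dy > 0 else Tactor.DOWN
```

Under strategy 1 the dominant axis is re-evaluated on every cycle, so the active tactor can switch whenever the error ranking flips; strategy 2 guarantees that both axes receive a correction on every pair of cycles, which matches the temporal advantage the second experiment reports for alternation.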

Published In

IEEE Transactions on Haptics, Volume 13, Issue 2 (April-June 2020), 189 pages
Publisher: IEEE Computer Society Press, Washington, DC, United States
