REVEL: tactile feedback technology for augmented reality

Published: 01 July 2012
Abstract

    REVEL is an augmented reality (AR) tactile technology that allows the tactile feeling of real objects to be changed by augmenting them with virtual tactile textures, using a device worn by the user. Unlike previous attempts to enhance AR environments with haptics, we neither physically actuate objects nor use any force- or tactile-feedback devices, nor require users to wear tactile gloves or other apparatus on their hands. Instead, we employ the principle of reverse electrovibration: we inject a weak electrical signal anywhere on the user's body, creating an oscillating electrical field around the user's fingers. When sliding his or her fingers over the surface of the object, the user perceives highly distinctive tactile textures that augment the physical object. By tracking the objects and the location of the touch, we associate dynamic tactile sensations with the interaction context. REVEL builds upon our previous work on designing electrovibration-based tactile feedback for touch surfaces [Bau et al. 2010]. In this paper we expand tactile interfaces based on electrovibration beyond touch surfaces and bring them into the real world. We demonstrate a broad range of application scenarios where our technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
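
    The abstract outlines a simple control loop: track which object is being touched and where, look up the virtual texture assigned to that region, and drive the body-coupled signal accordingly. The sketch below illustrates one way such a mapping could be organized; the SignalGenerator interface, the texture regions, and the frequency/amplitude values are illustrative assumptions, not details taken from the paper.

        # Hypothetical sketch: map a tracked object and touch location to waveform
        # parameters for a body-coupled signal generator. All names and numbers here
        # are illustrative assumptions, not the REVEL implementation.
        from dataclasses import dataclass
        from typing import Callable, Optional

        @dataclass
        class Texture:
            frequency_hz: float  # oscillation frequency of the injected signal
            amplitude_v: float   # peak amplitude of the weak electrical signal

        # Per-object texture maps: object id -> list of (region predicate, texture).
        TextureRegion = tuple[Callable[[float, float], bool], Texture]
        TEXTURE_MAP: dict[str, list[TextureRegion]] = {
            "teapot": [
                (lambda u, v: v > 0.7, Texture(frequency_hz=80.0, amplitude_v=40.0)),   # lid: coarse
                (lambda u, v: True,    Texture(frequency_hz=240.0, amplitude_v=25.0)),  # body: fine
            ],
        }

        def texture_for_touch(object_id: str, u: float, v: float) -> Optional[Texture]:
            """Return the virtual texture for a touch at surface coordinates (u, v)."""
            for region_matches, texture in TEXTURE_MAP.get(object_id, []):
                if region_matches(u, v):
                    return texture
            return None  # untracked object or unmapped region: no augmentation

        class SignalGenerator:
            """Stand-in for the hardware that injects the weak signal into the user's body."""
            def set_waveform(self, frequency_hz: float, amplitude_v: float) -> None:
                print(f"waveform: {frequency_hz} Hz at {amplitude_v} V")
            def mute(self) -> None:
                print("waveform: off")

        def on_touch_event(gen: SignalGenerator, object_id: str, u: float, v: float) -> None:
            """Called by the tracking system whenever the finger contact updates."""
            texture = texture_for_touch(object_id, u, v)
            if texture is None:
                gen.mute()
            else:
                gen.set_waveform(texture.frequency_hz, texture.amplitude_v)

        # Example: a finger sliding from the body of a tracked teapot onto its lid.
        gen = SignalGenerator()
        on_touch_event(gen, "teapot", u=0.3, v=0.4)  # fine texture on the body
        on_touch_event(gen, "teapot", u=0.3, v=0.9)  # coarse texture on the lid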

    Supplementary Material

    MP4 File (tp187_12.mp4)

    References

    [1]
    Amberg, M., Giraud, F., Semail, B., Olivo, P., Casiez, G. and Roussel, N. 2011. STIMTAC: a tactile input device with programmable friction. In Proc. of UIST'11, ACM, 7--8.
    [2]
    Azuma, R. 1997. A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments 6, 355--385.
    [3]
    Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S. and Macintyre, B. 2001. Recent Advances in Augmented Reality. IEEE Comput. Graph. Appl. 21, 34--47.
    [4]
    Bau, O., Petrevski, U. and Mackay, W. 2009. BubbleWrap: a textile-based electromagnetic haptic display. In Proc. of CHI EA'09, ACM, 3607--3612.
    [5]
    Bau, O., Poupyrev, I., Israr, A. and Harrison, C. 2010. TeslaTouch: electrovibration for touch surfaces. In Proc. of UIST'10, ACM, 283--292.
    [6]
    Benko, H., Wilson, A., Balakrishnan, R. and Chen, B. 2008. Sphere: multi-touch interactions on a spherical display. In Proc. of UIST'08, ACM, 77--86.
    [7]
    Bianchi, G., Knoerlein, B., Szekely, G. and Harders, M. 2006. High precision augmented reality haptics. In Proc. of EuroHaptics'06, 169--178.
    [8]
    Burdea, G. C. 1996. Force and touch feedback for virtual reality. John Wiley & Sons.
    [9]
    Carlin, A., Hoffman, H. and Weghorst, S. 1997. Virtual reality and tactile augmentation in the treatment of spider phobia: a case report. Behavior Research and Therapy 35, 153--159.
    [10]
    Fitzmaurice, G., Ishii, H. and Buxton, W. 1995. Bricks: Laying the foundations for graspable user interfaces. In Proc. of CHI'95, ACM, 442--449.
    [11]
    Grimnes, S. 1983. Dielectric breakdown of human skin in vivo. Medical and Biological Engineering and Computing 21, 379--381.
    [12]
    Grimnes, S. 1983. Electrovibration, cutaneous sensation of microampere current. Acta Physiologica Scandinavica 118, 19--25.
    [13]
    Harrison, C., Benko, H. and Wilson, A. 2011. OmniTouch: Wearable Multitouch Interaction Everywhere. In Proc. of UIST'11, ACM, 441--450.
    [14]
    Huang, K., Starner, T., Do, E., Weinberg, G., Kohlsdorf, D., Ahlrichs, C. and Leibrandt, R. 2010. Mobile music touch: mobile tactile stimulation for passive learning. In Proc. of CHI'10, ACM, 791--800.
    [15]
    Israr, A. and Poupyrev, I. 2011. Tactile brush: Drawing on skin with a tactile grid display. In Proc. of CHI'11, ACM, 2019--2028.
    [16]
    Iwata, H., Yano, H., Nakaizumi, F. and Kawamura, R. 2001. Project FEELEX: adding haptic surface to graphics. In Proc. of SIGGRAPH'01, ACM, 469--476.
    [17]
    Jeon, S. and Choi, S. 2009. Haptic Augmented Reality: Taxonomy and an Example of Stiffness Modulation. Presence: Teleoperators and Virtual Environments 18, 387--408.
    [18]
    Kaczmarek, K., Nammi, K., Agarwal, A., Tyler, M., Haase, S. and Beebe, D. 2006. Polarity effect in electrovibration for tactile display. IEEE Transactions on Biomedical Engineering 53, 2047--2054.
    [19]
    Kajimoto, H. 2010. Electro-tactile display with real-time impedance feedback. In Proc. of Haptics Symposium, Springer-Verlag, 285--291.
    [20]
    Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K. and Tachibana, K. 2000. Virtual Object Manipulation on a Table-Top AR Environment. In Proc. of International Symposium on Augmented Reality, ACM, 111--119.
    [21]
    Knoerlein, B., Szekely, G. and Harders, M. 2007. Visuo-haptic collaborative augmented reality ping-pong. In Proc. of ACET'07, ACM, 91--94.
    [22]
    Kron, A. and Schmidt, G. 2003. Multi-Fingered Tactile Feedback from Virtual and Remote Environments. In Proc. of HAPTICS'03, IEEE, 16.
    [23]
    Kruijff, E., Schmalstieg, D. and Beckhaus, S. 2006. Using Neuromuscular Electrical Stimulation for Pseudo-Haptic Feedback. In Proc. of VRST'06, ACM, 316--319.
    [24]
    Mallinckrodt, E., Hughes, A. and Sleator, W. 1953. Perception by the Skin of Electrically Induced Vibrations. Science 118, 277--278.
    [25]
    Matsushita, N. and Rekimoto, J. 1997. HoloWall: designing a finger, hand, body, and object sensitive wall. In Proc. of UIST'97, ACM, 209--210.
    [26]
    Microsoft. 2010. Microsoft Surface 2.0.
    [27]
    Minsky, M., Ming, O.-Y., Steele, O., Brooks, F. P., Jr. and Behensky, M. 1990. Feeling and seeing: issues in force display. In Proc. of SIGGRAPH'90, 235--241.
    [28]
    Niwa, M., Nozaki, T., Maeda, T. and Ando, H. 2010. Fingernail-Mounted Display of Attraction Force and Texture. In Proc. of EuroHaptics'10, Springer-Verlag, 3--8.
    [29]
    Nojima, T., Sekiguchi, D., Inami, M. and Tachi, S. 2002. The SmartTool: A system for Augmented Reality of Haptics. In Proc. of VR'02, IEEE, 67--72.
    [30]
    Poupyrev, I., Tan, D., et al. 2002. Developing a generic augmented-reality interface. IEEE Computer 35, 44--49.
    [31]
    Poupyrev, I. and Maruyama, S. 2003. Tactile interfaces for small touch screens. In Proc. of UIST'03, ACM, 217--220.
    [32]
    Poupyrev, I., Nashida, T. and Okabe, M. 2007. Actuation and Tangible User Interfaces: the Vaucanson Duck, Robots, and Shape Displays. In Proc. of TEI'07, ACM, 205--212.
    [33]
    Rekimoto, J. 2009. SenseableRays: Opto-Haptic Substitution for Touch-Enhanced Interactive Spaces. In Proc. of CHI EA'09, ACM, 2519--2528.
    [34]
    Rekimoto, J. and Saitoh, M. 1999. Augmented surfaces: a spatially continuous work space for hybrid computing environments. In Proc. of CHI'99, ACM, 378--385.
    [35]
    Ryu, J. and Kim, G. 2004. Using a Vibro-tactile Display for Enhanced Collision Perception and Presence. In Proc. of VRST'04, ACM, 89--96.
    [36]
    Schmalstieg, D., Fuhrmann, A. and Hesina, G. 2000. Bridging multiple user interface dimensions with augmented reality. In Proc. of ISAR'00, IEEE, 20--29.
    [37]
    Strong, R. M. and Troxel, D. E. 1970. An electrotactile display. IEEE Transactions on Man-Machine Systems 11, 72--79.
    [38]
    Takeuchi, Y. 2010. Gilded gait: reshaping the urban experience with augmented footsteps. In Proc. of UIST'10, ACM, 185--188.
    [39]
    Tamaki, E., Miyaki, T. and Rekimoto, J. 2011. PossessedHand: techniques for controlling human hands using electrical muscles stimuli. In Proc. of CHI'11, ACM, 543--552.
    [40]
    Tan, H. and Pentland, A. 1997. Tactual displays for wearable computing. In Proc. of ISWC'97, IEEE, 84--89.
    [41]
    Tang, H. and Beebe, D. 1998. A microfabricated electrostatic haptic display for persons with visual impairments. IEEE Transactions on Rehabilitation Engineering 6, 241--248.
    [42]
    Tsetserukou, D., Sato, K. and Tachi, S. 2010. ExoInterfaces: Novel Exoskeleton Haptic Interfaces for Virtual Reality, Augmented Sport and Rehabilitation. In Proc. of Augmented Human'10, ACM, 1--6.
    [43]
    Ullmer, B. and Ishii, H. 1997. The metaDESK: models and prototypes for tangible user interfaces. In Proc. of UIST'97, ACM, 223--232.
    [44]
    Vallino, J. and Brown, C. 1999. Haptics in augmented reality. In Proc. of Multimedia Computing and Systems'99, IEEE, 195--200.
    [45]
    Webster, J. 1998. Medical instrumentation: Application and design. Wiley, 173.
    [46]
    Willis, K. D. D., Poupyrev, I., Hudson, S. E. and Mahler, M. 2011. SideBySide: ad-hoc multi-user interaction with handheld projectors. In Proc. of UIST'11, ACM, 431--440.
    [47]
    Wilson, A. D. 2010. Using a depth camera as a touch sensor. In Proc. of ITS'10, ACM, 69--72.
    [48]
    Wilson, A. D. and Benko, H. 2010. Combining multiple depth cameras and projectors for interactions on, above and between surfaces. In Proc. of UIST'10, ACM, 273--282.
    [49]
    Woodward, C., Honkamaa, P., Jäppinen, J. and Pyykkimies, E.-P. 2004. CamBall - augmented virtual table tennis with real rackets. In Proc. of ACET'04, ACM, 275--276.

    Cited By

    • (2024) Interaction-Power Stations: Turning Environments into Ubiquitous Power Stations for Charging Wearables. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613905.3650769, 1-8. Online publication date: 11-May-2024
    • (2024) Stick&Slip: Altering Fingerpad Friction via Liquid Coatings. Proceedings of the CHI Conference on Human Factors in Computing Systems, 10.1145/3613904.3642299, 1-14. Online publication date: 11-May-2024
    • (2024) Augmenting the feel of real objects. International Journal of Human-Computer Studies, 10.1016/j.ijhcs.2024.103244, 185:C. Online publication date: 1-May-2024
    • (2024) The user experience of distal arm-level vibrotactile feedback for interactions with virtual versus physical displays. Virtual Reality, 10.1007/s10055-024-00977-2, 28:2. Online publication date: 22-Mar-2024
    • (2023) HaptoMapping: Visuo-Haptic Augmented Reality by Embedding User-Imperceptible Tactile Display Control Signals in a Projected Image. IEEE Transactions on Visualization and Computer Graphics, 10.1109/TVCG.2021.3136214, 29:4, 2005-2019. Online publication date: 1-Apr-2023
    • (2023) Unobtrusive interaction: a systematic literature review and expert survey. Human–Computer Interaction, 10.1080/07370024.2022.2162404, 1-37. Online publication date: 1-Feb-2023
    • (2023) Investigating the minimum perceived linewidth of electroadhesion devices. Displays, 10.1016/j.displa.2022.102342, 76, 102342. Online publication date: Jan-2023
    • (2022) HaptiDrag. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 10.1145/3550310, 6:3, 1-26. Online publication date: 7-Sep-2022
    • (2022) Estimating the Just Noticeable Difference of Tactile Feedback in Oculus Quest 2 Controllers. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 10.1109/ISMAR55827.2022.00013, 1-7. Online publication date: Oct-2022
    • (2022) Multi-Sensory HMI for Human-Centric Industrial Digital Twins: A 6G Vision of Future Industry. 2022 IEEE Symposium on Computers and Communications (ISCC), 10.1109/ISCC55528.2022.9912932, 1-7. Online publication date: 30-Jun-2022


    Reviews

    Angelica de Antonio

    This paper presents a new technology for tactile feedback in augmented reality (AR) applications. It is based on reverse electrovibration, which provokes a tactile sensation that is felt by users when they slide their fingers on a certain surface. Unlike traditional approaches to AR tactile displays, this solution does not rely on instrumenting real-world objects with active devices but on instrumenting the user's body. It allows for the application of virtual tactile textures to both virtual and real objects and surfaces. This technology opens up the potential for truly ubiquitous tactile interfaces that can be used almost anywhere, whenever the target objects and surfaces meet some compatibility constraints. Several ways to achieve the required compatibility are described. The paper presents the design of the REVEL tactile display and its underlying physical principle, and describes several application scenarios, some already implemented and others of a more futuristic nature. The authors do a good job of comparing their proposal to the currently existing alternatives. They help readers understand the potential for this solution, without neglecting the limitations of the approach.

    Online Computing Reviews Service


    Published In

    ACM Transactions on Graphics  Volume 31, Issue 4
    July 2012
    935 pages
    ISSN:0730-0301
    EISSN:1557-7368
    DOI:10.1145/2185520

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 July 2012
    Published in TOG Volume 31, Issue 4


    Author Tags

    1. augmented reality
    2. augmented surfaces
    3. haptics
    4. tactile displays
    5. tangible interfaces
    6. touch interaction

    Qualifiers

    • Research-article
