DOI: 10.1145/2858036.2858161

Direct Manipulation in Tactile Displays

Published: 07 May 2016

Abstract

Tactile displays have predominantly been used for information transfer using patterns or as assistive feedback for interactions. With recent advances in hardware for conveying increasingly rich tactile information that mirrors visual information, and the increasing viability of wearables that remain in constant contact with the skin, there is a compelling argument for exploring tactile interactions that are as rich as those of visual displays. Direct manipulation underlies many of the advances in visual interaction. In this work, we introduce the concept of a Direct Manipulation-enabled Tactile display (DMT). We define the concepts of a tactile screen, tactile pixel, tactile pointer, and tactile target, which enable tactile pointing, selection, and drag & drop. We build a proof-of-concept tactile display and study its precision limits. We further develop a performance model for DMTs based on a tactile target acquisition study. Finally, we study user performance in a real-world DMT menu application. The results show that users are able to use the application with relative ease and speed.

Supplementary Material

• ZIP File (pn0727-file4.zip)
• Supplemental video (pn0727-file3.mp4)
• MP4 File (p3683-gupta.mp4)



    Published In

    CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
    May 2016
    6108 pages
    ISBN:9781450333627
    DOI:10.1145/2858036
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Badges

    • Honorable Mention

    Author Tags

    1. direct manipulation
    2. tactile displays
    3. wearables
    4. wrist

    Qualifiers

    • Research-article

    Funding Sources

    • NSERC Canada
    • Project Happiness

    Conference

    CHI '16: CHI Conference on Human Factors in Computing Systems
    May 7 - 12, 2016
    San Jose, California, USA

    Acceptance Rates

    CHI '16 Paper Acceptance Rate: 565 of 2,435 submissions, 23%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


    Article Metrics

    • Downloads (Last 12 months): 56
    • Downloads (Last 6 weeks): 6
    Reflects downloads up to 16 Feb 2025

    Cited By
    • (2023) DrivingVibe: Enhancing VR Driving Experience using Inertia-based Vibrotactile Feedback around the Head. Proceedings of the ACM on Human-Computer Interaction 7 (MHCI), 1-22. DOI: 10.1145/3604253. Online publication date: 13-Sep-2023
    • (2023) Gaze Speedup: Eye Gaze Assisted Gesture Typing in Virtual Reality. Proceedings of the 28th International Conference on Intelligent User Interfaces, 595-606. DOI: 10.1145/3581641.3584072. Online publication date: 27-Mar-2023
    • (2023) Investigating Eyes-away Mid-air Typing in Virtual Reality using Squeeze haptics-based Postural Reinforcement. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3544548.3581467. Online publication date: 19-Apr-2023
    • (2023) MultiVibes: What if your VR Controller had 10 Times more Vibrotactile Actuators? 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 703-712. DOI: 10.1109/ISMAR59233.2023.00085. Online publication date: 16-Oct-2023
    • (2022) Eyes-Off Your Fingers: Gradual Surface Haptic Feedback Improves Eyes-Free Touchscreen Interaction. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-10. DOI: 10.1145/3491102.3501872. Online publication date: 29-Apr-2022
    • (2022) Design, Control, and Psychophysics of Tasbi: A Force-Controlled Multimodal Haptic Bracelet. IEEE Transactions on Robotics 38 (5), 2962-2978. DOI: 10.1109/TRO.2022.3164840. Online publication date: Oct-2022
    • (2022) HapticProxy: Providing Positional Vibrotactile Feedback on a Physical Proxy for Virtual-Real Interaction in Augmented Reality. International Journal of Human–Computer Interaction 39 (3), 449-463. DOI: 10.1080/10447318.2022.2041895. Online publication date: 18-Apr-2022
    • (2021) Studying the Role of Haptic Feedback on Virtual Embodiment in a Drawing Task. Frontiers in Virtual Reality 1. DOI: 10.3389/frvir.2020.573167. Online publication date: 18-Jan-2021
    • (2021) OM: A Comprehensive Tool to Elicit Subjective Vibrotactile Expressions Associated with Contextualised Meaning in Our Everyday Lives. Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction, 1-12. DOI: 10.1145/3447526.3472028. Online publication date: 27-Sep-2021
    • (2021) Potential of Wrist-worn Vibrotactile Feedback to Enhance the Perception of Virtual Objects during Mid-air Gestures. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3411763.3451655. Online publication date: 8-May-2021
