Research Article · Open Access

Composite Line Designs and Accuracy Measurements for Tactile Line Tracing on Touch Surfaces

Published: 05 November 2021

Abstract

Eyes-free operation of mobile devices is critical in situations where the visual channel is unavailable or attention is needed elsewhere. In such situations, vibrotactile tracing along paths or lines can help users navigate and identify symbols and shapes without visual information. In this paper, we investigate the applicability of different metrics for measuring the effectiveness of vibrotactile line tracing methods on touch screens. In two user studies, we compare trace Length Error, Area Error, and Fréchet Distance as alternatives to the commonly used trace Time. Our results show that a lower Fréchet Distance correlates better with comprehension of a line trace. Furthermore, we show that distinct feedback methods perform differently across lines with varying geometric features, and we propose a segmented line design for tactile line tracing studies. We believe these results will inform future designs of eyes-free operation techniques and studies.
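
The abstract compares trace Time against three geometric accuracy metrics: Length Error, Area Error, and Fréchet Distance. As a rough illustration of the last of these, the sketch below computes the discrete Fréchet distance between a recorded finger trace and a target polyline using the standard Eiter-Mannila recurrence; the function name and sample coordinates are illustrative only and are not taken from the paper's implementation.

# Minimal sketch (assumed helper, not the authors' code) of the discrete
# Fréchet distance between a recorded touch trace and a target polyline,
# each given as a list of (x, y) points.
from functools import lru_cache
from math import dist

def discrete_frechet(trace, target):
    @lru_cache(maxsize=None)
    def c(i, j):
        d = dist(trace[i], target[j])  # pointwise Euclidean distance
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        # Advance along either or both curves, keeping the cheapest coupling.
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    # For long traces, an iterative dynamic-programming table avoids
    # Python's recursion limit; in practice both curves would also be
    # resampled to comparable point densities before comparison.
    return c(len(trace) - 1, len(target) - 1)

# Hypothetical example: a slightly wobbly trace of a straight horizontal line.
trace = [(0, 0), (1, 0.2), (2, -0.1), (3, 0)]
target = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(discrete_frechet(trace, target))  # -> 0.2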

Supplementary Material

MP4 File (v5iss491svf.mp4)
Supplemental video

Cited By

  • (2024) IrOnTex: Using Ironable 3D Printed Objects to Fabricate and Prototype Customizable Interactive Textiles. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1--26. https://doi.org/10.1145/3678543. Online publication date: 9-Sep-2024.
  • (2024) Fabricating Customizable 3-D Printed Pressure Sensors by Tuning Infill Characteristics. IEEE Sensors Journal 24(6), 7604--7613. https://doi.org/10.1109/JSEN.2024.3358330. Online publication date: 15-Mar-2024.

      Information

      Published In

      Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue ISS
      November 2021, 481 pages
      EISSN: 2573-0142
      DOI: 10.1145/3498314
      This work is licensed under a Creative Commons Attribution 4.0 International License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 05 November 2021
      Published in PACMHCI Volume 5, Issue ISS

      Author Tags

      1. haptics
      2. metrics
      3. tactile line tracing
      4. touch screens

      Qualifiers

      • Research-article

      Funding Sources

      • Australian Research Council Discovery Early Career Researcher Award

      Article Metrics

      • Downloads (last 12 months): 128
      • Downloads (last 6 weeks): 19
      Reflects downloads up to 16 Jan 2025
