DOI: 10.1145/3340555.3353723
Research article · ICMI-MLMI Conference Proceedings

Understanding the Attention Demand of Touch and Tangible Interaction on a Composite Task

Published: 14 October 2019

Abstract

Bimanual input is frequently used in touch and tangible interaction on tabletop surfaces. In a composite task, such as moving a set of objects, attention, decision making, and fine motor control must be coordinated across the two hands. Attention demand is an important factor in designing interaction techniques that are easy to learn and recall. Thus, determining which interaction modality demands less attention, and which performs better under these conditions, is important for improving design. In this work, we present the first empirical results on this question. We report that users are consistent in their assessments of the attention demand of both touch and tangible modalities, across different hand synchronicities and different population sizes and densities. Our findings indicate that the one-hand condition and small populations demand less attention than, respectively, the two-hands conditions and larger populations. We also show that the tangible modality significantly reduces attention demand when using two-hand synchronous movements or when moving sparse populations, and decreases movement time relative to the touch modality without compromising traveled distance. We use our findings to outline a set of guidelines to assist touch and tangible design.
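
The claim that users are consistent in their attention-demand assessments is, at heart, a rank-agreement question. As a minimal illustration, assuming each participant ranks the experimental conditions by perceived attention demand, the Python sketch below computes Kendall's coefficient of concordance W (0 = no agreement, 1 = perfect agreement). The data, function name, and condition labels are hypothetical, and this is not necessarily the statistic the paper reports.

    import numpy as np
    from scipy.stats import rankdata

    def kendalls_w(scores):
        """Kendall's W for an (m raters x n items) score matrix."""
        m, n = scores.shape
        ranks = np.apply_along_axis(rankdata, 1, scores)  # each rater's scores -> ranks 1..n
        rank_sums = ranks.sum(axis=0)                     # per-item rank totals across raters
        s = ((rank_sums - rank_sums.mean()) ** 2).sum()   # spread of rank totals over items
        return 12.0 * s / (m ** 2 * (n ** 3 - n))         # classic, tie-uncorrected form

    # Hypothetical data: 8 participants rank 4 conditions (e.g., touch/tangible
    # x one-hand/two-hands) by perceived attention demand; higher = more demanding.
    demo = np.array([[1, 2, 3, 4],
                     [1, 3, 2, 4],
                     [2, 1, 3, 4],
                     [1, 2, 4, 3],
                     [1, 2, 3, 4],
                     [2, 1, 4, 3],
                     [1, 3, 2, 4],
                     [1, 2, 3, 4]])
    print(f"W = {kendalls_w(demo):.2f}")  # values near 1 indicate consistent assessments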

Cited By

  • (2023) Using RFID in the Engineering of Interactive Software Systems: A Systematic Mapping. Proceedings of the ACM on Human-Computer Interaction 7(EICS), 1–37. https://doi.org/10.1145/3593235. Online publication date: 19-Jun-2023.
  • (2023) The effect of hands synchronicity on users' perceived arms fatigue in virtual reality environment. International Journal of Human-Computer Studies 178(C). https://doi.org/10.1016/j.ijhcs.2023.103092. Online publication date: 1-Oct-2023.
  • (2021) RFID-based tangible and touch tabletop for dual reality in crisis management context. Journal on Multimodal User Interfaces 16(1), 31–53. https://doi.org/10.1007/s12193-021-00370-2. Online publication date: 19-Mar-2021.

Published In

ICMI '19: 2019 International Conference on Multimodal Interaction
October 2019
601 pages
ISBN: 9781450368605
DOI: 10.1145/3340555
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Attention demand
  2. composite task
  3. hands synchronicity
  4. population
  5. set of objects
  6. tangible
  7. touch
  8. user performances

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICMI '19

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%
