DOI: 10.1145/3131277.3132185

GestureDrawer: one-handed interaction technique for spatial user-defined imaginary interfaces

Published: 16 October 2017

Abstract

Existing empty-handed mid-air interaction techniques for system control are typically limited to a confined gesture set or to point-and-select on graphical user interfaces. In this paper, we introduce GestureDrawer, a one-handed interaction technique for 3D imaginary interfaces. Our approach allows users to self-define an imaginary interface, acquire visuospatial memory of the positions of its controls in empty space, and then select or manipulate those controls by moving their hand in all three dimensions. We evaluate the approach in three user studies and show that users can indeed position imaginary controls in 3D empty space and select them with 93% accuracy, without receiving any feedback and without fixed landmarks (e.g., the second hand). We further show that imaginary interaction is generally faster than mid-air interaction with graphical user interfaces, and that users can retrieve the positions of their imaginary controls even after a proprioception disturbance. We condense our findings into several design recommendations and present automotive applications.
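
The selection step described above can be pictured as nearest-neighbour matching of the tracked hand position against the control positions the user defined earlier. The sketch below is only an illustrative model under that assumption, not the authors' implementation: the class, the method names, and the selection radius are hypothetical, and it presumes an external hand tracker that already supplies 3D hand positions.

```python
# Illustrative sketch only -- NOT the GestureDrawer implementation.
# Assumption: an imaginary interface is modeled as a set of user-defined 3D
# control positions, and selection picks the nearest control within a radius.
import math

Point3 = tuple[float, float, float]


class ImaginaryInterface:
    """Stores user-defined control positions in empty 3D space."""

    def __init__(self, selection_radius: float = 0.08):
        # selection_radius (metres) is a hypothetical tolerance, not a value
        # reported in the paper.
        self.controls: dict[str, Point3] = {}
        self.selection_radius = selection_radius

    def define_control(self, name: str, hand_position: Point3) -> None:
        # The user "draws" a control by placing their hand; we store the point.
        self.controls[name] = hand_position

    def select(self, hand_position: Point3) -> str | None:
        # Return the nearest control if the hand is inside its tolerance sphere,
        # otherwise report no selection.
        if not self.controls:
            return None
        name, pos = min(self.controls.items(),
                        key=lambda kv: math.dist(kv[1], hand_position))
        if math.dist(pos, hand_position) <= self.selection_radius:
            return name
        return None


# Usage example: define three controls in front of the user, then select one
# by moving the hand close to its remembered position.
ui = ImaginaryInterface()
ui.define_control("volume", (0.00, 0.0, 0.3))
ui.define_control("temperature", (0.15, 0.0, 0.3))
ui.define_control("fan", (0.30, 0.0, 0.3))
print(ui.select((0.14, 0.01, 0.31)))  # -> "temperature"
print(ui.select((1.00, 1.00, 1.00)))  # -> None (outside every tolerance sphere)
```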

Supplementary Material

M4V File (p128-babic.m4v)
MP4 File (p128-babic_presentation.mp4)



Published In

SUI '17: Proceedings of the 5th Symposium on Spatial User Interaction
October 2017
167 pages
ISBN:9781450354868
DOI:10.1145/3131277
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. imaginary interfaces
  2. interaction technique: gestures
  3. screen-less
  4. spatial user interfaces
  5. ubiquitous computing

Qualifiers

  • Research-article

Conference

SUI '17: Symposium on Spatial User Interaction
October 16 - 17, 2017
Brighton, United Kingdom

Acceptance Rates

Overall Acceptance Rate 86 of 279 submissions, 31%

Article Metrics

  • Downloads (Last 12 months): 14
  • Downloads (Last 6 weeks): 3
Reflects downloads up to 15 Oct 2024

Cited By

  • (2024) Sonically-enhanced in-vehicle air gesture interactions: evaluation of different spearcon compression rates. Journal on Multimodal User Interfaces. DOI: 10.1007/s12193-024-00430-3. Online publication date: 19-May-2024.
  • (2023) Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 224-233. DOI: 10.1145/3580585.3607164. Online publication date: 18-Sep-2023.
  • (2020) WristLens. Proceedings of the Augmented Humans International Conference, 1-8. DOI: 10.1145/3384657.3384797. Online publication date: 16-Mar-2020.
  • (2019) Experimental Analysis of Barehand Mid-air Mode-Switching Techniques in Virtual Reality. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3290605.3300426. Online publication date: 2-May-2019.
  • (2019) Eliciting Contact-Based and Contactless Gestures With Radar-Based Sensors. IEEE Access, 7, 176982-176997. DOI: 10.1109/ACCESS.2019.2951349. Online publication date: 2019.
  • (2019) 3D Gesture Interface: Japan-Brazil Perceptions. Cross-Cultural Design. Methods, Tools and User Experience, 266-279. DOI: 10.1007/978-3-030-22577-3_19. Online publication date: 27-Jun-2019.
  • (2018) Pocket6. Proceedings of the 2018 ACM Symposium on Spatial User Interaction, 2-10. DOI: 10.1145/3267782.3267785. Online publication date: 13-Oct-2018.
