Research article
DOI: 10.1145/2971485.2971496

User Expectations of Everyday Gaze Interaction on Smartglasses

Published: 23 October 2016

Abstract

Gaze tracking technology is increasingly seen as a viable and practical input modality in a variety of everyday contexts, such as interacting with computers, mobile devices, public displays, and wearables (e.g. smartglasses). We conducted an exploratory study consisting of six focus group sessions to understand people's expectations of everyday gaze interaction on smartglasses. Our results provide novel insights into the role of use context and social conventions around gaze behavior in the acceptance of gaze interaction, the social and personal issues that need to be considered when designing gaze-based applications, and user preferences among gaze-based interaction techniques. These results have many practical design implications and support the human-centric design and development of everyday gaze interaction technologies.




    Published In

    NordiCHI '16: Proceedings of the 9th Nordic Conference on Human-Computer Interaction
    October 2016, 1045 pages
    ISBN: 9781450347631
    DOI: 10.1145/2971485
    Publisher: Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. everyday gaze interaction
    2. gaze tracking
    3. head-mounted displays
    4. interactive eyewear

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    NordiCHI '16

    Acceptance Rates

    NordiCHI '16 Paper Acceptance Rate: 58 of 231 submissions, 25%
    Overall Acceptance Rate: 379 of 1,572 submissions, 24%

