DOI: 10.1145/2857491.2857537
Short Paper · Public Access

Gaze guidance for improved password recollection

Published: 14 March 2016

Abstract

Most computer systems require user authentication, which has led to an increase in the number of passwords one has to remember. In this paper we explore whether spatial visual cues can be used to improve password recollection. Specifically, we consider whether associating each character in a password with a user-defined spatial region in an image facilitates better recollection. We conducted a user study in which participants were asked to recall randomly generated numeric passwords under the following conditions: no image association (No-Image), image association (Image-Only), image association combined with overt visual cues (Overt-Guidance), and image association combined with subtle visual cues (Subtle-Guidance). We measured the accuracy of password recollection and response time, as well as the average dwell time at target locations for the gaze-guided conditions. Subjects performed significantly better at password recollection when they were actively guided to regions in the associated image using overt visual cues. Accuracy of password recollection using subtle cues was also higher than in the No-Image and Image-Only conditions, but the effect was not significant. No significant difference was observed in average dwell times between the overt and subtle guidance approaches.
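To make the character-to-region association and the dwell-time measure described above concrete, the minimal sketch below shows one way they could be represented. It is not the authors' implementation: the function and class names, the rectangular region representation, and the (t, x, y) gaze-sample format are all assumptions made here for illustration.

```python
# Illustrative sketch only: the paper does not publish code, and the Region /
# dwell_times names, the region representation, and the (t, x, y) gaze-sample
# format are assumptions made here for clarity.
from dataclasses import dataclass


@dataclass
class Region:
    """Axis-aligned image region that a user has associated with one password digit."""
    digit: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def associate(password, boxes):
    """Pair each character of a numeric password with a user-defined region, in order."""
    if len(password) != len(boxes):
        raise ValueError("one region is required per password character")
    return [Region(ch, *box) for ch, box in zip(password, boxes)]


def dwell_times(samples, regions):
    """Approximate dwell time (seconds) per target region from (t, x, y) gaze samples.

    Dwell is approximated by summing inter-sample intervals whose sample lands
    inside a region; a real eye-tracking pipeline would run fixation detection
    first. Results are keyed by region index so repeated digits do not collide.
    """
    totals = {i: 0.0 for i in range(len(regions))}
    for (t0, _, _), (t1, x, y) in zip(samples, samples[1:]):
        for i, r in enumerate(regions):
            if r.contains(x, y):
                totals[i] += t1 - t0
                break
    return totals


# Toy example: a 4-digit password tied to four quadrants of a 200x200 image,
# with a short synthetic gaze trace (time in seconds, x/y in pixels).
regions = associate("4721", [(0, 0, 100, 100), (100, 0, 200, 100),
                             (0, 100, 100, 200), (100, 100, 200, 200)])
trace = [(0.00, 50, 50), (0.05, 55, 52), (0.10, 150, 60), (0.15, 160, 70)]
print(dwell_times(trace, regions))  # most dwell on the regions for '4' and '7'
```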

Supplementary Material

MP4 File (p237-sridharan.mp4)


Published In

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016
378 pages
ISBN:9781450341257
DOI:10.1145/2857491

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. eye-tracking
  2. gaze guidance
  3. password recollection

Qualifiers

  • Short-paper

Conference

ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
March 14 - 17, 2016
Charleston, South Carolina

Acceptance Rates

Overall Acceptance Rate 51 of 97 submissions, 53%


Cited By

  • (2022) User-centred multimodal authentication: securing handheld mobile devices using gaze and touch input. Behaviour & Information Technology 41(10), 2061-2083. https://doi.org/10.1080/0144929X.2022.2069597. Online publication date: 6 May 2022.
  • (2021) Ubiquitous Interactions for Heads-Up Computing: Understanding Users' Preferences for Subtle Interaction Techniques in Everyday Settings. Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction, 1-15. https://doi.org/10.1145/3447526.3472035. Online publication date: 27 September 2021.
  • (2019) Charting Subtle Interaction in the HCI Literature. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-15. https://doi.org/10.1145/3290605.3300648. Online publication date: 2 May 2019.
  • (2018) Enhancing Saliency of a Target Object Through Color Modification of Every Object Using Genetic Algorithm. Soft Computing for Problem Solving, 771-779. https://doi.org/10.1007/978-981-13-1595-4_61. Online publication date: 31 October 2018.
  • (2017) Enhancing Saliency of an Object Using Genetic Algorithm. 2017 14th Conference on Computer and Robot Vision (CRV), 337-344. https://doi.org/10.1109/CRV.2017.33. Online publication date: May 2017.
