Open access

Using Reflexive Eye Movements for Fast Challenge-Response Authentication

Published: 24 October 2016

Abstract

Eye tracking devices have recently become increasingly popular as an interface between people and consumer-grade electronic devices. Because human eyes are fast, responsive, and carry information unique to an individual, analyzing a person's gaze is particularly attractive for effortless biometric authentication. Unfortunately, previous proposals for gaze-based authentication systems either suffer from high error rates or require long authentication times.
We build upon the fact that some eye movements can be reflexively and predictably triggered, and develop an interactive visual stimulus for eliciting reflexive eye movements that supports the extraction of reliable biometric features in a matter of seconds, without requiring any memorization or cognitive effort on the part of the user. As an important benefit, our stimulus can be made unique for every authentication attempt and thus incorporated into a challenge-response biometric authentication system. This allows us to prevent replay attacks, arguably the most practical attack vector against biometric authentication.
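The challenge-response idea described above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: `generate_challenge`, `follows_challenge`, and the similarity check are invented placeholders for the stimulus generation, freshness check, and biometric matching steps.

```python
import math
import secrets

# Hypothetical sketch of the challenge-response flow: a fresh stimulus
# per attempt defeats replay, and per-user gaze features serve as the
# biometric. All function bodies are illustrative placeholders, not the
# paper's implementation.

def generate_challenge(n_targets=5):
    """A fresh, unpredictable sequence of on-screen dot positions."""
    rng = secrets.SystemRandom()
    return [(rng.random(), rng.random()) for _ in range(n_targets)]

def follows_challenge(fixations, challenge, tol=0.05):
    """Replay protection: the recorded gaze must land near every
    target of *this* session's challenge, in order."""
    return (len(fixations) == len(challenge) and
            all(math.dist(f, c) <= tol
                for f, c in zip(fixations, challenge)))

def similarity(features, template):
    """Toy biometric match score in (0, 1]; the paper instead extracts
    richer reflexive-saccade features and trains a classifier."""
    return 1.0 / (1.0 + math.dist(features, template))

def authenticate(fixations, features, challenge, template, threshold=0.8):
    if not follows_challenge(fixations, challenge):
        return False  # a replayed recording fails the freshness check
    return similarity(features, template) >= threshold
```

A recording of an earlier session fails because its fixations track the old challenge rather than the freshly generated one, regardless of how well the biometric features match.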
Using a gaze tracking device, we build a prototype of our system and perform a series of systematic user experiments with 30 participants from the general public. We investigate the performance and security guarantees under several different attack scenarios and show that our system surpasses existing gaze-based authentication methods both in equal error rate (6.3%) and in authentication time (5 seconds).
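For context, the equal error rate reported above is the operating point at which the false accept rate (FAR) and false reject rate (FRR) coincide. A minimal threshold-sweep computation over made-up match scores (not the paper's data) looks like:

```python
# Minimal sketch of computing an equal error rate (EER) from match
# scores; the score values below are invented for illustration, not
# the paper's data.

def eer(genuine, impostor):
    """Sweep decision thresholds and return the rate at the point
    where FAR and FRR are closest to each other."""
    best_gap, best_rate = float("inf"), None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        if abs(far - frr) < best_gap:
            best_gap, best_rate = abs(far - frr), (far + frr) / 2
    return best_rate

genuine = [0.9, 0.8, 0.75, 0.7, 0.65]   # same-user match scores
impostor = [0.72, 0.6, 0.55, 0.5, 0.4]  # cross-user match scores
print(eer(genuine, impostor))  # → 0.2
```

Lower EER means better separation between genuine users and impostors; real evaluations interpolate between thresholds rather than picking the closest discrete one.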




Published In

CCS '16: Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security
October 2016
1924 pages
ISBN:9781450341394
DOI:10.1145/2976749
This work is licensed under a Creative Commons Attribution-NonCommercial International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. authentication
  2. biometrics
  3. challenge-response
  4. eye tracking
  5. gaze tracking
  6. reflexive eye movements
  7. reflexive saccades
  8. replay attacks
  9. replay protection
  10. system security

Qualifiers

  • Research-article

Conference

CCS'16

Acceptance Rates

CCS '16 Paper Acceptance Rate 137 of 831 submissions, 16%;
Overall Acceptance Rate 1,261 of 6,999 submissions, 18%



Cited By

  • (2024) Act2Auth – A Novel Authentication Concept based on Embedded Tangible Interaction at Desks. Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 1-15. DOI: 10.1145/3623509.3633360. Online publication date: 11-Feb-2024.
  • (2024) VibHead: An Authentication Scheme for Smart Headsets through Vibration. ACM Transactions on Sensor Networks, 20(4):1-21. DOI: 10.1145/3614432. Online publication date: 11-May-2024.
  • (2024) GazePair: Efficient Pairing of Augmented Reality Devices Using Gaze Tracking. IEEE Transactions on Mobile Computing, 23(3):2407-2421. DOI: 10.1109/TMC.2023.3255841. Online publication date: Mar-2024.
  • (2024) Usable Authentication in Virtual Reality: Exploring the Usability of PINs and Gestures. Applied Cryptography and Network Security, pp. 412-431. DOI: 10.1007/978-3-031-54776-8_16. Online publication date: 29-Feb-2024.
  • (2023) Understanding User Behavior in Volumetric Video Watching: Dataset, Analysis and Prediction. Proceedings of the 31st ACM International Conference on Multimedia, pp. 1108-1116. DOI: 10.1145/3581783.3613810. Online publication date: 26-Oct-2023.
  • (2023) Visual Indicators Representing Avatars' Authenticity in Social Virtual Reality and Their Impacts on Perceived Trustworthiness. IEEE Transactions on Visualization and Computer Graphics, 29(11):4589-4599. DOI: 10.1109/TVCG.2023.3320234. Online publication date: 1-Nov-2023.
  • (2023) Where Are the Dots: Hardening Face Authentication on Smartphones With Unforgeable Eye Movement Patterns. IEEE Transactions on Information Forensics and Security, 18:1295-1308. DOI: 10.1109/TIFS.2022.3232957. Online publication date: 2023.
  • (2023) Data-Augmentation-Enabled Continuous User Authentication via Passive Vibration Response. IEEE Internet of Things Journal, 10(16):14137-14151. DOI: 10.1109/JIOT.2023.3264274. Online publication date: 15-Aug-2023.
  • (2023) User Authentication by Eye Movement Features Employing SVM and XGBoost Classifiers. IEEE Access, 11:93341-93353. DOI: 10.1109/ACCESS.2023.3309000. Online publication date: 2023.
  • (2023) A face recognition taxonomy and review framework towards dimensionality, modality and feature quality. Engineering Applications of Artificial Intelligence, 126(PC). DOI: 10.1016/j.engappai.2023.107056. Online publication date: 1-Nov-2023.
