DOI: 10.1145/2449396.2449415
research-article

Locating user attention using eye tracking and EEG for spatio-temporal event selection

Published: 19 March 2013

Abstract

In expert video analysis, selecting certain events in a continuous video stream is a frequently occurring operation, e.g., in surveillance applications. Due to the dynamic and rich visual input, the constantly required high level of attention, and the hand-eye coordination needed for mouse interaction, this is a very demanding and exhausting task. Hence, relevant events might be missed. We propose to use eye tracking and electroencephalography (EEG) as additional input modalities for event selection. From eye tracking, we derive the spatial location of a perceived event, and from patterns in the EEG signal we derive its temporal location within the video stream. This reduces the amount of active user input required in the selection process and thus has the potential to reduce the user's workload. In this paper, we describe the methods employed for the localization processes and introduce the scenario developed to investigate the feasibility of this approach. Finally, we present and discuss results on the accuracy and speed of the method and investigate how the modalities interact.
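
The core idea above, that gaze supplies the "where" of an event while an EEG-detected response supplies the "when", can be illustrated with a minimal fusion sketch. This is not the authors' implementation: the select_events function, the (timestamp, x, y) gaze-sample format, and the fixed erp_latency_s offset are illustrative assumptions standing in for the paper's eye-tracking and EEG processing pipelines.

# Minimal sketch (illustrative, not the paper's method): fuse an EEG-derived
# event time with an eye-tracking stream to produce a spatio-temporal selection.
# Assumes gaze samples are (timestamp_s, x_px, y_px) tuples and that an ERP
# classifier has already produced a list of detection timestamps.
from bisect import bisect_left

def select_events(gaze_samples, eeg_event_times, erp_latency_s=0.3):
    """For each EEG-detected event, return (time, x, y): the gaze position at
    the estimated stimulus onset (detection time minus an assumed ERP latency)."""
    times = [t for t, _, _ in gaze_samples]
    selections = []
    for t_detect in eeg_event_times:
        t_onset = t_detect - erp_latency_s          # shift back by the assumed latency
        i = bisect_left(times, t_onset)             # first gaze sample at/after onset
        if i == len(times):
            i -= 1                                  # clamp to the last sample
        elif i > 0 and t_onset - times[i - 1] < times[i] - t_onset:
            i -= 1                                  # previous sample is closer in time
        _, x, y = gaze_samples[i]
        selections.append((t_onset, x, y))
    return selections

# Example with synthetic data: a 50 Hz gaze stream and two detections.
gaze = [(k * 0.02, 400 + k, 300) for k in range(500)]
print(select_events(gaze, eeg_event_times=[2.5, 7.1]))

In a real system, the event times would come from a single-trial ERP classifier and the spatial location from a fixation filter rather than a single raw gaze sample; the sketch only shows how the two localizations could combine into one selection.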


    Published In

    IUI '13: Proceedings of the 2013 International Conference on Intelligent User Interfaces
    March 2013
    470 pages
    ISBN: 9781450319652
    DOI: 10.1145/2449396

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. eeg
    2. event detection
    3. expert video analysis
    4. eye tracking

    Qualifiers

    • Research-article

    Conference

    IUI '13: 18th International Conference on Intelligent User Interfaces
    March 19-22, 2013
    Santa Monica, California, USA

    Acceptance Rates

    IUI '13 Paper Acceptance Rate: 43 of 192 submissions, 22%
    Overall Acceptance Rate: 746 of 2,811 submissions, 27%
