DOI: 10.1145/3197768.3197775

Activity Segmentation and Identification based on Eye Gaze Features

Published: 26 June 2018

Abstract

In line with the ongoing digitalization of production processes, Human Computer Interaction (HCI) technologies have evolved rapidly in industrial applications, providing an abundance of versatile tracking and monitoring devices suited to complex challenges. This paper focuses on Activity Segmentation and Activity Identification, two of the most crucial challenges in pervasive computing, using only visual attention features captured through mobile eye-tracking sensors. We propose a novel, application-independent approach to segmenting task executions in a semi-manual industrial assembly setup, exploiting the expressive properties of the distribution-based gaze feature Nearest Neighbor Index (NNI) to build a dynamic activity segmentation algorithm. The approach is enriched with a machine learning validation model that acts as a feedback loop to classify segment quality. It is evaluated on real-world data from an alpine ski assembly scenario, reaching an overall detection accuracy of 91%.
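The segmentation idea rests on the Nearest Neighbor Index of Clark and Evans: the ratio of the observed mean nearest-neighbour distance among fixation points to the mean distance expected under a uniformly random pattern of the same density. The sketch below is a minimal illustration of how such a feature could drive segmentation, not the authors' implementation; the window length, step, threshold, and the bounding-box area estimate are all assumptions.

```python
import numpy as np

def nearest_neighbor_index(points):
    """Clark-Evans NNI: observed mean nearest-neighbour distance
    divided by the mean distance expected for a uniformly random
    pattern of the same density, 0.5 * sqrt(area / n)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    if n < 2:
        return np.nan
    # all pairwise distances, with self-distances masked out
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)
    d_obs = dist.min(axis=1).mean()
    # bounding box of the window as a crude observation area (assumption)
    area = max(np.ptp(pts[:, 0]) * np.ptp(pts[:, 1]), 1e-9)
    d_exp = 0.5 * np.sqrt(area / n)
    return d_obs / d_exp

def boundary_candidates(fixations, win=50, step=10, thresh=1.0):
    """Slide a window over the fixation stream: NNI < 1 indicates
    clustered (task-focused) gaze, NNI > 1 dispersed gaze, so
    threshold crossings are candidate activity boundaries.
    Window size, step, and threshold are illustrative values."""
    nni = [nearest_neighbor_index(fixations[i:i + win])
           for i in range(0, len(fixations) - win + 1, step)]
    return [i * step for i in range(1, len(nni))
            if (nni[i - 1] - thresh) * (nni[i] - thresh) < 0]
```

Gaze concentrated on a single workpiece pulls the NNI toward clustered values, while transitions between tasks disperse fixations and push it up, which is what makes the crossings plausible segment boundaries; the machine learning validation loop described in the abstract would then accept or reject each candidate segment.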



Published In

PETRA '18: Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference
June 2018
591 pages
ISBN: 9781450363907
DOI: 10.1145/3197768

In-Cooperation

  • NSF: National Science Foundation

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Activity Identification
  2. Activity Segmentation
  3. Gaze Behavior Analysis
  4. Human-Centered Computing

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

PETRA '18


Cited By

  • (2024) Use of Lean Management Methods based on Eye-Tracking Information to make User Interfaces in Production more Human-centered. Procedia CIRP, 128, 514-519. https://doi.org/10.1016/j.procir.2024.04.014
  • (2024) Eye-tracking support for analyzing human factors in human-robot collaboration during repetitive long-duration assembly processes. Production Engineering, online 20 June 2024. https://doi.org/10.1007/s11740-024-01294-y
  • (2023) Eye Tracking, Usability, and User Experience: A Systematic Review. International Journal of Human–Computer Interaction, 40(17), 4484-4500. https://doi.org/10.1080/10447318.2023.2221600
  • (2022) Estimation of Confidence in the Dialogue based on Eye Gaze and Head Movement Information. EMITTER International Journal of Engineering Technology, 10(2), 338-350. https://doi.org/10.24003/emitter.v10i2.756
  • (2022) IoT-Based Patient Health Data Using Improved Context-Aware Data Fusion and Enhanced Recursive Feature Elimination Model. IEEE Access, 10, 128318-128335. https://doi.org/10.1109/ACCESS.2022.3226583
  • (2022) Opportunities for using eye tracking technology in manufacturing and logistics. Computers and Industrial Engineering, 171(C). https://doi.org/10.1016/j.cie.2022.108444
  • (2021) Scanpath analysis into the wild: the spatiotemporal distribution of fixations as an indicator of driver's mental workload. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 64(1), 371-375. https://doi.org/10.1177/1071181320641084
  • (2019) A task-independent design and development process for cognitive products in industrial applications. Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, 358-367. https://doi.org/10.1145/3316782.3322748
  • (2019) A multi-sensor algorithm for activity and workflow recognition in an industrial setting. Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, 69-76. https://doi.org/10.1145/3316782.3321523
  • (2019) Towards Industrial Assistance Systems: Experiences of Applying Multi-sensor Fusion in Harsh Environments. Physiological Computing Systems, 158-179. https://doi.org/10.1007/978-3-030-27950-9_9
