Poster · DOI: 10.1145/3102163.3102191

Deep learning for action recognition in augmented reality assistance systems

Published: 30 July 2017

Abstract

Recent advances in optical head-mounted displays (HMDs) such as the Microsoft HoloLens, Google Glass, and Epson Moverio, which overlay visual information directly in the user's field of view, have opened up new possibilities for augmented reality (AR) applications. We propose a system that uses such an optical HMD to assist the user during goal-oriented activities (e.g., manufacturing work) in an intuitive and unobtrusive way (Essig et al. 2016). To this end, our system observes and recognizes the user's actions and generates context-sensitive feedback. Figure 1 shows an overview of our approach, illustrated by the task of assembling a bird house.
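The poster abstract gives no implementation details, but the observe-recognize-feedback loop it describes can be illustrated with a minimal, hypothetical sketch: per-frame action labels from some classifier are smoothed with a sliding-window majority vote (a common post-processing step that suppresses single-frame glitches), and a feedback event fires whenever the recognized action changes. The function names, labels, and window size below are illustrative assumptions, not the authors' actual method.

```python
from collections import Counter, deque

def smooth_predictions(frame_labels, window=5):
    """Majority-vote smoothing of per-frame action labels.

    A sliding window over the raw classifier output suppresses
    spurious single-frame misclassifications (illustrative only).
    """
    buf = deque(maxlen=window)
    smoothed = []
    for label in frame_labels:
        buf.append(label)
        smoothed.append(Counter(buf).most_common(1)[0][0])
    return smoothed

def feedback_events(smoothed_labels):
    """Emit one feedback event each time the recognized action changes."""
    events, current = [], None
    for label in smoothed_labels:
        if label != current:
            events.append(f"step started: {label}")
            current = label
    return events

# Noisy per-frame output from a hypothetical frame classifier:
frames = ["pick_board", "pick_board", "screw", "pick_board",
          "pick_board", "screw", "screw", "screw", "screw"]
stable = smooth_predictions(frames, window=3)
print(feedback_events(stable))
# prints ['step started: pick_board', 'step started: screw']
```

In an HMD assistance system, each emitted event would drive context-sensitive feedback (e.g., highlighting the next assembly step in the user's view); the smoothing window trades responsiveness against stability.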

Supplementary Material

ZIP File (a75-schroder.zip)
Supplemental files.

References

[1] Kai Essig, Benjamin Strenge, and Thomas Schack. 2016. ADAMAAS: Towards Smart Glasses for Mobile and Personalized Action Assistance. In International Conference on Pervasive Technologies Related to Assistive Environments. 46:1--46:4.
[2] Jonathan Long, Evan Shelhamer, and Trevor Darrell. 2015. Fully Convolutional Networks for Semantic Segmentation. In Conference on Computer Vision and Pattern Recognition. 3431--3440.
[3] Karen Simonyan and Andrew Zisserman. 2014. Very Deep Convolutional Networks for Large-Scale Image Recognition. CoRR abs/1409.1556. http://arxiv.org/abs/1409.1556
[4] Andrea Tagliasacchi, Matthias Schröder, Anastasia Tkach, Sofien Bouaziz, Mario Botsch, and Mark Pauly. 2015. Robust Articulated-ICP for Real-Time Hand Tracking. Computer Graphics Forum 34, 5, 101--114.


Published In

SIGGRAPH '17: ACM SIGGRAPH 2017 Posters
July 2017
173 pages
ISBN:9781450350150
DOI:10.1145/3102163
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. action recognition
  2. augmented reality
  3. deep learning

Qualifiers

  • Poster

Conference

SIGGRAPH '17

Acceptance Rates

Overall Acceptance Rate 1,822 of 8,601 submissions, 21%


Article Metrics

  • Downloads (last 12 months): 30
  • Downloads (last 6 weeks): 1
Reflects downloads up to 11 Feb 2025.


Cited By

  • (2024) Make some Noise: Acoustic Classification of Manual Work Steps Towards Adaptive Assistance Systems. Procedia CIRP 127, 135--140. DOI: 10.1016/j.procir.2024.07.024
  • (2023) Metaverse in Healthcare Integrated with Explainable AI and Blockchain: Enabling Immersiveness, Ensuring Trust, and Providing Patient Data Security. Sensors 23, 2 (565). DOI: 10.3390/s23020565
  • (2023) Cognitive assistance for action selection: Challenges and approaches. Frontiers in Psychology 13. DOI: 10.3389/fpsyg.2022.1031858
  • (2023) XAIR: A Framework of Explainable AI in Augmented Reality. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--30. DOI: 10.1145/3544548.3581500
  • (2023) A Comprehensive Survey of RGB-Based and Skeleton-Based Human Action Recognition. IEEE Access 11, 53880--53898. DOI: 10.1109/ACCESS.2023.3282311
  • (2022) Skeleton Graph-Neural-Network-Based Human Action Recognition: A Survey. Sensors 22, 6 (2091). DOI: 10.3390/s22062091
  • (2022) Current State and Prospects of Increasing the Functionality of Augmented Reality Using Neural Networks. Èlektronnoe modelirovanie 44, 5, 73--89. DOI: 10.15407/emodel.44.05.073
  • (2022) FirstPiano: A New Egocentric Hand Action Dataset Oriented Towards Augmented Reality Applications. In Image Analysis and Processing -- ICIAP 2022, 170--181. DOI: 10.1007/978-3-031-06433-3_15
  • (2021) Approaches for Cognitive Assistance in Industry 4.0 Assembly. In Artificial Intelligence in Industry 4.0, 45--54. DOI: 10.1007/978-3-030-61045-6_4
  • (2020) A Literature Review of AR-Based Remote Guidance Tasks with User Studies. In Virtual, Augmented and Mixed Reality: Industrial and Everyday Life Applications, 111--120. DOI: 10.1007/978-3-030-49698-2_8
