DOI: 10.1145/2968219.2971459

Multimodal multisensor activity annotation tool

Published: 12 September 2016

Abstract

In this paper we describe a multimodal-multisensor annotation tool for physiological computing; for example, mobile gesture-based interaction devices or health-monitoring devices can be connected. It is intended as an expert authoring tool for annotating multiple video-based sensor streams with domain-specific activities. The resulting datasets can serve as supervised training data for new machine learning tasks. Our tool provides connectors to commercially available sensor systems (e.g., the Intel RealSense F200 3D camera, Leap Motion, and Myo) and a graphical user interface for annotation.
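The workflow the abstract describes — time-aligned expert labels over synchronized sensor streams, exported as a supervised dataset — can be sketched as follows. This is a minimal illustration, not the tool's actual API; all names and the sample data are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """An expert-authored activity label over an interval of the shared timeline (seconds)."""
    start: float
    end: float
    activity: str

def label_samples(samples, annotations):
    """Pair each timestamped sensor sample with the activity annotated at that
    time, yielding (features, label) examples for supervised learning."""
    dataset = []
    for t, features in samples:
        for a in annotations:
            if a.start <= t < a.end:
                dataset.append((features, a.activity))
                break  # first matching annotation wins
    return dataset

# Hypothetical accelerometer samples: (timestamp, feature vector)
samples = [(0.1, [0.0, 9.8]), (0.6, [1.2, 9.1]), (1.4, [0.1, 9.7])]
# Annotations an expert might author in the GUI against the video stream
annotations = [Annotation(0.0, 1.0, "wave"), Annotation(1.0, 2.0, "rest")]

dataset = label_samples(samples, annotations)
# → [([0.0, 9.8], 'wave'), ([1.2, 9.1], 'wave'), ([0.1, 9.7], 'rest')]
```

In the described architecture, each sensor connector (RealSense, Leap Motion, Myo) would contribute its own timestamped stream, and the same annotation track would label all of them against the shared video timeline.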




    Published In

    UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct
    September 2016
    1807 pages
    ISBN:9781450344623
    DOI:10.1145/2968219
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. data annotation
    2. data capture
    3. multimodal
    4. multisensor

    Qualifiers

    • Poster


    Conference

    UbiComp '16

    Acceptance Rates

    Overall Acceptance Rate 764 of 2,912 submissions, 26%


    Article Metrics

    • Downloads (last 12 months): 17
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 23 Dec 2024


    Cited By

    • (2024) Digital Forms for All: A Holistic Multimodal Large Language Model Agent for Health Data Entry. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1–39. DOI: 10.1145/3659624. Online publication date: 15 May 2024.
    • (2023) Application for Doctoral Consortium IUI 2023. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 233–236. DOI: 10.1145/3581754.3584112. Online publication date: 27 Mar 2023.
    • (2022) MaD GUI: An Open-Source Python Package for Annotation and Analysis of Time-Series Data. Sensors 22(15), 5849. DOI: 10.3390/s22155849. Online publication date: 5 Aug 2022.
    • (2022) Understanding the Roles of Video and Sensor Data in the Annotation of Human Activities. International Journal of Human–Computer Interaction 39(18), 3634–3648. DOI: 10.1080/10447318.2022.2101589. Online publication date: Aug 2022.
    • (2021) Signaligner Pro: A Tool to Explore and Annotate Multi-day Raw Accelerometer Data. 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 475–480. DOI: 10.1109/PerComWorkshops51409.2021.9431110. Online publication date: 22 Mar 2021.
    • (2019) Handling annotation uncertainty in human activity recognition. Proceedings of the 2019 ACM International Symposium on Wearable Computers, 109–117. DOI: 10.1145/3341163.3347744. Online publication date: 9 Sep 2019.
    • (2019) Designing Videogames to Crowdsource Accelerometer Data Annotation for Activity Recognition Research. Proceedings of the Annual Symposium on Computer-Human Interaction in Play, 135–147. DOI: 10.1145/3311350.3347153. Online publication date: 17 Oct 2019.
    • (2019) Even Driven Multimodal Augmented Reality based Command and Control Systems for Mining Industry. Procedia Computer Science 151, 965–970. DOI: 10.1016/j.procs.2019.04.135. Online publication date: 2019.
    • (2018) Exploring Semi-Supervised Methods for Labeling Support in Multimodal Datasets. Sensors 18(8), 2639. DOI: 10.3390/s18082639. Online publication date: 11 Aug 2018.
    • (2018) Multimodal Deep Learning for Activity and Context Recognition. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1(4), 1–27. DOI: 10.1145/3161174. Online publication date: 8 Jan 2018.
    • Show More Cited By
