DOI: 10.1145/3267305.3267519

Summary of the Sussex-Huawei Locomotion-Transportation Recognition Challenge

Published: 08 October 2018

Abstract

In this paper we summarize the contributions of participants to the Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge organized at the HASCA Workshop of UbiComp 2018. The SHL challenge is a machine learning and data science competition which aims to recognize eight transportation activities (Still, Walk, Run, Bike, Bus, Car, Train, Subway) from the inertial and pressure sensor data of a smartphone. We introduce the dataset used in the challenge and the protocol of the competition. We present a meta-analysis of the 19 submitted contributions: their approaches, the software tools used, the computational cost, and the results achieved. Overall, two entries achieved F1 scores above 90%, eight achieved F1 scores between 80% and 90%, and nine between 50% and 80%.
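The entries described above are compared by F1 score over the eight activity classes. As a minimal illustration (not the challenge's official scoring script, whose exact averaging scheme is defined in the competition protocol), a macro-averaged F1 over the eight labels can be sketched as:

```python
# Sketch of a macro-averaged F1 score over the eight SHL activity
# classes. Illustrative only; class names follow the abstract, and the
# official challenge evaluation may differ in detail.

CLASSES = ["Still", "Walk", "Run", "Bike", "Bus", "Car", "Train", "Subway"]

def macro_f1(y_true, y_pred, classes=CLASSES):
    """Average the per-class F1 scores over all classes."""
    f1_scores = []
    for c in classes:
        # Per-class counts of true positives, false positives, false negatives.
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        f1_scores.append(f1)
    # Macro average: every class weighs equally, regardless of frequency.
    return sum(f1_scores) / len(f1_scores)
```

With this averaging, a rare class such as Run counts as much toward the score as a frequent one such as Still, so an entry cannot reach 90% by doing well only on the dominant classes.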



      Published In

      UbiComp '18: Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers. October 2018, 1881 pages. ISBN 9781450359665. DOI: 10.1145/3267305.
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Activity recognition
      2. Deep learning
      3. Machine learning
      4. Mobile sensing
      5. Transportation mode recognition

      Acceptance Rates

      Overall acceptance rate: 764 of 2,912 submissions (26%)


      Cited By

      • SensorNet: An Adaptive Attention Convolutional Neural Network for Sensor Feature Learning. Sensors 24(11):3274, May 2024. DOI: 10.3390/s24113274
      • Transportation Mode Recognition Based on Low-Rate Acceleration and Location Signals With an Attention-Based Multiple-Instance Learning Network. IEEE Transactions on Intelligent Transportation Systems 25(10):14376-14388, October 2024. DOI: 10.1109/TITS.2024.3387834
      • Summary of SHL Challenge 2023: Recognizing Locomotion and Transportation Mode from GPS and Motion Sensors. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 575-585, October 2023. DOI: 10.1145/3594739.3610758
      • Recognize Locomotion and Transportation Modes from Wi-Fi Traces via Lightweight Models. 2023 International Conference on Future Communications and Networks (FCN), 1-6, December 2023. DOI: 10.1109/FCN60432.2023.10544151
      • Communication Scene Recognition Method Based on Multi Phone Sensors and Deep Learning. 2023 International Conference on Future Communications and Networks (FCN), 1-6, December 2023. DOI: 10.1109/FCN60432.2023.10544117
      • The Devil is in the Details: An Efficient Convolutional Neural Network for Transport Mode Detection. IEEE Transactions on Intelligent Transportation Systems 23(8):12202-12212, August 2022. DOI: 10.1109/TITS.2021.3110949
      • Using Human Body Capacitance Sensing to Monitor Leg Motion Dominated Activities with a Wrist Worn Device. Sensor- and Video-Based Activity and Behavior Computing, 81-94, May 2022. DOI: 10.1007/978-981-19-0361-8_5
      • Bento Packaging Activity Recognition Based on Statistical Features. Sensor- and Video-Based Activity and Behavior Computing, 207-216, May 2022. DOI: 10.1007/978-981-19-0361-8_13
      • Adversarial Learning in Accelerometer Based Transportation and Locomotion Mode Recognition. Generative Adversarial Learning: Architectures and Applications, 205-232, February 2022. DOI: 10.1007/978-3-030-91390-8_10
      • Classical machine learning and deep neural network ensemble model for GPS-based activity recognition. Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, 369-373, September 2021. DOI: 10.1145/3460418.3479386
