DOI: 10.1145/3410530.3414341

Summary of the Sussex-Huawei locomotion-transportation recognition challenge 2020

Published: 12 September 2020

Abstract

In this paper we summarize the contributions of participants to the third Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge, organized at the HASCA Workshop of UbiComp/ISWC 2020. The goal of this machine-learning/data-science challenge is to recognize eight locomotion and transportation activities (Still, Walk, Run, Bike, Bus, Car, Train, Subway) from the inertial sensor data of a smartphone, in a user-independent manner and with an unknown target phone position. The training data of a "train" user is available from smartphones placed at four body positions (Hand, Torso, Bag and Hips). The testing data originates from "test" users with a smartphone placed at one, but unknown, body position. We introduce the dataset used in the challenge and the protocol of the competition. We present a meta-analysis of the 15 submissions: their approaches, the software tools used, the computational cost and the achieved results. Overall, one submission achieved an F1 score above 80%, three scored between 70% and 80%, seven between 50% and 70%, and four below 50%, all with a maximum latency of 5 seconds.
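The abstract ranks submissions by F1 score over the eight activity classes. As an illustrative sketch only (not code from the challenge), the snippet below computes a macro-averaged F1 from true and predicted label sequences; the class names come from the abstract, while the averaging convention (unweighted mean of per-class F1) and all function names are assumptions.

```python
# Illustrative sketch: macro-averaged F1 over the eight SHL activity
# classes. Class names are taken from the abstract; the unweighted
# averaging convention is an assumption, not the official scorer.
CLASSES = ["Still", "Walk", "Run", "Bike", "Bus", "Car", "Train", "Subway"]

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    scores = []
    for c in CLASSES:
        # Per-class counts of true positives, false positives, false negatives.
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        denom = precision + recall
        scores.append(2 * precision * recall / denom if denom else 0.0)
    return sum(scores) / len(scores)
```

A submission predicting every window correctly would score 1.0 under this sketch; note that averaging over all eight classes penalizes classes a model never predicts correctly.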


Cited By

  • (2024) A Hybrid Algorithmic Pipeline for Robust Recognition of Human Locomotion: Addressing Missing Sensor Modalities. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 591-596. DOI: 10.1145/3675094.3678462
  • (2023) Summary of SHL Challenge 2023: Recognizing Locomotion and Transportation Mode from GPS and Motion Sensors. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 575-585. DOI: 10.1145/3594739.3610758
  • (2023) Enhancing Transportation Mode Detection using Multi-scale Sensor Fusion and Spatial-topological Attention. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 534-539. DOI: 10.1145/3594739.3610751
  • (2023) Ubiquitous Transportation Mode Estimation using Limited Cell Tower Information. 2023 IEEE 97th Vehicular Technology Conference (VTC2023-Spring), 1-5. DOI: 10.1109/VTC2023-Spring57618.2023.10200431
  • (2023) DMSTL: A Deep Multi-Scale Transfer Learning Framework for Unsupervised Cross-Position Human Activity Recognition. IEEE Internet of Things Journal, 10(1): 787-800. DOI: 10.1109/JIOT.2022.3204542
  • (2023) Recognize Locomotion and Transportation Modes from Wi-Fi Traces via Lightweight Models. 2023 International Conference on Future Communications and Networks (FCN), 1-6. DOI: 10.1109/FCN60432.2023.10544151
  • (2023) A perspective on human activity recognition from inertial motion data. Neural Computing and Applications, 35(28): 20463-20568. DOI: 10.1007/s00521-023-08863-9
  • (2022) What Actually Works for Activity Recognition in Scenarios with Significant Domain Shift: Lessons Learned from the 2019 and 2020 Sussex-Huawei Challenges. Sensors, 22(10): 3613. DOI: 10.3390/s22103613
  • (2022) Selecting Resource-Efficient ML Models for Transport Mode Detection on Mobile Devices. 2022 IEEE International Conference on Internet of Things and Intelligence Systems (IoTaIS), 135-141. DOI: 10.1109/IoTaIS56727.2022.9976004
  • (2022) DeepMatch2. Information Systems, 108(C). DOI: 10.1016/j.is.2021.101927

Published In

UbiComp/ISWC '20 Adjunct: Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers
September 2020, 732 pages
ISBN: 9781450380768
DOI: 10.1145/3410530

Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. activity recognition
      2. deep learning
      3. machine learning
      4. mobile sensing
      5. transportation mode recognition

      Qualifiers

      • Research-article

      Conference

      UbiComp/ISWC '20

      Acceptance Rates

      Overall Acceptance Rate 764 of 2,912 submissions, 26%
