DOI: 10.1145/3594739.3610758

Summary of SHL Challenge 2023: Recognizing Locomotion and Transportation Mode from GPS and Motion Sensors

Published: 08 October 2023

Abstract

In this paper, we summarize the contributions of participants to the fifth Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge, organized at the HASCA Workshop of UbiComp/ISWC 2023. The goal of this machine learning/data science challenge is to recognize eight locomotion and transportation activities (Still, Walk, Run, Bike, Bus, Car, Train, Subway) from the motion (accelerometer, gyroscope, magnetometer) and GPS (GPS location, GPS reception) sensor data of a smartphone in a user-independent manner. The training data of a “train” user is available from smartphones placed at four body positions (Hand, Torso, Bag and Hips). The testing data originates from “test” users with a smartphone placed at one, but unknown, body position. We introduce the dataset used in the challenge and the protocol of the competition. We present a meta-analysis of the 15 submissions, covering their approaches, the software tools used, the computational cost and the achieved results. The challenge evaluates recognition performance by comparing predicted to ground-truth labels every 10 milliseconds, but places no constraint on the maximum decision window length. Overall, five submissions achieved F1 scores above 90%, three between 80% and 90%, two between 70% and 80%, three between 50% and 70%, and two below 50%. While this year's task faces the technical challenges of sensor unavailability, irregular sampling and sensor diversity, the overall performance based on GPS and motion sensors is better than in previous years (e.g. the best performances reported in SHL 2020, 2021 and 2023 are 88.5%, 75.4% and 96.0%, respectively). This is possibly due to the complementarity between the GPS and motion sensors, and to the removal of constraints on the decision window length. Finally, we present a baseline implementation to help understand the contribution of each sensor modality to the recognition task.
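To make the evaluation protocol described above concrete, the sketch below computes a macro-averaged F1 score over label sequences given at 10-millisecond resolution, the granularity at which predictions are compared to ground truth. This is a minimal illustration only, assuming an integer label encoding (1 = Still, ..., 8 = Subway), a hypothetical helper named evaluate, and scikit-learn's f1_score; it is not the challenge's official scoring code.

```python
# Minimal sketch of the per-10-ms evaluation described in the abstract:
# predictions and ground truth are label sequences at 10 ms resolution,
# and the score is the macro-averaged F1 over the eight activity classes.
# The integer encoding (1 = Still, ..., 8 = Subway) and the use of
# scikit-learn are illustrative assumptions, not the official scorer.
import numpy as np
from sklearn.metrics import f1_score

CLASSES = ["Still", "Walk", "Run", "Bike", "Bus", "Car", "Train", "Subway"]
LABELS = np.arange(1, len(CLASSES) + 1)  # assumed encoding: 1..8


def evaluate(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Macro F1 over per-10-ms labels; both arrays hold one label per 10 ms step."""
    assert y_true.shape == y_pred.shape
    return f1_score(y_true, y_pred, labels=LABELS, average="macro")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # One minute of labels at 10 ms resolution (6000 steps): eight equal
    # segments, one per class, with 5% of the predictions corrupted.
    y_true = np.repeat(LABELS, 750)
    y_pred = y_true.copy()
    noisy = rng.choice(y_true.size, size=300, replace=False)
    y_pred[noisy] = rng.integers(1, len(CLASSES) + 1, size=300)
    print(f"macro F1 = {evaluate(y_true, y_pred):.3f}")
```

Because scoring is done per 10 ms sample, a submission may aggregate sensor data over arbitrarily long decision windows, as long as it ultimately outputs one label for every 10 ms step of the test sequence.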



Information & Contributors

Information

Published In

UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing
October 2023
822 pages
ISBN:9798400702006
DOI:10.1145/3594739
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 October 2023


Author Tags

  1. Activity recognition
  2. Deep learning
  3. Machine learning
  4. Mobile sensing
  5. Transportation mode recognition

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

UbiComp/ISWC '23

Acceptance Rates

Overall Acceptance Rate 764 of 2,912 submissions, 26%

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months)277
  • Downloads (Last 6 weeks)29
Reflects downloads up to 10 Oct 2024

Citations

Cited By

  • (2024) A Hybrid Algorithmic Pipeline for Robust Recognition of Human Locomotion: Addressing Missing Sensor Modalities. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 591-596. DOI: 10.1145/3675094.3678462. Online publication date: 5-Oct-2024.
  • (2024) Interpolation attention-based KAN for the Sussex-Huawei Locomotion-Transportation Recognition Challenge. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 580-584. DOI: 10.1145/3675094.3678460. Online publication date: 5-Oct-2024.
  • (2024) Ensemble Learning Approach for Human Activity Recognition Involving Missing Sensor Data. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 569-574. DOI: 10.1145/3675094.3678458. Online publication date: 5-Oct-2024.
  • (2024) TeLeGaIT: Transfer Learning on Fog for Generalizable and Real-Time Transport Mode Detection. 2024 9th International Conference on Fog and Mobile Edge Computing (FMEC), 196-203. DOI: 10.1109/FMEC62297.2024.10710203. Online publication date: 2-Sep-2024.
  • (2023) An Ensemble Framework Based on Fine Multi-Window Feature Engineering and Overfitting Prevention for Transportation Mode Recognition. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 563-568. DOI: 10.1145/3594739.3610756. Online publication date: 8-Oct-2023.
  • (2023) A Post-processing Machine Learning for Activity Recognition Challenge with OpenStreetMap Data. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 557-562. DOI: 10.1145/3594739.3610755. Online publication date: 8-Oct-2023.
  • (2023) Multimodal Sensor Data Fusion and Ensemble Modeling for Human Locomotion Activity Recognition. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 546-550. DOI: 10.1145/3594739.3610753. Online publication date: 8-Oct-2023.
  • (2023) Enhancing XGBoost with Heuristic Smoothing for Transportation Mode and Activity Recognition. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 540-545. DOI: 10.1145/3594739.3610752. Online publication date: 8-Oct-2023.
  • (2023) Road Network Enhanced Transportation Mode Recognition with an Ensemble Machine Learning Model. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 528-533. DOI: 10.1145/3594739.3610750. Online publication date: 8-Oct-2023.
  • (2023) Moving State Estimation by CNN from Long Time Data of Smartphone Sensors: Sussex-Huawei Locomotion Challenge 2023. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 523-527. DOI: 10.1145/3594739.3610749. Online publication date: 8-Oct-2023.
