Sensor‐based activity recognition independent of device placement and orientation

Published: 12 April 2020

Abstract

Human activity recognition (HAR) is a prominent subfield of pervasive computing and provides context for many applications in areas such as healthcare, education, and entertainment. Most wearable HAR studies assume that the placement and orientation of the sensing device are fixed and never change. In real-world scenarios, however, this condition is not always guaranteed, and recognition accuracy suffers from the resulting distortion. To handle this, our work proposes a new model based on a convolutional neural network that extracts robust features, invariant to device placement and orientation, for training machine learning classifiers. We first carry out experiments to show the negative effects of this problem. Then, we apply the convolutional neural network-based hybrid structure to HAR. Results show that our method improves accuracy by 15% to 40% on a public data set and by 10% to 20% on our own data set, both collected with distortion.
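
The abstract mentions experiments that expose how placement and orientation changes hurt recognition before the hybrid model is introduced. As an illustration only, the sketch below (not the authors' code; the window shape, sampling rate, and function names are assumptions) shows one common way to simulate an unknown device orientation: apply a random 3-D rotation to each tri-axial accelerometer window.

```python
import numpy as np

def random_rotation_matrix(rng):
    """Random 3-D rotation built from uniformly sampled Euler angles."""
    a, b, c = rng.uniform(0.0, 2.0 * np.pi, size=3)
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(a), -np.sin(a)],
                   [0.0, np.sin(a), np.cos(a)]])
    ry = np.array([[np.cos(b), 0.0, np.sin(b)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(b), 0.0, np.cos(b)]])
    rz = np.array([[np.cos(c), -np.sin(c), 0.0],
                   [np.sin(c), np.cos(c), 0.0],
                   [0.0, 0.0, 1.0]])
    return rz @ ry @ rx

def distort_orientation(window, rng=None):
    """Rotate a (num_samples, 3) accelerometer window by one random rotation,
    mimicking a device worn in an unknown but fixed orientation."""
    rng = np.random.default_rng() if rng is None else rng
    return window @ random_rotation_matrix(rng).T

# Example with a hypothetical 5-second window sampled at 50 Hz.
window = np.random.randn(250, 3)     # stand-in for real tri-axial data
distorted = distort_orientation(window)
```

Because a single rotation is applied per window, the simulated device keeps a fixed but unknown orientation within that window, which matches the kind of distortion the abstract describes.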

Graphical Abstract

To address the distortion introduced by changes in sensing-device placement and orientation during the data collection stage, we propose a method that combines a deep learning model with a machine learning algorithm. Results show that our method improves accuracy by 15% to 40% on a public data set and by 10% to 20% on our own data set, both collected with distortion.
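
The graphical abstract summarizes the approach as a combination of a deep learning feature extractor and a conventional machine learning classifier. The exact architecture is not given on this page, so the following is a minimal sketch of that general pattern, assuming PyTorch and scikit-learn; the layer sizes, the names ConvFeatureExtractor and extract_features, and the commented training flow are illustrative, not the authors' implementation. A 1-D CNN maps multichannel inertial windows to fixed-length features, and an SVM is then trained on those features.

```python
import torch
import torch.nn as nn
from sklearn.svm import SVC

class ConvFeatureExtractor(nn.Module):
    """Small 1-D CNN over multichannel inertial windows; its pooled
    activations serve as learned features for a downstream classifier."""
    def __init__(self, in_channels=3, feat_dim=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, feat_dim, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global pooling -> fixed-size feature
        )

    def forward(self, x):                # x: (batch, channels, time)
        return self.body(x).squeeze(-1)  # (batch, feat_dim)

def extract_features(model, windows):
    """Run the CNN in eval mode and return features as a NumPy array."""
    model.eval()
    with torch.no_grad():
        return model(windows).cpu().numpy()

# Hypothetical usage: train_x is an (N, 3, 250) float tensor, train_y holds labels.
# The CNN would normally be trained first (e.g. with a temporary softmax head)
# before its features are handed to the classical classifier.
# cnn = ConvFeatureExtractor()
# clf = SVC(kernel="rbf").fit(extract_features(cnn, train_x), train_y)
# preds = clf.predict(extract_features(cnn, test_x))
```

In a hybrid of this kind, the CNN is typically trained first with its own classification head and then frozen, so the downstream classifier only sees the learned representation, which the paper argues can be made robust to placement and orientation.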

Cited By

  • (2022) Reliable Machine Learning for Wearable Activity Monitoring. Proceedings of the 41st IEEE/ACM International Conference on Computer-Aided Design:1-9. https://doi.org/10.1145/3508352.3549430
  • (2022) State-of-the-art survey on activity recognition and classification using smartphones and wearable sensors. Multimedia Tools and Applications. 81(1):1077-1108. https://doi.org/10.1007/s11042-021-11410-0

Published In

Transactions on Emerging Telecommunications Technologies, Volume 31, Issue 4
April 2020, 263 pages
EISSN: 2161-3915
DOI: 10.1002/ett.v31.4

Publisher

John Wiley & Sons, Inc., United States
