Recent Progress in Sensing and Computing Techniques for Human Activity Recognition and Motion Analysis
Abstract
1. Introduction
2. Latest Progress in HAR-Related Applications
2.1. Overview of the Latest Technical Progress
- New sensing devices and methods: The acquisition of raw data is the first step toward accurate and effective activity recognition. The falling cost of electronic devices has accelerated the adoption of pervasive sensing and computing systems, such as location and velocity tracking [32]. In terms of sensing techniques, many approaches that were previously impractical because of their cost, size, or technical readiness have now been introduced for human-activity-related studies, in addition to the traditional video cameras (including depth cameras): FMCW radar, CW-Doppler radar, WiFi, ultrasound, radio frequency identification (RFID), and wearable IMU and electromyography (EMG) sensors, etc. [15,16,17,18]. Among these candidates, FMCW radar, CW-Doppler radar, WiFi, ultrasound, and RFID are both NLOS and contactless. The wearable IMU is NLOS and body-worn, so its applications are not limited to specific areas. Micro-electro-mechanical system (MEMS) IMUs, owing to their low power, low cost, and miniature size, as well as their rich sensing output, have become a dominant technical approach for HAR studies [33,34].
- Innovations in mathematical methods: To take advantage of the sensor data, appropriate modeling of the activities of human body parts, pre-processing of raw data, feature extraction, classification, and application-oriented analysis are pivotal enablers for the success of HAR-related functions. With respect to pre-processing, a number of lightweight time-domain filtering algorithms suitable for resource-limited computing, including the KF, extended Kalman filter (EKF), unscented Kalman filter (UKF), and Mahony complementary filter, are common choices for wearable electronics [35,36]. In terms of feature extraction, time-domain and frequency-domain features, including the mean, deviation, zero-crossing rate, and statistical characteristics of amplitude and frequency, are the most fundamental parameters [37]. The fast Fourier transform (FFT), as a classical algorithm, is the main solution for frequency-domain feature analysis. With respect to classification, much effort in classical machine learning, including DT, BN, PCA, SVM, KNN, and ANN, and in deep learning methods, including CNN and RNN, has been devoted to this particular area in recent years [25,38].
- Novel networking and computing paradigms: The sensing and computing platforms are crucial factors for effective human activity recognition. For visual-based solutions, high-performance graphics processing units (GPUs) have paved the way for the intensive computations in HAR [39]. For wearable sensing solutions, many new pervasive and mobile networking and computing techniques for battery-powered electronics have been investigated. Various wireless networking and communication protocols, including Bluetooth, Bluetooth low energy (BLE), Zigbee, and WiFi were introduced for building a body area network (BAN) for HAR [40]. New computation paradigms, such as mobile computing and wearable computing, were proposed to handle the location-free and resource-limited computing scenarios, and sometimes the computation is conducted on an 8-bit or 16-bit micro-controller unit (MCU) [41]. The novel networking and computing paradigms customized for HAR are more efficient and flexible for the related investigation and practices.
- Emerging consumer electronics for HAR: Further evidence of the progress in HAR is the emergence of HAR consumer electronics, including Kinect, Fibaro, the Mi band, and Fitbit. The Kinect-based somatosensory game is a typical use case of recognizing human motion as an input for human–computer interaction [42]. Fibaro detects human motion and changes in location for smart home applications [43]. Some wearable consumer electronics, such as the Mi band and Fitbit, can provide users with health and motion parameters, including heart rate, exercise intensity, walking or running steps, sleep quality evaluation, etc. [44,45]. In addition, there are electronic devices such as the MAX25205 (Maxim, San Jose, USA) used for gesture sensing in automotive applications via an IR-based sensor, where hand swipe gestures and finger and hand rotation can be recognized [46]. These devices have stepped into people’s daily lives, helping people better understand their health and physical activities, or performing automatic control and intelligent recommendation via HAR.
- Convergence with different subject areas: HAR techniques are also converging with many other subject areas, continually enabling new applications. Typically, HAR is merging with medical care, which has resulted in medical treatment and physical rehabilitation methods for diseases such as stroke and Parkinson’s disease [10,12]. HAR has also been introduced into sports analysis for the purpose of enhancing athletes’ performance [7,8]. HAR-assisted daily living is another field of application, where power management, home appliance control, and intelligent recommendations can be implemented to customize the living environment to people’s preferences [2,47].
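To make the feature-extraction step described above concrete, the following is a minimal Python sketch (not taken from any of the surveyed works; the 50 Hz sampling rate and single-axis accelerometer input are illustrative assumptions). It computes the time-domain features mentioned above and uses the FFT to obtain the dominant frequency:

```python
import numpy as np

def extract_features(window, fs=50.0):
    """Basic time- and frequency-domain features from one window of a
    single-axis accelerometer signal (assumed sampling rate fs in Hz)."""
    window = np.asarray(window, dtype=float)
    # Time-domain features: mean, standard deviation, zero-crossing rate.
    mean = window.mean()
    std = window.std()
    centered = window - mean
    zcr = np.mean(np.abs(np.diff(np.sign(centered))) > 0)
    # Frequency-domain features via the FFT: dominant frequency and amplitude.
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    peak = np.argmax(spectrum[1:]) + 1  # skip the (zeroed) DC bin
    return {"mean": mean, "std": std, "zcr": zcr,
            "dom_freq": freqs[peak], "dom_amp": spectrum[peak]}
```

For a periodic activity such as walking, the dominant frequency roughly corresponds to the step cadence, which is why it is among the most fundamental parameters listed above.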
2.2. Widespread Application Domains
- Assisted daily living—HAR is used to recognize people’s gestures and motions for home appliance control, or to analyze people’s activities and habits for living environment customization and optimization.
- Human intent analysis—HAR is used to analyze people’s motions and activities in order to predict people’s intents in a passive way for particular applications, such as intelligent recommendation, crime detection in public areas, etc. [54].
3. Fundamentals
3.1. Common Methodology
- Data acquisition and pre-processing—Motion sensors such as 3D cameras, IMU, IR-radar, CW radar, and ultrasound arrays, either mounted onto a fixed location or body-worn, are used to obtain raw data including human activity information, such as position, acceleration, velocity, angular velocity, and angles. In addition, various noise cancellation techniques, either time-domain or frequency-domain ones, need to be applied to optimize the quality of the signal for further processing.
- Segmentation—Human motion typically occurs within particular time spans, either occasionally or periodically. The signals need to be split into window segments for activity analysis, and the sliding-window technique is usually used to handle activities that span segment boundaries.
- Feature extraction—The features of activities including joint angle, acceleration, velocity and angular velocity, relative angle and position, etc., can be directly obtained for further analysis. There may also be indirect features that can be used for the analysis of particular activities.
- Classification/model application—The extracted features can be used by various machine learning and deep learning algorithms for the classification and recognition of human activities. Some explicit models, such as skeleton segment kinematic models, can be applied directly for decision-making.
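The segmentation step above can be sketched with the usual sliding-window scheme; the window size and 50% overlap in this example are illustrative assumptions, not values prescribed by the surveyed works:

```python
def sliding_windows(signal, window_size, overlap=0.5):
    """Split a sample sequence into fixed-length, overlapping segments.
    `overlap` is the fraction shared by consecutive windows (0.0 to <1.0)."""
    step = max(1, int(window_size * (1.0 - overlap)))
    windows = []
    # Only full windows are emitted; a trailing partial window is dropped.
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return windows
```

Overlapping windows reduce the chance that an activity falling on a segment boundary is missed, at the cost of processing each sample more than once.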
3.2. Modeling of Human Body Parts for HAR
3.3. Identifiable Human Activities
4. Novel Sensing Techniques
4.1. Taxonomy and Evaluation
4.2. Sensing Techniques with Example Applications
- Optical tracking—Optical tracking with one or a few common cameras is the most traditional approach to human activity recognition. The temporal and spatial features of the human body and its activities can be extracted with image processing algorithms, and classification based on these features can be carried out with machine learning or deep learning techniques. This method is competitive in accuracy and has wide application domains, including entertainment, industrial surveillance, public security, etc. With the progress in high-speed cameras and depth sensors in recent years, more innovative investigations and applications are found in both academic studies and practical use. The strengths of optical tracking methods are high accuracy, contactless monitoring, and rich activity information, while the weaknesses are the fixed location of use and the potential risk of privacy leakage.
- RF and acoustic techniques—Sensing techniques working on the radar principle include IR-UWB, FMCW radar, WiFi, and ultrasound. The motion of human body parts or of the whole body can be perceived via the Doppler frequency shift (DFS), and time–frequency signatures can be obtained for further analysis. Machine learning and deep learning techniques are widely used for the classification of human activities. Evidently, the strengths of this method are contactless measurement, NLOS operation, and no risk of privacy leakage. In particular, commodity WiFi takes advantage of the existing communication infrastructure at no added cost. The weakness is that these sensors provide limited and implicit information, which requires specialized processing.
- Inertial sensors—Inertial sensors, especially MEMS IMUs, have become a dominant technical approach for human activity recognition and analysis. They provide 3-axis acceleration, 3-axis angular velocity, and 3-axis magnetometer signals, which can be employed to estimate the attitude and motion of the body parts on which the devices are mounted. The strengths of IMUs in HAR are their miniature size, low cost, low power, and rich information output, which make them competitive for wearable applications without location constraints. Their weakness is contact measurement, which may be inconvenient for daily activity recognition.
- Force sensors and EMG sensors—Force sensors include piezoelectric sensors, FSRs, and thin-film pressure sensors of various materials. They can provide pressure or multi-axis force measurements of human gait, hand grasp, etc., for sports analysis, physical rehabilitation, or bio-feedback. EMG sensors work in a similar way and implement similar functions using the EMG signal. Since EMG reflects muscle activity, it may provide more useful information for medical care, such as rehabilitation and amputation care. The strength of force sensors and EMG sensors is their capability to obtain useful information from local areas of body parts. Their weakness is likewise contact measurement, which may be inconvenient for daily activity recognition.
- Multiple sensor fusion—In addition to the above, there are techniques based on multiple sensor fusion that combine more than one of the above alternatives. For instance, inertial and magnetic sensors are combined for body segment motion observation [24]; an optical sensor, IMU, and force sensor are combined for pose estimation in human bicycle riding [66]; a knee electro-goniometer, foot-switches, and EMG are combined for gait phase and event recognition [95]; and FMCW radar and wearable IMUs are combined for fall detection [96]. The purpose of the combination is to compensate for the limitations of one sensor technology by taking advantage of another, so as to pursue maximal performance. For example, IMU and UWB radar are often combined with real-time data fusion to overcome the continuous error accumulation of the IMU and the NLOS shadowing and random noise of UWB [23].
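As an illustration of the sensor-fusion idea, the sketch below implements a first-order complementary filter (a simplified relative of the Mahony filter mentioned in Section 2.1) that blends gyroscope integration with the accelerometer's gravity direction to estimate pitch; the blending factor `alpha` and sampling interval `dt` are assumed values, not from the cited works:

```python
import math

def complementary_pitch(acc_samples, gyro_rates, dt, alpha=0.98):
    """Fuse gyroscope pitch rate (rad/s) with the accelerometer's gravity
    direction to estimate pitch (rad). The gyro term is smooth but drifts;
    the accelerometer term is noisy but drift-free, so each compensates
    for the other's weakness."""
    pitch = 0.0
    estimates = []
    for (ax, ay, az), gyro_pitch_rate in zip(acc_samples, gyro_rates):
        # Accelerometer-only pitch from the measured gravity vector.
        acc_pitch = math.atan2(-ax, math.hypot(ay, az))
        # Blend: mostly integrated gyro, slowly corrected toward acc_pitch.
        pitch = alpha * (pitch + gyro_pitch_rate * dt) + (1.0 - alpha) * acc_pitch
        estimates.append(pitch)
    return estimates
```

With a static sensor the estimate converges to the accelerometer-derived angle, while during fast motion the gyro term dominates and suppresses accelerometer noise.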
4.3. Body Area Sensor Networks
5. Mathematical Methods
5.1. Features and Feature Extraction
5.2. Classification and Decision-Making
- Threshold-based method—Thresholds can easily be used to recognize many simple gestures, postures, and motions. For example, fall detection, hand shaking, and static standing can be recognized using an acceleration-magnitude threshold, and walking or running can be recognized using Doppler frequency shift thresholds.
- Machine learning techniques—Machine learning is a method of data analysis that automates analytical model building, which makes it suitable for the classification of human activities using sensor data. Its feasibility and efficiency have been demonstrated by many published studies reporting accuracies over 95%. Typically, an HMM achieves an accuracy of 95.71% for gesture recognition [101]; SVM and KNN are employed for human gait classification between walking and running with an accuracy of over 99% [102]; PCA and SVM recognize three different actions for sports training with an accuracy of 97% [9]; and KNN achieves an accuracy of 95% for dynamic activity classification [65]. There are also peer investigations that evaluate different machine learning methods [9,88,103].
- Deep learning methods—Deep learning techniques take advantage of many layers of non-linear information processing for supervised or unsupervised feature extraction and transformation, as well as for pattern analysis and classification. They present advantages over traditional approaches and have gained continual research interest in many fields, including HAR, in recent years. For example, a CNN achieves an accuracy of 97% for the recognition of nine different activities in [31], an accuracy of 95.87% for arm movement classification [64], an accuracy of 99.7% for sEMG-based gesture recognition in [93], and an accuracy of 95% for RFID-based in-car activity recognition in [81]. Past investigations have demonstrated that deep learning is an effective classification approach for human activity recognition. There are also investigations applying long short-term memory (LSTM) [104] or bidirectional LSTM (Bi-LSTM) [76,96], as well as recurrent neural networks, in sports and motion recognition, which have exhibited competitive performance compared to CNNs.
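The threshold-based method above can be sketched as a toy fall detector that flags a free-fall dip in total acceleration magnitude followed by an impact spike; the thresholds (in g) are illustrative assumptions, not validated values from the cited studies:

```python
import math

def detect_fall(acc_samples, free_fall_g=0.4, impact_g=2.5):
    """Toy threshold-based fall detector on 3-axis accelerometer samples.
    A fall signature is near-weightlessness (magnitude below free_fall_g)
    followed by an impact (magnitude above impact_g)."""
    saw_free_fall = False
    for ax, ay, az in acc_samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < free_fall_g:
            saw_free_fall = True       # dip toward 0 g during the fall
        elif saw_free_fall and magnitude > impact_g:
            return True                # impact spike after the free-fall dip
    return False
```

Requiring both events in sequence is what keeps such simple detectors from firing on an isolated jolt, such as setting the device down hard.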
6. Discussions
6.1. Performance of the State-of-the-Art
6.2. Technical Challenges
- Limitation of sensing techniques for convenient use—The sensing techniques, including optical tracking, radar, inertial measurement, force, and EMG sensors, all have limitations in performing accurate, non-contact, location-free human activity sensing. Optical sensors are competitive in accuracy, but they are LOS and limited to specific areas. Radar techniques, such as FMCW, IR-UWB, WiFi, and ultrasound, are NLOS, but they also work only in specific areas and are not as accurate. IMUs, force sensors, and EMG sensors are low cost and low power, making them suitable for wearable applications, but the battery and wireless modules make the devices too bulky for convenient daily use. Sensing techniques that do not interrupt people’s normal daily activities are still lacking.
- Dependency on PCs for data processing—For body-worn sensing devices, data acquisition and pre-processing are usually completed on low-power MCUs, while further processing such as feature extraction and classification is conducted on PCs. This results in the need for a high data transmission rate and the involvement of a powerful PC. The high data rate is also a challenge for deploying multiple sensor nodes to establish a BAN. This traditional division of communication and computation has constrained the application of body-worn HAR.
- Difficulties in minor gesture recognition—Most existing techniques target the recognition of regular, large-scale human gestures, postures, and movements, such as hand gestures, arm swings, gait, and whole-body postures, using classification techniques. It remains a challenge to quantitatively discriminate minor gesture variations, which may be valuable for many applications.
- Specificity in human activity recognition and motion analysis—Most state-of-the-art studies focus on the postures or activities of a particular body part, or on a particular posture or activity of the whole body. The focus usually falls on specific sensing devices and classification techniques, while comprehensive posture and activity recognition, and a deeper understanding of the kinematics of the human body and people’s behaviors based on in-depth analysis of their daily activities, are still lacking.
6.3. Future Perspective
- Unobtrusive human activity monitoring techniques—Sensing techniques that can perform unobtrusive perception of human activities with minimal or no interference to people’s normal activities, and that can be used anywhere, need greater study. Investigations into minimizing the size of sensing devices and into new ways of powering them may become new research interests.
- Optimization of sensing networks and computation power—It is critical to explore the optimal BAN of sensing devices for HAR, and to distribute the computational load between sensor nodes and the central processor to relieve the burden of data transmission on the sensor network. A new paradigm of wearable computing for wearable BANs may become a topic attracting considerable effort.
- Further studies of minor gesture and posture recognition—Further investigation of sensing and feature extraction techniques for minor gesture recognition may find applications in different fields, such as HMI. For example, human finger motion recognition based on UWB Doppler features for operating handheld electronics may be one potential use case.
- Comprehensive recognition and understanding of human activities—Building on the recognition of human gestures and movements, the positional and angular kinematics of the human body and human behaviors can be investigated for further analysis. Research outputs may provide a valuable reference for studies of bionic robotics, as well as for customized human intent estimation and smart recommendations in smart environments.
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Dian, F.J.; Vahidnia, R.; Rahmati, A. Wearables and the Internet of Things (IoT), Applications, Opportunities, and Challenges: A Survey. IEEE Access 2020, 8, 69200–69211. [Google Scholar] [CrossRef]
- Aksanli, B.; Rosing, T.S. Human Behavior Aware Energy Management in Residential Cyber-Physical Systems. IEEE Trans. Emerg. Top. Comput. 2017, 845–857. [Google Scholar] [CrossRef]
- Chen, L.; Hoey, J.; Nugent, C.D.; Cook, D.J.; Yu, Z. Sensor-based Activity Recognition. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 790–808. [Google Scholar] [CrossRef]
- Ziaeefard, M.; Bergevin, R. Semantic Human Activity Recognition: A Literature Review. Pattern Recognit. 2015, 48, 2329–2345. [Google Scholar] [CrossRef]
- Zhang, F. Human–Computer Interactive Gesture Feature Capture and Recognition in Virtual Reality. Ergon. Des. Q. Hum. Factors Appl. 2020. [Google Scholar] [CrossRef]
- Vrigkas, M.; Nikou, C.; Kakadiaris, I.A. A review of Human Activity Recognition Methods. Front. Robot. AI 2015, 2, 28. [Google Scholar] [CrossRef]
- Kim, Y.J.; Kim, K.D.; Kim, S.H.; Lee, S.; Lee, H.S. Golf Swing Analysis System with a Dual Band and Motion Analysis Algorithm. IEEE Trans. Consum. Electron. 2017, 63, 309–316. [Google Scholar] [CrossRef]
- Dadashi, F.; Millet, G.P.; Aminian, K. Gaussian Process Framework for Pervasive Estimation of Swimming Velocity with Body-worn IMU. Electron. Lett. 2013, 49, 44–46. [Google Scholar] [CrossRef]
- Wang, Y.; Chen, M.; Wang, X.; Chan, R.H.M.; Li, W.J. IoT for Next Generation Racket Sports Training. IEEE Internet Things J. 2018, 5, 4559–4566. [Google Scholar] [CrossRef]
- Wang, L.; Sun, Y.; Li, Q.; Liu, T.; Yi, J. Two Shank-Mounted IMUs-based Gait Analysis and Classification for Neurological Disease Patients. IEEE Robot. Autom. Lett. 2020, 5, 1970–1976. [Google Scholar] [CrossRef]
- Connolly, J.; Condell, J.; O’Flynn, B.; Sanchez, J.T.; Gardiner, P. IMU Sensor-Based Electronic Goniometric Glove for Clinical Finger Movement Analysis. IEEE Sens. J. 2018, 18, 1273–1281. [Google Scholar] [CrossRef]
- Nguyen, H.; Lebel, K.; Bogard, S.; Goubault, E.; Boissy, P.; Duval, C. Using Inertial Sensors to Automatically Detect and Segment Activities of Daily Living in People with Parkinson’s Disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 197–204. [Google Scholar] [CrossRef] [PubMed]
- Mukhopadhyay, S.C. Wearable Sensors for Human Activity Monitoring: A Review. IEEE Sens. J. 2015, 15, 1321–1330. [Google Scholar] [CrossRef]
- Lara, O.D.; Labrador, M.A. A Survey on Human Activity Recognition Using Wearable Sensors. IEEE Commun. Surv. Tutor. 2013, 15, 1192–1209. [Google Scholar] [CrossRef]
- Rossol, N.; Cheng, I.; Basu, A. A Multisensor Technique for Gesture Recognition Through Intelligent Skeletal Pose Analysis. IEEE Trans. Hum. Mach. Syst. 2016, 46, 350–359. [Google Scholar] [CrossRef]
- Vaishnav, P.; Santra, A. Continuous Human Activity Classification with Unscented Kalman Filter Tracking Using FMCW Radar. IEEE Sens. J. 2020, 4, 7001704. [Google Scholar] [CrossRef]
- Rana, S.P.; Dey, M.; Ghavami, M.; Dudley, S. Non-Contact Human Gait Identification Through IR-UWB Edge-based Monitoring Sensor. IEEE Sens. J. 2019, 19, 9282–9293. [Google Scholar] [CrossRef]
- Wang, H.; Zhang, D.; Wang, Y.; Ma, Y.; Li, S. RT-Fall: A Real-Time and Contactless Fall Detection System with Commodity WiFi Devices. IEEE Trans. Mob. Comput. 2017, 16, 511–526. [Google Scholar] [CrossRef]
- Tariq, O.B.; Lazarescu, M.T.; Lavagno, L. Neural Networks for Indoor Human Activity Reconstructions. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
- Li, T.; Fong, S.; Wong, K.K.L.; Wu, Y.; Yang, X.-S.; Li, X. Fusing Wearable and Remote Sensing Data Streams by Fast Incremental Learning with Swarm Decision Table for Human Activity Recognition. Inf. Fusion 2020, 60, 41–64. [Google Scholar] [CrossRef]
- Paoletti, M.; Belli, A.; Palma, L.; Vallasciani, M.; Pierleoni, P. A Wireless Body Sensor Network for Clinical Assessment of the Flexion-Relaxation Phenomenon. Electronics 2020, 9, 1044. [Google Scholar] [CrossRef]
- Baldi, T.; Farina, F.; Garulli, A.; Giannitrapani, A.; Prattichizzo, D. Upper Body Pose Estimation Using Wearable Inertial Sensors and Multiplicative Kalman Filter. IEEE Sens. J. 2020, 20, 492–500. [Google Scholar] [CrossRef] [Green Version]
- Zhang, H.; Zhang, Z.; Gao, N.; Xiao, Y.; Meng, Z.; Li, Z. Cost-Effective Wearable Indoor Localization and Motion Analysis via the Integration of UWB and IMU. Sensors 2020, 20, 344. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Fourati, H.; Manamanni, N.; Afilal, L.; Handrich, Y. Complementary Observer for Body Segments Motion Capturing by Inertial and Magnetic Sensor. IEEE ASME Trans. Mechatron. 2014, 19, 149–157. [Google Scholar] [CrossRef] [Green Version]
- Ferrari, A.; Micucci, D.; Mobilio, M.; Napoletano, P. On the Personalization of Classification Models for Human Activity Recognition. IEEE Access 2020, 8, 32066. [Google Scholar] [CrossRef]
- Stikic, M.; Larlus, D.; Ebert, S.; Schiele, B. Weakly Supervised Recognition of Daily Life Activities with Wearable Sensors. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 2521–2537. [Google Scholar] [CrossRef]
- Ghazal, S.; Khan, U.S.; Saleem, M.M.; Rashid, N.; Iqbal, J. Human Activity Recognition Using 2D Skeleton Data and Supervised Machine Learning. IET Image Process. 2019, 13, 2572–2578. [Google Scholar] [CrossRef]
- Manzi, A.; Dario, P.; Cavallo, F. A human Activity Recognition System based on Dynamic Clustering of Skeleton Data. Sensors 2017, 17, 1100. [Google Scholar] [CrossRef] [Green Version]
- Plotz, T.; Guan, Y. Deep Learning for Human Activity Recognition in Mobile Computing. Computer 2018, 51, 50–59. [Google Scholar] [CrossRef]
- Xu, C.; Chai, D.; He, J.; Zhang, X.; Duan, S. InnoHAR: A Deep Neural Network for Complex Human Activity Recognition. IEEE Access 2019, 7, 9893. [Google Scholar] [CrossRef]
- Bianchi, V.; Bassoli, M.; Lombardo, G.; Fornacciari, P.; Mordonini, M.; Munari, I.D. IoT Wearable Sensor and Deep Learning: An Integrated Approach for Personalized Human Activity Recognition in a Smart Home Environment. IEEE Internet Things J. 2019, 6, 8553–8562. [Google Scholar] [CrossRef]
- Yuan, Q.; Chen, I.-M. Localization and Velocity Tracking of Human via 3 IMU Sensors. Sens. Actuators A Phys. 2014, 212, 25–33. [Google Scholar] [CrossRef]
- Tian, Y.; Meng, X.; Tao, D.; Liu, D.; Feng, C. Upper Limb Motion Tracking with the Integration of IMU and Kinect. Neurocomputing 2015, 159, 207–218. [Google Scholar] [CrossRef]
- Zihajehzadeh, S.; Loh, D.; Lee, T.J.; Hoskinson, R.; Park, E.J. A Cascaded Kalman Filter-based GPS/MEMS-IMU Integration for Sports Applications. Measurement 2018, 73, 200–210. [Google Scholar] [CrossRef]
- Tong, X.; Li, Z.; Han, G.; Liu, N.; Su, Y.; Ning, J.; Yang, F. Adaptive EKF based on HMM Recognizer for Attitude Estimation Using MEMS MARG Sensors. IEEE Sens. J. 2018, 18, 3299–3310. [Google Scholar] [CrossRef]
- Enayati, N.; Momi, E.D.; Ferrigno, G. A Quaternion-Based Unscented Kalman Filter for Robust Optical/Inertial Motion Tracking in Computer-Assisted Surgery. IEEE Trans. Instrum. Meas. 2015, 64, 2291–2301. [Google Scholar] [CrossRef] [Green Version]
- Wagstaff, B.; Peretroukhin, V.; Kelly, J. Robust Data-Driven Zero-Velocity Detection for Foot-Mounted Inertial Navigation. IEEE Sens. J. 2020, 20, 957–967. [Google Scholar] [CrossRef] [Green Version]
- Jia, H.; Chen, S. Integrated Data and Knowledge Driven Methodology for Human Activity Recognition. Inf. Sci. 2020, 536, 409–430. [Google Scholar] [CrossRef]
- Khaire, P.; Kumar, P.; Imran, J. Combining CNN Streams of RGB-D and Skeletal Data for Human Activity Recognition. Pattern Recognit. Lett. 2018, 115, 107–116. [Google Scholar] [CrossRef]
- Xie, X.; Huang, G.; Zarei, R.; Ji, Z.; Ye, H.; He, J. A Novel Nest-Based Scheduling Method for Mobile Wireless Body Area Networks. Digit. Commun. Netw. 2020. [Google Scholar] [CrossRef]
- Zhou, X. Wearable Health Monitoring System based on Human Motion State Recognition. Comput. Commun. 2020, 150, 62–71. [Google Scholar] [CrossRef]
- Li, G.; Li, C. Learning Skeleton Information for Human Action Analysis Using Kinect. Signal Process. Image Commun. 2020, 84, 115814. [Google Scholar] [CrossRef]
- Motion Sensor—Motion, Light and Temperature Sensor. Available online: https://www.fibaro.com/en/products/motion-sensor/ (accessed on 12 August 2020).
- Technology That’s Inventing the Future. Available online: https://www.fitbit.com/us/technology (accessed on 12 August 2020).
- Mi Band—Understand Your Every Move. Available online: https://www.mi.com/global/miband (accessed on 12 August 2020).
- MAX25205—Gesture Sensor for Automotive Applications. Available online: https://www.maximintegrated.com/en/products/sensors/MAX25205.html (accessed on 12 August 2020).
- De, P.; Chatterjee, A.; Rakshit, A. Recognition of Human Behavior for Assisted Living Using Dictionary Learning Approach. IEEE Sens. J. 2018, 16, 2434–2441. [Google Scholar] [CrossRef]
- Lima, Y.; Gardia, A.; Pongsakornsathien, N.; Sabatini, R.; Ezer, N.; Kistan, T. Experimental Characterization of Eye-tracking Sensors for Adaptive Human-Machine Systems. Measurement 2019, 140, 151–160. [Google Scholar] [CrossRef]
- Shu, Y.; Xiong, C.; Fan, S. Interactive Design of Intelligent Machine Vision based on Human–Computer Interaction Mode. Microprocess. Microsyst. 2020, 75, 103059. [Google Scholar] [CrossRef]
- Anitha, G.; Priya, S.B. Posture based Health Monitoring and Unusual Behavior Recognition System for Elderly Using Dynamic Bayesian Network. Clust. Comput. 2019, 22, 13583–13590. [Google Scholar] [CrossRef]
- Wang, Z.; Wang, J.; Zhao, H.; Qiu, S.; Li, J.; Gao, F.; Shi, X. Using Wearable Sensors to Capture Posture of the Human Lumbar Spine in Competitive Swimming. IEEE Trans. Hum. Mach. Syst. 2019, 49, 194–205. [Google Scholar] [CrossRef]
- Concepción, M.A.; Morillo, L.M.S.; García, J.A.A.; González-Abril, L. Mobile Activity Recognition and Fall Detection System for Elderly People Using Ameva Algorithm. Pervasive Mob. Comput. 2017, 34, 3–13. [Google Scholar] [CrossRef] [Green Version]
- Hbali, Y.; Hbali, S.; Ballihi, L.; Sadgal, M. Skeleton-based Human Activity Recognition for Elderly Monitoring Systems. IET Comput. Vis. 2018, 12, 16–26. [Google Scholar] [CrossRef]
- Yu, Z.; Lee, M. Human Motion based Intent Recognition Using a Deep Dynamic Neural Model. Robot. Auton. Syst. 2015, 71, 134–149. [Google Scholar] [CrossRef]
- Bragança, H.; Colonna, J.G.; Lima, W.S.; Souto, E. A Smartphone Lightweight Method for Human Activity Recognition Based on Information Theory. Sensors 2018, 20, 1856. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Cardenas, E.J.E.; Chavez, G.C. Multimodal Hand Gesture Recognition Combining Temporal and Pose Information based on CNN Descriptors and Histogram of Cumulative Magnitude. J. Vis. Commun. Image Represent. 2020, 71, 102772. [Google Scholar] [CrossRef]
- Lima, W.S.; Souto, E.; El-Khatib, K.; Jalali, R.; Gama, J. Human Activity Recognition Using Inertial Sensors in a Smartphone: An Overview. Sensors 2019, 19, 3213. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Smedt, Q.D.; Wannous, H.; Vandeborre, J.-P. Heterogeneous Hand Gesture Recognition Using 3D Dynamic Skeletal Data. Comput. Vis. Image Underst. 2019, 181, 60–72. [Google Scholar] [CrossRef] [Green Version]
- Muller, P.; Begin, M.-A.; Schauer, T.; Seel, T. Alignment-Free, Self-Calibrating Elbow Angles Measurement Using Inertial Sensors. IEEE J. Biomed. Health Inform. 2017, 21, 312–319. [Google Scholar] [CrossRef]
- Kianifar, R.; Lee, A.; Raina, S.; Kulic, N. Automated Assessment of Dynamic Knee Valgus and Risk of Knee Injury During the Single Leg Squat. IEEE J. Transl. Eng. Health Med. 2017, 5, 2100213. [Google Scholar] [CrossRef]
- Baghdadi, A.; Cavuoto, L.A.; Crassidis, J.H. Hip and Trunk Kinematics Estimation in Gait Through Kalman Filter Using IMU Data at the Ankle. IEEE Sens. J. 2018, 18, 4243–4260. [Google Scholar] [CrossRef]
- Zhu, C.; Sheng, W. Wearable Sensor-Based Hand Gesture and Daily Activity Recognition for Robot-Assisted Living. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2011, 41, 569–573. [Google Scholar] [CrossRef]
- Guo, Z.; Xiao, F.; Sheng, B.; Fei, H.; Yu, S. WiReader: Adaptive Air Handwriting Recognition Based on Commercial Wi-Fi Signal. IEEE Internet Things J. 2020. [Google Scholar] [CrossRef]
- Panwar, M.; Biswas, D.; Bajaj, H.; Jobges, M.; Turk, R.; Maharatna, K.; Acharyya, A. Rehab-Net: Deep Learning Framework for Arm Movement Classification Using Wearable Sensors for Stroke Rehabilitation. IEEE Trans. Biomed. Eng. 2019, 66, 3026–3037. [Google Scholar] [CrossRef]
- Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D. A Comparison of Feature Extraction Methods for the Classification of Dynamic Activities from Accelerometer Data. IEEE Trans. Biomed. Eng. 2009, 56, 871–879. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Chen, K.; Yi, J.; Liu, T.; Pan, Q. Whole-Body Pose Estimation in Human Bicycle Riding Using a Small Set of Wearable Sensors. IEEE ASME Trans. Mech. 2016, 21, 163–174. [Google Scholar] [CrossRef]
- Ross, G.; Dowling, B.; Troje, N.F.; Fischer, S.L.; Graham, R.B. Objectively Differentiating Whole-body Movement Patterns between Elite and Novice Athletes. Med. Sci. Sports Exerc. 2018, 50, 1457–1464. [Google Scholar] [CrossRef]
- Dahmani, D.; Larabi, S. User-Independent System for Sign Language Finger Spelling Recognition. J. Vis. Commun. Image Represent. 2014, 25, 1240–1250. [Google Scholar] [CrossRef]
- Ni, P.; Lv, S.; Zhu, X.; Cao, Q.; Zhang, W. A Light-weight On-line Action Detection with hand Trajectory for Industrial Surveillance. Digit. Commun. Netw. 2020. [Google Scholar] [CrossRef]
- Yagi, K.; Sugiura, Y.; Hasegawa, K.; Saito, H. Gait Measurement at Home Using A Single RGB Camera. Gait Posture 2020, 76, 136–140. [Google Scholar] [CrossRef] [PubMed]
- Devanne, M.; Berretti, S.; Pala, P.; Wannous, H.; Daoudi, M.; Bimbo, A.D. Motion Segment Decomposition of RGB-D Sequence of Human Behavior Understanding. Pattern Recognit. 2017, 61, 222–233. [Google Scholar] [CrossRef] [Green Version]
- Xu, W.; Su, P.; Cheung, S.S. Human Body Reshaping and Its Application Using Multiple RGB-D Sensors. Signal Process. Image Commun. 2019, 79, 71–81. [Google Scholar] [CrossRef]
- Wu, H.; Huang, Z.; Hu, B.; Yu, Z.; Li, X.; Gao, M.; Shen, Z. Real-Time Continuous Action Recognition Using Pose Contexts with Depth Sensors. IEEE Access 2018, 6, 51708. [Google Scholar] [CrossRef]
- Ding, C.; Zhang, L.; Gu, C.; Bai, L.; Liao, Z.; Hong, H.; Li, Y.; Zhu, X. Non-Contact Human Motion Recognition based on UWB Radar. IEEE J. Emerg. Sel. Top. Circuits Syst. 2018, 8, 306–315. [Google Scholar] [CrossRef]
- Kim, S.-H.; Geem, Z.W.; Han, G.-T. A Novel Human Respiration Pattern Recognition Using Signals of Ultra-Wideband Radar Sensors. Sensors 2019, 19, 3340. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shrestha, A.; Li, H.; Kernec, J.L.; Fioranelli, F. Continuous Human Activity Classification from FMCW Radar with Bi-LSTM Networks. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
- Wang, Y.; Zheng, Y. An FMCW Radar Transceiver Chip for Object Positioning and Human Limb Motion Detection. IEEE Sens. J. 2017, 17, 236–237. [Google Scholar] [CrossRef]
- Han, Z.; Lu, Z.; Wen, X.; Zhao, J.; Guo, L.; Liu, Y. In-Air Handwriting by Passive Gesture Tracking Using Commodity WiFi. IEEE Commun. Lett. 2020. [Google Scholar] [CrossRef]
- Li, C.; Liu, M.; Cao, Z. WiHF: Gesture and User Recognition with WiFi. IEEE Trans. Mob. Comput. 2020. [Google Scholar] [CrossRef]
- Oguntala, G.A.; Abd-Alhameed, R.A.; Hu, Y.-F.; Noras, J.M.; Eya, N.N.; Elfergani, I.; Rodriguez, J. SmartWall: Novel RFID-Enabled Ambient Human Activity Recognition Using Machine Learning for Unobtrusive Health Monitoring. IEEE Access 2019, 7, 68022. [Google Scholar] [CrossRef]
- Wang, F.; Liu, J.; Gong, W. Multi-Adversarial In-Car Activity Recognition Using RFIDs. IEEE Trans. Mob. Comput. 2020. [Google Scholar] [CrossRef]
- Jahanandish, M.H.; Fey, N.P.; Hoyt, K. Lower Limb Motion Estimation Using Ultrasound Imaging: A Framework for Assistive Device Control. IEEE J. Biomed. Health Inform. 2019, 23, 2505–2514. [Google Scholar] [CrossRef]
- Zhou, F.; Li, X.; Wang, Z. Efficient High Cross-User Recognition Rate Ultrasonic Hand Gesture Recognition System. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
- Ling, K.; Dai, H.; Liu, Y.; Liu, A.X. UltraGesture: Fine-Grained Gesture Sensing and Recognition. In Proceedings of the 15th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON), Hong Kong, China, 11–13 June 2018. [Google Scholar]
- Vanrell, S.R.; Milone, D.H.; Rufiner, H.L. Assessment of Homomorphic Analysis for Human Activity Recognition from Acceleration Signals. IEEE J. Biomed. Health Inform. 2018, 22, 1001–1010. [Google Scholar] [CrossRef]
- Hsu, Y.-L.; Yang, S.-C.; Chang, H.-C.; Lai, H.-C. Human Daily and Sport Activity Recognition Using a Wearable Inertial Sensor Network. IEEE Access 2018, 6, 31715. [Google Scholar] [CrossRef]
- Villeneuve, E.; Harwin, W.; Holderbaum, W.; Janko, B.; Sherratt, R.S. Reconstruction of Angular Kinematics From Wrist-Worn Inertial Sensor Data for Smart Home Healthcare. IEEE Access 2017, 5, 2351. [Google Scholar] [CrossRef]
- Chelli, A.; Patzold, M. A Machine Learning Approach for Fall Detection and Daily Living Activity Recognition. IEEE Access 2019, 7, 38670. [Google Scholar] [CrossRef]
- Cha, Y.; Kim, H.; Kim, D. Flexible Piezoelectric Sensor-Based Gait Recognition. Sensors 2018, 18, 468. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wang, H.; Tong, Y.; Zhao, X.; Tang, Q.; Liu, Y. Flexible, High-Sensitive, and Wearable Strain Sensor based on Organic Crystal for Human Motion Detection. Org. Electron. 2018, 61, 304–311. [Google Scholar] [CrossRef]
- Redd, C.B.; Bamberg, S.J.M. A Wireless Sensory Feedback Device for Real-time Gait Feedback and Training. IEEE ASME Trans. Mechatron. 2012, 17, 425–433. [Google Scholar] [CrossRef]
- Jarrassé, N.; Nicol, C.; Touillet, A.; Richer, F.; Martinet, N.; Paysant, J.; Graaf, J.B. Classification of Phantom Finger, Hand, Wrist, and Elbow Voluntary Gestures in Transhumeral Amputees with sEMG. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 68–77. [Google Scholar] [CrossRef] [Green Version]
- Li, Z.; Guan, X.; Zou, K.; Xu, C. Estimation of Knee Movement from Surface EMG Using Random Forest with Principal Component Analysis. Electronics 2020, 9, 43. [Google Scholar] [CrossRef] [Green Version]
- Raurale, S.A.; McAllister, J.; Rincon, J.M. Real-Time Embedded EMG Signal Analysis for Wrist-Hand Pose Identification. IEEE Trans. Signal Process. 2020, 68, 2713–2723. [Google Scholar] [CrossRef]
- Di Nardo, F.; Morbidoni, C.; Cucchiarelli, A.; Fioretti, S. Recognition of Gait Phases with a Single Knee Electrogoniometer: A Deep Learning Approach. Electronics 2020, 9, 355. [Google Scholar] [CrossRef] [Green Version]
- Li, H.; Shrestha, A.; Heidari, H.; Kernec, J.L.; Fioranelli, F. Bi-LSTM Network for Multimodal Continuous Human Activity Recognition and Fall Detection. IEEE Sens. J. 2020, 20, 1191–1201. [Google Scholar] [CrossRef] [Green Version]
- Lee, H.-C.; Ke, K.-H. Monitoring of Large-Area IoT Sensors Using a LoRa Wireless Mesh Network System: Design and Evaluation. IEEE Trans. Instrum. Meas. 2018, 67, 2177–2187. [Google Scholar] [CrossRef]
- Kim, J.-Y.; Park, G.; Lee, S.-A.; Nam, Y. Analysis of Machine Learning-based Assessment of Elbow Spasticity Using Inertial Sensors. Sensors 2020, 20, 1622. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lee, J.K.; Han, S.J.; Kim, K.; Kim, Y.H.; Lee, S. Wireless Epidermal Six-Axis Inertial Measurement Units for Real-time Joint Angle Estimation. Appl. Sci. 2020, 10, 2240. [Google Scholar] [CrossRef] [Green Version]
- Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.J.; Howard, D.; Meijer, K.; Crompton, R. Activity identification Using Body-mounted Sensors—A Review of Classification Techniques. Physiol. Meas. 2009, 30, 1–33. [Google Scholar] [CrossRef]
- Sun, H.; Lu, Z.; Chen, C.-L.; Cao, J.; Tan, Z. Accurate Human Gesture Sensing with Coarse-grained RF Signatures. IEEE Access 2019, 7, 81228. [Google Scholar] [CrossRef]
- Piris, C.; Gärtner, L.; González, M.A.; Noailly, J.; Stöcker, F.; Schönfelder, M.; Adams, T.; Tassani, S. In-Ear Accelerometer-Based Sensor for Gait Classification. IEEE Sens. J. 2020. [Google Scholar] [CrossRef]
- Vu, C.C.; Kim, J. Human Motion Recognition by Textile Sensors based on Machine Learning Algorithms. Sensors 2018, 18, 3109. [Google Scholar] [CrossRef] [Green Version]
- Barut, O.; Zhou, L.; Luo, Y. Multi-task LSTM Model for Human Activity Recognition and Intensity Estimation Using Wearable Sensor Data. IEEE Internet Things J. 2020. [Google Scholar] [CrossRef]
- Christ, M.; Braun, N.; Neuffer, J.; Kempa-Liehr, A.W. Time Series FeatuRe Extraction on basis of Scalable Hypothesis Tests (Tsfresh—A Python Package). Neurocomputing 2018, 307, 72–77. [Google Scholar] [CrossRef]
| Human Body Parts | Activities | Example Applications |
|---|---|---|
| Hand | Finger movement | Rheumatoid arthritis treatment [11] |
| Hand | Hand gesture | Robot-assisted living [62] |
| Hand | Hand movement | Handwriting recognition [63] |
| Upper limb | Forearm movement | Home health care [64] |
| Upper limb | Arm swing | Golf swing analysis [7] |
| Upper limb | Elbow angle | Stroke rehabilitation [59] |
| Lower limb | Gait analysis | Gait rehabilitation training [10] |
| Lower limb | Knee angles in movement | Risk assessment of knee injury [60] |
| Lower limb | Ankle movement | Hip and trunk kinematics [61] |
| Lower limb | Stairs ascent/descent | Human activity classification [65] |
| Spine | Swimming | Swimming motion evaluation [51] |
| Whole body | Fall detection | Elderly care [18] |
| Whole body | Whole-body posture | Bicycle riding analysis [66] |
| Whole body | Whole-body movement | Differentiating elite and novice athletes [67] |
| Type | Sensing Techniques | Typical Applications |
|---|---|---|
| Optical | RGB camera | Finger spelling recognition [68], detection of hand trajectories for industrial surveillance [69], gait measurement at home [70] |
| Optical | RGB-D | Hand gesture recognition [51], human behavior understanding [71], human body reshaping [72] |
| Optical | Depth sensor | 3D dynamic gesture recognition [58], real-time continuous action recognition [73], full hand pose recognition [15] |
| RF | IR-UWB 1 | Non-contact human gait identification [17], non-contact human motion recognition [74], human respiration pattern recognition [75] |
| RF | FMCW radar | Continuous human activity classification [16,76], human limb motion detection [77] |
| RF | Commodity WiFi | Contactless fall detection [18], handwriting recognition [63,78], gesture and user recognition [79] |
| RF | UHF 2 RFID | Ambient assisted living [80], in-car activity recognition [81] |
| Acoustic | Ultrasound | Lower limb motion estimation [82], hand gesture recognition [83], finger motion perception and recognition [84] |
| Inertial | 3-axis accelerometer | Discrimination of human activity [85], arm movement classification for stroke rehabilitation [64] |
| Inertial | 6- or 9-axis IMU | Recognition of daily and sport activity [86], hip and trunk motion estimation [61], assessment of leg squat [60], measurement of elbow angles for physical rehabilitation [59], reconstruction of angular kinematics for smart home healthcare [87], fall detection and daily activity recognition [88] |
| Force | Piezoelectric, FSR 3, thin film strain sensor, etc. | Wearable gait recognition [89], human motion detection [90], real-time gait feedback and training [91] |
| EMG | EMG electrodes | Classification of finger, hand, wrist, and elbow gestures [92]; knee movement estimation [93]; wrist-hand pose identification [94] |

1 IR-UWB: impulse-radio ultra-wideband. 2 UHF: ultra-high frequency. 3 FSR: force-sensitive resistor.
| Standards | Range | Maximum Data Rate | Carrier Frequency | Energy Consumption | Network Topology |
|---|---|---|---|---|---|
| Bluetooth (Ver. 4.2) | 10 m | 2 Mbps | 2.4 GHz | Medium | P2P |
| BLE | 10 m | 1 Mbps | 2.4 GHz | Low | Star, mesh |
| ZigBee | 50 m | 250 Kbps | 2.4 GHz | Low | Star, tree, mesh |
| WiFi (802.11n) | 10–50 m | 450 Mbps | 2.4 GHz | High | Star, tree, mesh, P2P |
| Types | Examples of Features |
|---|---|
| Heuristic | Minimum, maximum, zero-crossing, threshold amplitude, inclination angles, signal magnitude area (SMA), etc. |
| Time domain | Mean, variance, standard deviation (SD), skewness, kurtosis, root mean square (RMS), median absolute square (MAS), interquartile range, minimum, auto-correlation, cross-correlation, motion length, motion cycle duration, kinematic asymmetry, etc. |
| Frequency domain | Spectral distribution, coefficients sum, spectral entropy, etc. |
| Time–frequency | Approximation coefficients, detail coefficients, transition times, etc. |
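Several of the tabulated time- and frequency-domain features can be computed directly from a raw 1-D sensor stream with NumPy. The sketch below is illustrative only (function and feature names are our own, not taken from any cited work); it assumes a uniformly sampled signal and uses the FFT power spectrum for the spectral features:

```python
import numpy as np

def extract_features(signal, fs=50.0):
    """Compute a subset of the tabulated features for a 1-D signal
    sampled at fs Hz. Illustrative sketch, not from any cited paper."""
    x = np.asarray(signal, dtype=float)

    feats = {
        # Time-domain / heuristic features
        "mean": float(x.mean()),
        "sd": float(x.std()),
        "rms": float(np.sqrt(np.mean(x ** 2))),
        "min": float(x.min()),
        "max": float(x.max()),
        "iqr": float(np.subtract(*np.percentile(x, [75, 25]))),
        # Sign changes between consecutive samples
        "zero_crossings": int(np.sum(np.diff(np.signbit(x)) != 0)),
    }

    # Frequency-domain features from the one-sided FFT power spectrum
    # (mean removed so the DC bin does not dominate)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    p = power / power.sum()
    nz = p[p > 0]
    feats["spectral_entropy"] = float(-np.sum(nz * np.log2(nz)))
    feats["dominant_freq"] = float(freqs[np.argmax(power)])
    return feats
```

For example, a 2 Hz sinusoid sampled at 50 Hz yields `dominant_freq` of 2.0 and an RMS near 1/√2; windowed versions of such feature vectors are what the classifiers listed in Section 2.1 consume.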
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Meng, Z.; Zhang, M.; Guo, C.; Fan, Q.; Zhang, H.; Gao, N.; Zhang, Z. Recent Progress in Sensing and Computing Techniques for Human Activity Recognition and Motion Analysis. Electronics 2020, 9, 1357. https://doi.org/10.3390/electronics9091357