Authors:
Hisham Madcor (1); Osama Adel (1) and Walid Gomaa (1, 2)
Affiliations:
(1) Cyber-Physical Systems Lab, Department of Computer Science and Engineering, Egypt-Japan University of Science and Technology, Borg El-Arab, Alexandria, Egypt
(2) Faculty of Engineering, Alexandria University, Alexandria, Egypt
Keyword(s):
Sensor Position, Multi-sensory, Inertial Sensors, Human-centered Computing.
Abstract:
Human Activity Recognition has gained tremendous momentum in recent years, owing to the increasing ubiquity of sensors in commodity devices such as smartphones, smartwatches, and tablets. These devices provide the ordinary user with a continuous stream of data, including visual data, inertial motion data, and audio. In this paper we focus on data streamed from inertial measurement units (IMUs), which are currently embedded in almost all wearable devices, including smartwatches and wrist bands. In many research works, as well as in many real applications, specialized IMUs are mounted on different body parts. In the current work, we try to answer the following question: given the streamed inertial signals of a gait pattern, as well as of some other activities, determine which sensor location on the subject's body generated the signal. We validate our work on several datasets that contain multi-dimensional measurements from a multitude of sensors mounted on different body parts; the main sensors used are the accelerometer and the gyroscope. We use the Random Forest classifier over the raw data, without any prior feature extraction. This has proven very effective, as evidenced by the results under different metrics, including accuracy, precision, recall, and F1-score. An important application of such research is data augmentation of time-series inertial data. It can also serve healthcare applications, for example in treatment assessment for people with motion disabilities.
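The pipeline described in the abstract, a Random Forest trained directly on raw (flattened) IMU windows to predict sensor location, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual experimental setup: the number of locations, the window length, and the class-dependent signal offsets are all assumptions made so the example is self-contained and separable.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical setup: 3 sensor locations (e.g. wrist, ankle, chest),
# each streaming 6-channel IMU windows (3-axis accel + 3-axis gyro).
N_PER_CLASS, WINDOW, CHANNELS = 100, 50, 6

X, y = [], []
for label in range(3):
    # Synthetic stand-in for real IMU recordings: each location gets a
    # different mean offset so the classes are clearly separable.
    windows = rng.normal(loc=label, scale=1.0,
                         size=(N_PER_CLASS, WINDOW, CHANNELS))
    X.append(windows.reshape(N_PER_CLASS, -1))  # raw window, flattened
    y.append(np.full(N_PER_CLASS, label))
X, y = np.vstack(X), np.concatenate(y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Random Forest over the raw samples -- no hand-crafted features.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"sensor-location accuracy: {acc:.2f}")
```

On real data the per-location differences come from placement-specific motion dynamics rather than an artificial mean shift, but the classifier interface is the same: one label per window, one flattened raw window per sample.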