sensors

Article

Smart Wearables with Sensor Fusion for Fall Detection in Firefighting

Xiaoqing Chai 1, Renjie Wu 1, Matthew Pike 1, Hangchao Jin 2, Wan-Young Chung 3 and Boon-Giin Lee 1,*

1 School of Computer Science, Faculty of Science and Engineering, The University of Nottingham Ningbo China, Ningbo 315100, China; xiaoqing.chai@nottingham.edu.cn (X.C.); renjie.wu@nottingham.edu.cn (R.W.); matthew.pike@nottingham.edu.cn (M.P.)
2 Ningbo Municipal Public Security Fire Brigade, Haishu Detachment, Ningbo 315100, China; jhc_2006@163.com
3 Department of Electronic Engineering, Pukyong National University, Busan 48513, Korea; wychung@pknu.ac.kr
* Correspondence: boon-giin.lee@nottingham.edu.cn

Abstract: During the past decade, falling has been one of the top three causes of death amongst firefighters in China. Even though there are many studies on fall-detection systems (FDSs), the majority use a single motion sensor. Furthermore, few existing studies have considered the impact sensor placement and positioning have on fall-detection performance; most are targeted toward fall detection of the elderly. Unfortunately, floor cracks and unstable building structures in the fireground increase the difficulty of detecting the fall of a firefighter. In particular, the movement activities of firefighters are more varied; hence, distinguishing fall-like activities from actual falls is a significant challenge. This study proposed a smart wearable FDS for firefighter fall detection by integrating motion sensors into the firefighter's personal protective clothing on the chest, elbows, wrists, thighs, and ankles. The firefighter's fall activities are detected by the proposed multisensory recurrent neural network, and the performances of different combinations of inertial measurement units (IMUs) on different body parts were also investigated. The results indicated that the sensor fusion of IMUs from all five proposed body parts achieved performances of 94.10%, 92.25%, and 94.59% in accuracy, sensitivity, and specificity, respectively.

Keywords: fall detection system; deep learning; wearable IoT technology; inertial measurement unit (IMU); multisensory fusion

Citation: Chai, X.; Wu, R.; Pike, M.; Jin, H.; Chung, W.-Y.; Lee, B.-G. Smart Wearables with Sensor Fusion for Fall Detection in Firefighting. Sensors 2021, 21, 6770. https://doi.org/10.3390/s21206770

Academic Editor: Jeffrey M. Hausdorff

Received: 14 September 2021; Accepted: 3 October 2021; Published: 12 October 2021

1. Introduction
Falling activities, including being struck, dropping, and fainting, are among the primary causes of firefighter fatalities in China (Figure 1) [1]. In the United States in 2018, by contrast, overexertion and stress were the leading causes of firefighter deaths, according to a US National
Fire Protection Association (NFPA) report [2]. A major cause of this difference is that the
personal protective equipment (PPE) for fall detection still requires improvement to better
guarantee safety [3].
The current FDS used by firefighters in China is a standard personal-alert safety
system (PASS) device for detecting a firefighter’s immobility (see Figure 2). It produces a
high-volume sound if no motion is detected after a short period (typically 30 s). There are
concerns, however, that this delay may be a critical deciding factor between life and death
in a real emergency.
In addition, the lack of training and experience is also a major cause. According to the
2019 annual report from the International Association of Fire and Rescue Services (CTIF) [4],
among a total of 7,630,000 firefighters in China, only 130,000 are career firefighters, accounting for only 1.7%. The rest are all volunteer firefighters. However, the percentage of career
firefighters is approximately 33.2% in the USA. These young volunteer firefighters with
Sensors 2021, 21, 6770. https://doi.org/10.3390/s21206770
https://www.mdpi.com/journal/sensors
short training periods and little firefighting experience are often unable to master complex
firefighting skills, especially in dangerous situations, such as burning floors or unstable
building structures [3].
Figure 1. Comparison of the causes of firefighters’ fatalities in China (left) and in the USA (right),
2018 [1,2].
Figure 2. PASS device equipped by a firefighter in China.
In the majority of recorded fatalities on the fireground, the incident commander was
unaware of a fallen firefighter and, therefore, unable to instigate a rescue in time. The
smoke-filled environment also decreased firefighters’ ability to identify a peer’s injuries
and safety in a timely manner. In general, firefighter education in China has emphasized learning and gaining experience through actual firefighting tasks, at a high cost to life and safety [5]. These issues highlight the need for smart protective measures that detect firefighters' falling activities and ensure their safety during firefighting missions.
Several studies [6–8] investigated the application of context-aware systems (CASs)
and wearable sensors for FDSs. A CAS generally integrates a set of vision systems with
cameras and other sensors, such as microphones or vibration sensors, placed in a well-lit environment to monitor the user within the range of view [9]. Wearable sensors are
commonly used to analyze a human’s fall activities, based on the motion pattern [10]
and physiological status [11]. Even though FDSs have been widely deployed in the healthcare sector, especially for elderly people, these systems target slow-falling movements and are therefore not suitable for firefighters.
The current PASS device has the following deficiencies: (1) a long delay of approximately 30 s before raising the alert; (2) an audible alarm that is insufficient if a firefighter is far away from his peers or in a noisy environment; and (3) no means for the action commander to receive the alert soon enough to carry out a timely rescue. This study aims to develop a smart FDS, integrated with wearable sensors, that detects the fall activities of firefighters, especially in harsh environments, and alerts the action commander in time to organize a rescue.
The proposed FDS applied GA10 & CCC certified national personal-protective clothing
(PPC) with embedded motion sensors to gather firefighter moving-activity data. The key
innovative aspects of this study are as follows.
• Performance evaluation of firefighter fall detection based on motion sensors placed on different parts of the body (PPC), including the chest, elbows, wrists, thighs, and ankles.
• Construction of a highly realistic dataset of falling-related movements through collaboration with real firefighters.
• A novel fall-detection model, trained with a deep-learning approach, that can distinguish actual falls from fall-like events.
The rest of the paper is organized as follows: Section 2 presents a literature review of various FDS approaches, detailing their advantages and disadvantages; Section 3 describes the proposed smart wearable PPC prototype used for data collection and details the proposed FDS algorithms; Section 4 presents the evaluation of FDS models with various combinations of IMUs, followed by a discussion of the fall-detection performance for each activity and a comparison with existing works; finally, Section 5 concludes with a discussion of future work.
2. Related Works
Fall-detection methods can be categorized into two approaches: vision-based and
non-vision-based. In a vision-based approach, fall detection is modeled based on images or
videos obtained from different types of cameras [9]. Iazzi et al. [12] proposed a vision-based
fall detector by isolating the different activities using a simple threshold method. The
approach first applied background subtraction to images obtained from an RGB camera
to compute a human silhouette. It then classified the activities, such as lying, bending,
sitting, and standing, by comparing the percentage of a human silhouette on the ground
with a predefined threshold. However, they also indicated that the system was inadequate
if multiple people were in the field of view. Moreover, a study by [13] presented an active
vision system for fall detection, based on skeleton movement captured by a 3D camera.
Several works [14,15] also analyzed head-moving patterns, where large head movements
indicated a high possibility of falling.
Nevertheless, vision-based solutions have several intractable issues in firefighting
applications, such as poor performance in dark, low-light, or smoke-filled environments,
limited range of vision, and views obstructed by obstacles in the field [16]. Hence, some
studies have explored the potential use of non-vision-based approaches to reduce the
constraints of vision-based approaches. The advantages of such approaches include lower
costs in terms of algorithm computation and image processing, privacy protection, portability, and reduced sensitivity to the environment [17].
IMUs embedded in mobile devices, such as smartphones and smartwatches, are
commonly used by many researchers for fall detection, typically in healthcare areas for
the elderly [18–23]. These studies extract motion data from a common nine-degrees-of-freedom (DOF) IMU for classifying fall activities, assuming that the smartphone is
stored in the pants pocket or a smartwatch is worn on the wrist. Nonetheless, carrying
a smartphone into a fireground can affect the performance of firefighting activities and,
thus, is prohibited. Hence, this approach is inapplicable and not considered in this study.
Instead, a sensor-fusion technique that integrates diverse wearable sensors for fall detection
is a better alternative.
Lee et al. [24] added an IMU with an RGB camera to improve the fall-recognition
rate and reduce the false-detection rate of falls. The study utilized a robot equipped with
a camera that could move toward the fall subject for further verification of a fall, if the
IMU-based classifier first predicted a fall-like activity. On the other hand, Kwolek et al. [25]
integrated a Kinect sensor and an IMU for fall detection, reducing the false fall alerts, which
improved the overall fall-detection performance from 90% (using only depth information
from the Kinect sensor) to 98.33%.
Sensor-fusion techniques with integrated vision-based sensors are not useful for
improving the fall detection of firefighters in smoke-filled and harsh environments. In
addition, Chen et al. [26] developed a shoe integrated with a barometer and IMU for a
stair-based fall-risk detection system by measuring foot movements going upstairs or
downstairs. The study also revealed that the multisensory system performed better than a
single-sensor system in falling-risk evaluations.
Several studies have emphasized the use of PASS and PPC in fall detection for firefighters [27–30]. Van Thanh et al. [27] proposed a waist-worn device integrated
with a triaxial accelerometer and a carbon monoxide (CO) sensor to monitor falls in fire
environments. The CO sensor is specifically utilized for detecting falls caused by broken
air-support devices. They improved their system in [28] with a sensor-fusion approach
using a 9-DOF IMU, a CO sensor, and a barometer. However, the datasets they collected
are not publicly available.
Geng et al. [29] proposed a novel health-monitoring system with electrocardiogram
(ECG), electroencephalogram (EEG), and blood-pressure measurements to recognize the
motion events of firefighters by extracting features from the on-body radio-frequency (RF) channel. The average true classification rate was 88.69%.
Moreover, Blecha et al. [30] proposed functional wearable PPC for firefighters that
could monitor their physiological status (heart rate and temperature), detect firefighter
movements, and measure environmental information, such as the relative humidity and
concentration of toxic gases. In addition, they also designed a commander control unit as a
terminal to receive functional data from the PPC and alert the commander if any safety
risk to the firefighters was detected.
In summary, among existing fall-detection studies using wearable-type sensors, most
studies are targeted at detecting the falls of the elderly. Moreover, few to no public datasets
are available for the study and analysis of firefighter fall detection. Our work makes
important contributions that compensate for these shortcomings.
1. Building a dataset by collecting motion data of actual firefighters, including falls and fall-like activities, for academic research purposes;
2. Investigating the optimization of motion sensors for fall-activity classification, in terms of their quantity and placement on firefighter protective clothing; and
3. Presenting a fall-detection framework for firefighters, who often work in high-stress situations.
3. Materials and Methods
This section discusses using the proposed novel design of smart PPC for firefighters to
gather firefighter motion data. Next, the experimental setup for data collection is discussed,
followed by a detailed description of the proposed FDS.
3.1. Smart PPC Prototype
The placement of wearable sensors plays an essential role in recognizing falls with a
high accuracy rate [10]. The important motion data that contribute to fall-event detection
are associated with the moving patterns of the chest, elbow, and wrist from the upper
body, and the thigh and ankle from the lower part of the body. Hence, BNO055 IMUs [31],
consisting of a triaxial accelerometer, triaxial gyroscope, and triaxial magnetometer, are
integrated with wired connections on the back of the protective jacket (PJ) and protective
trousers (PT), as shown in Figure 3. The angular velocity ranges from ±125 deg/s to
±2000 deg/s with a low-pass filter bandwidth from 523 Hz to 12 Hz, while the acceleration
ranges from ±2 g to ±16 g with a low-pass filter bandwidth from 1 kHz to 8 Hz, and the
measurement range of the magnetometer is about ±4800 µT with a resolution of 0.3 µT. The maximum output rate of the 9-DOF fusion data is 100 Hz. However, because an overly fast transmission rate can reduce data-reception efficiency, both the sampling rate of the 9-DOF data and the wireless transmission rate were set to 15 Hz in this study after several optimization trials. An IMU was not placed on the shoulder because shoulder movement is
always associated with chest movement.
Figure 3. Placement of motion sensors that are mapped to the body parts where the motion data are
critical for fall-detection computation.
Two processing units are placed on the chest and waist for receiving and transmitting
IMU data from the PJ and PT, respectively. Each processing unit consists of a Seeeduino
XIAO micro-controller unit (MCU) with a 20 × 17.5 mm² footprint and a 3.3-V operating voltage [32], a TCA9548A 1-to-8 I2C multiplexer [33] for multisensor connections, a Bluetooth
low-energy (BLE) 4.2 module [34], and a 3.7-V 400-mAh lithium-ion battery [35], as illustrated in Figure 4. The IMU sensors are connected to a processing unit with wires soldered
onto the PJ and PT.
Table 1 summarizes the components of the sensing module with their respective
specifications. To reduce the risk of destroying the components (including IMUs, processing
units, and wires) during the data collection, foam boards were placed on top of all the
components for protection, and rubber tape was used to secure the wire connections
between the processing unit and IMUs, as depicted in Figure 5. Finally, the IMU data from
the PJ and PT are transmitted to a terminal via BLE 4.2 for further processing, as shown
in Figure 6.
Table 1. Components and their respective specifications in the sensing module.

Component              Specification
IMU (BNO055)           Triaxial accelerometer, triaxial gyroscope, triaxial magnetometer;
                       operating voltage: 3 V to 5 V
Seeeduino XIAO MCU     Operating voltage: 3.3 V/5 V; CPU: 40 MHz ARM Cortex-M0+;
                       flash memory: 256 KB; RAM: 32 KB; size: 20 × 17.5 × 3.5 mm; I2C: 1 pair
TCA9548A multiplexer   Operating voltage: 3 V to 5 V; I2C: 8 pairs
JDY-18 BLE             Operating voltage: 1.8 V to 3.6 V; BLE version: 4.2; frequency: 2.4 GHz;
                       size: 27 × 12.8 × 1.6 mm
Lithium-ion battery    Power supply: 3.7 V; capacity: 400 mAh
Figure 4. Processing unit that consists of an MCU, an I2C multiplexer, a BLE 4.2 module, and a
lithium-ion battery.
Figure 5. (a) Foam board is placed on the IMU for component protection and (b) rubber tape is used
to secure the wire connections between the processing unit and IMUs.
Figure 6. Overall design of the communication framework from the PJ and PT to a terminal via BLE 4.2 wireless transmission.
3.2. Dataset Collection
Yan et al. [36] state that falls may be categorized into four distinct types: basic forward, backward, and left and right lateral falls. However, a fall event is much more complicated during firefighting activities. Some specific activities, such as slipping, sliding,
and fainting, can also result in falls [37]. Several existing studies [36,38,39] demonstrate the
challenges of differentiating fall-like activities, such as sitting quickly, jumping onto a bed,
and lying down slowly, from an actual fall. However, these existing fall datasets are not
suitable for detecting the falls of firefighters because most of the recorded activities do not
closely simulate realistic falling events in a fireground, such as jumping onto a bed.
This study initiated a collaboration with firefighters from the Haishu District Fire
Brigade from Ningbo City, Zhejiang Province, China, to obtain realistic fall events by
firefighters, based on their experiences. Fourteen male firefighters (with one to three years
of firefighting experience, ages between 21 and 24 years old, with heights between 1.7 and
1.88 m) voluntarily participated in the data collection. Six of them were career firefighters
and the other eight were volunteer firefighters.
Six types of fall activities were collected, including a forward fall with the knees,
forward fall with the hands, left and right sides of inclined falls, backward fall, and a slow
forward fall with a crouch. Three other activities, including crouching, sitting, and walking
with a stoop, were also collected as fall-like activities. Each firefighter was requested to
put on the developed PJ and PT and simulate falls and fall-like activities, based on their
firefighting experience. The details of the falls and fall-like activities are illustrated in
Figure 7. The number of trials for each activity and total trials are summarized in Table 2.
Figure 7. Demonstration of a firefighter with the proposed PJ and PT performing different types of
falls and fall-like activities, including (a) walking to a mat before falling, (b) forward fall with the
knees, (c) forward fall with the hands, (d) left side of an inclined fall, (e) right side of an inclined fall,
(f) slow forward fall with a crouch first, (g) backward fall, (h) fall-like crouching, (i) fall-like sitting,
and (j) fall-like walking with a stoop.
Table 2. Details of activities recorded in the dataset.

Code   Type        Activity                               Trials for Each Subject   Total Trials
F1     Falls       forward falls using knees              5                         70
F2                 forward falls using hands              5                         70
F3                 inclined falls left                    4                         56
F4                 inclined falls right                   4                         56
F5                 slow forward falls with crouch first   3                         42
F6                 backward falls                         3                         42
FL1    Fall-like   crouch                                 4                         56
FL2                walk with stoop                        4                         56
FL3                sit                                    3                         42
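The trial totals in Table 2 follow directly from the 14 participants; a minimal sketch of the arithmetic (the dictionary simply transcribes the per-subject trial counts from the table):

```python
# Trials per subject for each activity code (from Table 2); 14 firefighters participated.
trials_per_subject = {"F1": 5, "F2": 5, "F3": 4, "F4": 4, "F5": 3, "F6": 3,
                      "FL1": 4, "FL2": 4, "FL3": 3}
num_subjects = 14

# Total trials per activity = trials per subject x number of subjects
totals = {code: n * num_subjects for code, n in trials_per_subject.items()}
print(totals["F1"], sum(totals.values()))  # 70 490
```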
According to Figure 8, IMU data from the PJ and PT are transmitted to a laptop via
BLE 4.2 wireless communication with a 15-Hz sampling rate. A fall action consists of three
phases: early fall, impact, and recovery [40]. As the aim of this study is to target the falling
event, the recorded data of the early falling phase are labeled as falls, whereas the rest are
labeled as non-falls.
Figure 8. Data collection, via BLE 4.2 wireless transmission to a laptop, of IMU data from the PJ
and PT.
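Since only the early-fall phase is labeled as a fall, the labeling step can be sketched as follows. This is our own illustration, not the authors' code; the function name and the phase-boundary arguments are hypothetical:

```python
def label_early_fall(timestamps, early_fall_start, impact_start):
    """Label samples 1 (fall) if they lie in the early-fall phase
    [early_fall_start, impact_start); everything else is non-fall (0)."""
    return [1 if early_fall_start <= t < impact_start else 0 for t in timestamps]

# 15-Hz timestamps over 2 s; assume early fall begins at 1.0 s and impact at 1.4 s
ts = [k / 15 for k in range(30)]
labels = label_early_fall(ts, 1.0, 1.4)
```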
3.3. Framework
The proposed wearable FDS is composed of three modules: (1) sensing, (2) preprocessing, and (3) classification, as illustrated in Figure 9.
3.3.1. Global Calibration of IMUs
Each IMU has a tri-axial coordinate system, as depicted in Figure 10, which can be used
to determine the position and orientation of an object. In fact, an FDS with multiple IMUs
placed on different parts of the PPC needs further calibration to synchronize and unify the
coordinate system as a single entity. To achieve a global calibration for IMUs on the PJ and
PT, two common reference vectors are required, as proposed by O’Donovan et al. [37].
Figure 9. Flowchart of the FDS algorithm, where A = triaxial accelerometer data, G = triaxial
gyroscope data, M = triaxial magnetometer data, Q = quaternion data, and E = Euler angles.
Figure 10. Local IMU coordinate system on PJ and PT.
The magnetic field vector (Vmag), which points to the north, is selected as one of the
common reference vectors, assuming no magnetic interference and that the magnetic field
in the vicinity of each IMU is the same. The other reference vector is the acceleration vector
(Vacc) in a quasi-static condition; it can be regarded as the gravity vector pointing to the
ground. The IMU local coordinates can thus be rotated to the north-east-up (NEU) earth
coordinate system (right-hand rule). Figure 11 presents the local and global coordinate
systems with the two common reference vectors Vmag and Vacc. For calibration, the subject initially stands still for 1 s (15 samples) while the average values of Vacc and Vmag are computed. The main purpose of the calibration is to convert the local coordinate system of each IMU into the unified NEU system. With the averaged Vacc and Vmag as reference vectors, the rotation matrix between the two coordinate systems can be computed, and the collected raw data of each IMU are then transformed into the NEU system.
Figure 11. Global calibration. (a) Local coordinate system and vectors of Vacc and Vmag and (b) NEU
coordinate system and vectors of Vacc and Vmag.
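Under the stated assumptions (no magnetic interference, quasi-static standstill), the rotation from each IMU's local frame to the shared NEU frame can be built from the two averaged reference vectors with a TRIAD-style construction. The sketch below is our illustration, not the authors' code, and it assumes the quasi-static accelerometer reading points up (opposite to gravity):

```python
import numpy as np

def neu_rotation(vacc, vmag):
    """Rotation matrix whose rows are the north, east, and up axes
    expressed in the IMU's local frame, built from the averaged
    reference vectors Vacc and Vmag; v_neu = R @ v_local."""
    u = vacc / np.linalg.norm(vacc)   # up: reaction to gravity in quasi-static state
    e = np.cross(u, vmag)             # east = up x magnetic-north
    e /= np.linalg.norm(e)
    n = np.cross(e, u)                # north completes the orthonormal triad
    return np.stack([n, e, u])

# Example: gravity reaction along +z, magnetic field along +x with a small dip
R = neu_rotation(np.array([0.0, 0.0, 9.81]), np.array([1.0, 0.0, 0.2]))
```

In this example the local frame is already NEU-aligned, so R is the identity; for an arbitrary IMU orientation, R rotates the raw readings into the unified frame.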
3.3.2. Data Pre-Processing
The Mahony attitude and heading reference system (AHRS) [41] is utilized to compute
the quaternion and Euler angles from the raw data received from the terminal. The rotation
matrix of each IMU node can be derived as follows:
q = w + xi + yj + zk,  (1)

R(q) = [ 2w^2 + 2x^2 − 1    2xy − 2zw          2xz + 2yw
         2xy + 2zw          2w^2 + 2y^2 − 1    2yz − 2xw
         2xz − 2yw          2yz + 2xw          2w^2 + 2z^2 − 1 ],  (2)

where q is a quaternion consisting of a real part (w) and imaginary parts (x, y, z) on the three imaginary axes (i, j, k).
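Equation (2) can be checked numerically; the following helper (our own illustrative function, using NumPy) converts a quaternion to its rotation matrix:

```python
import numpy as np

def quat_to_rot(w, x, y, z):
    """Rotation matrix R(q) of Equation (2) for a quaternion q = w + xi + yj + zk."""
    norm = np.sqrt(w*w + x*x + y*y + z*z)
    w, x, y, z = w / norm, x / norm, y / norm, z / norm  # enforce a unit quaternion
    return np.array([
        [2*w*w + 2*x*x - 1, 2*x*y - 2*z*w,     2*x*z + 2*y*w],
        [2*x*y + 2*z*w,     2*w*w + 2*y*y - 1, 2*y*z - 2*x*w],
        [2*x*z - 2*y*w,     2*y*z + 2*x*w,     2*w*w + 2*z*z - 1],
    ])

# The identity quaternion (w = 1) leaves vectors unchanged
assert np.allclose(quat_to_rot(1, 0, 0, 0), np.eye(3))
```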
Each IMU delivers 13 outputs: tri-axial acceleration (m/s²), tri-axial angular rate (deg/s), four-element quaternion data, and the derived roll, pitch, and yaw angles. To represent the variation of a firefighter's different movements over a defined period, features including the mean (µ) (Equation (3)), range (R) (Equation (4)), standard deviation (σ) (Equation (5)), and mean absolute deviation (MAD) (Equation (6)) are extracted from the 13 outputs, computed over 0.5 s of data each, with a window step of 0.1 s. As a result, each IMU generates 52 vectorized features (13 outputs × 4 statistics), and the total number of input features to the classifier is 468 (52 features × 9 IMUs).
µ = (1/N) Σ_k x[k],  (3)

R = max(x) − min(x),  (4)

σ = sqrt( (1/N) Σ_k (x[k] − µ)^2 ),  (5)

MAD = (1/N) Σ_k |x[k] − µ|.  (6)
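The four statistics of Equations (3)–(6) over sliding windows can be sketched as below. The helper is our illustration (its name and defaults are assumptions); window and step lengths follow the 0.5 s and 0.1 s values above at the 15-Hz sampling rate:

```python
import numpy as np

def window_features(x, fs=15, win_s=0.5, step_s=0.1):
    """Mean, range, standard deviation, and MAD (Equations (3)-(6))
    over sliding windows of a single IMU output channel."""
    win = max(1, int(round(win_s * fs)))    # samples per window (8 at 15 Hz)
    step = max(1, int(round(step_s * fs)))  # hop between windows (2 at 15 Hz)
    feats = []
    for start in range(0, len(x) - win + 1, step):
        seg = np.asarray(x[start:start + win], dtype=float)
        mu = seg.mean()
        feats.append([mu, seg.max() - seg.min(), seg.std(), np.abs(seg - mu).mean()])
    return np.array(feats)  # one row of 4 features per window

feats = window_features(np.arange(20.0))
```

Stacking these rows for all 13 outputs of all 9 IMUs yields the 468-dimensional feature vectors fed to the classifier.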
3.3.3. Recurrent Neural Network Classifier
Figure 12 illustrates the neural network architecture, which includes three long short-term memory (LSTM) layers with 128, 32, and 16 units, respectively, one dense
layer with an eight-unit rectified linear unit (ReLU) activation function (Equation (7)), and
a softmax activation function with two units (Equation (8)). This represents the probability
of non-fall and fall activities using one-hot encoding. The adaptive moment estimation
(Adam) algorithm with a learning rate of 0.01 is applied as the optimizer of the model, and
the sparse categorical cross-entropy algorithm, shown in Equation (9), is used as the loss
function in the model. The batch size is set to 10, which represents the user’s action in one
second. The dataset is divided into 80% for training and 20% for testing.
ReLU(x) = max(0, x),  (7)

p(y_i) = e^{y_i} / Σ_{j=1}^{n} e^{y_j},  (8)

loss(y_i) = −log p(y_i).  (9)
Figure 12. LSTM network architecture.
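The architecture above can be sketched in Keras. This is our reconstruction under stated assumptions: the paper does not name its framework, and the input sequence length is left variable because it is not stated explicitly; only the layer sizes, activations, optimizer, and loss follow the text:

```python
import tensorflow as tf

# Three stacked LSTM layers (128, 32, 16 units), an 8-unit ReLU dense layer,
# and a 2-unit softmax output for the non-fall/fall probabilities.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 468)),  # 468 features per time step
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```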
4. Results
First, the performance of the trained model with the proposed LSTM architecture is
presented in Figure 13 for every 10 epochs. Training was performed on an NVIDIA RTX 2060 GPU (6 GB) with an Intel i7-9700 CPU (3.0 GHz). The results indicate a slow increment in the accuracy rate after
40 training epochs, with the highest accuracy of 99.95% obtained after 100 training epochs,
using all the sensor data.
Meanwhile, this study evaluated 30 different combinations of sensor placements to further investigate the optimal placement and quantity of the sensors used for firefighter fall detection. Five positions were coded, representing
the placement of the IMU on the protective clothing, as listed in Table 3, including the
chest, elbows, wrists, thighs, and ankles. The performance of each combination’s model
is evaluated, including the trained results using accuracy and loss, and the overall test
performance using the following widely used metrics:
• AUC: area under the receiver operating characteristic (ROC) curve.
• Specificity (Sp): the ability to correctly predict negative (non-fall) samples.
• Sensitivity (Se): also called recall; the proportion of actual falls that are correctly detected.
• Accuracy (Ac): the proportion of correct predictions among all predictions, both positive and negative.
The equations for sensitivity, specificity, and accuracy are shown in Equations (10)–(12), respectively.

Se = TP / (TP + FN),  (10)

Sp = TN / (TN + FP),  (11)

Ac = (TP + TN) / (TP + TN + FP + FN).  (12)
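Equations (10)–(12) can be computed directly from predicted and true labels; a minimal sketch (our illustrative helper, with label 1 denoting a fall):

```python
import numpy as np

def fall_metrics(y_true, y_pred):
    """Sensitivity, specificity, and accuracy per Equations (10)-(12)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))  # true positives (falls caught)
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))  # true negatives
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))  # false alarms
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))  # missed falls
    se = tp / (tp + fn)
    sp = tn / (tn + fp)
    ac = (tp + tn) / (tp + tn + fp + fn)
    return se, sp, ac

se, sp, ac = fall_metrics([1, 1, 1, 0, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0, 1, 0])
# se = 2/3, sp = 4/5, ac = 3/4
```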
Figure 13. Training results of the LSTM model using different epochs.
Table 3. Codes for the IMU locations.

Placement   Code
Chest       C
Elbows      E
Wrists      W
Thighs      T
Ankles      A
Table 4 illustrates the performance of these 30 models. The training accuracy of the
models after 100 epochs is nearly identical and most reach over 99.90%. In general, the best
fall-detection performance improved gradually with the addition of more IMUs placed on
different sections of the cloth, with the highest Ac, Se, and Sp achieved at 94.10%, 92.25%,
and 94.59%, respectively, for all IMUs included as proposed.
It is important to note that the IMU placed on the chest plays an important role in
detecting the fall of the firefighter, as the combination of EWTA, where the chest part is
excluded, has the lowest Ac, Se, and Sp of 90.32%, 90.72%, and 90.21%, respectively. In
addition, the IMU combinations of CET, CA, and a single C also achieved Acs of over 92%
(Ses over 90% and Sps over 92%). This further emphasized that the placement of an IMU
on the chest is essential in detecting falls. In the group of two placements, combinations
that involved the chest achieved fairly high Acs of over 90% and Sps of over 90%, but Ses
as low as 86.94%. Similarly, with only one IMU placement, the chest also presents a much
higher efficiency than the others.
It is also interesting to note that the combinations of EA and ET, with IMUs placed
on the elbows (PJ) and either the ankles or thighs (PT), also have Acs of over 90%, outperforming the rest of the IMU combinations. Moreover, the fall-detection performance
based on PJ only (CEW) and PT only (TA) achieved Acs of 91.26% and 89.20%, respectively,
indicating that adding IMUs from PT (either the thighs or ankles) can improve the overall
fall-detection Ac by at least 2%.
In summary, the results indicated that the IMU combinations of CEWTA, CEWT, and
CET had the best performance among all metrics. Moreover, the chest position proves to
be the most important placement of the IMU in fall detection for firefighters.
Table 4. Performance of 30 IMU combinations.
IMU Quantity
Combination
AUC
Se
Sp
Ac
9
7
7
7
7
8
5
5
5
6
6
5
5
5
5
3
3
3
3
4
4
4
4
4
4
2
2
2
2
1
CEWTA
CEWT
CEWA
CETA
CWTA
EWTA
CEW
CEA
CWT
EWA
EWT
CWA
CET
ETA
WTA
CE
CW
CT
CA
TA
ET
EA
WT
WA
EW
E
W
T
A
C
0.97
0.98
0.95
0.95
0.95
0.94
0.94
0.95
0.98
0.93
0.96
0.96
0.97
0.96
0.93
0.96
0.95
0.95
0.94
0.92
0.91
0.95
0.91
0.90
0.94
0.91
0.84
0.88
0.89
0.96
92.25%
91.22%
89.04%
88.01%
90.35%
90.72%
88.72%
88.39%
88.39%
85.06%
89.14%
90.84%
91.61%
90.54%
90.54%
92.88%
90.96%
86.94%
90.23%
83.92%
85.34%
87.40%
85.99%
81.98%
83.01%
85.08%
71.99%
78.56%
73.97%
92.82%
94.59%
94.72%
94.25%
95.37%
94.21%
90.21%
91.94%
92.24%
92.24%
92.42%
91.42%
93.02%
94.06%
92.30%
92.30%
89.92%
93.49%
92.97%
93.97%
90.61%
92.19%
93.49%
84.76%
90.72%
90.75%
88.70%
80.65%
86.56%
92.44%
92.43%
94.10%
93.98%
93.15%
93.82%
93.38%
90.32%
91.26%
91.43%
91.43%
90.87%
90.94%
92.56%
93.55%
91.93%
91.93%
90.54%
92.96%
91.70%
93.18%
89.20%
90.75%
92.21%
85.02%
88.88%
89.14%
87.94%
78.83%
84.87%
88.55%
92.51%
Furthermore, the most efficient combinations of each quantity group were evaluated.
Figure 14 presents the results of the trained models in terms of accuracy and loss. The results
illustrate that the fewer the placements, the lower the training efficiency at small epoch counts, although the accuracy and loss were quite similar after training for 100 epochs.
Moreover, Figure 15 illustrates the ROC curves and AUC values of these five models.
The results show that all five models perform well and have similar ROC curves. To
further evaluate the performance of these models, the efficiencies of each collected activity
were compared.
Figure 14. Performance of the five trained models, evaluated with accuracy (left) and loss (right).
Figure 15. ROC curves and AUC values of the five models.
Table 5 presents the detailed performance of these models for each activity. To clarify,
the sensitivity values are zero for fall-like activities because no falling happens; hence, no
true falling labels appear. According to the average results of these five models, F1 and F6 have lower sensitivities than the other four falling activities. This is mainly because these two activities involve an action that reduces the impact before falling, while the others fall directly to the ground: in F1, the knees touch the ground first before the fall, and in F6 the firefighter sits on the ground first when simulating a backward fall.
Meanwhile, for the fall-like activities, the model is more likely to wrongly predict
FL2 (walking with a stoop) as falling. This is because the posture of the upper body is quite similar to that of a fall; hence, FL2 is more difficult to distinguish when fewer of the lower
body’s features are utilized. It is also important to notice that the specificities of the falls
are much higher than those of the fall-like activities, which illustrates that fall-like motions
indeed have some similarities with falling motions, while normal walking is very different
from falling.
In terms of per-model performance, all five models are sufficient for fall
detection (Se over 90% and Ac over 92%); however, the CA and C models are each insufficient
for one specific fall activity (F2 and F4, respectively). This illustrates that some falling
activities are difficult to detect with fewer IMUs and fewer features. Moreover, CEWTA shows
a better ability to distinguish falls from non-falls, because its specificity for each activity
is generally higher than that of the other models. This illustrates that the fall-detection
system can indeed be improved by using more IMUs in different positions. In general, the
IMU combinations CEWTA, CEWT, and CET performed best across the activities.
Van Thanh et al. devoted considerable effort to FDSs for firefighters. Their group
presented an acceleration-based algorithm in [27] and proposed four improved algorithms
in [28]. The four improved algorithms all adopt a double-check method to reduce false
detections; Algorithm 1 utilized all of the features, while the others used subsets of them.
Table 6 shows a performance comparison with their studies, as well as with some recent FDS
studies in other fields.
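The general double-check idea can be sketched as follows. This is a generic illustration of threshold-plus-confirmation fall detection, not the actual algorithms of [27,28]; the thresholds and the choice of a barometric height drop as the confirming signal are assumptions for the example.

```python
# Sketch of a generic double-check fall detector: a candidate fall flagged by an
# acceleration spike is confirmed only if a height drop is also observed.
# THRESH_ACC and THRESH_DROP are hypothetical values, not those used in [27,28].
THRESH_ACC = 2.5    # g, acceleration-magnitude spike threshold (first check)
THRESH_DROP = 0.5   # m, minimum barometric altitude drop (second check)

def detect_fall(acc_magnitude_g: float, altitude_drop_m: float) -> bool:
    candidate = acc_magnitude_g > THRESH_ACC    # impact-like spike detected
    confirmed = altitude_drop_m > THRESH_DROP   # body actually lost height
    return candidate and confirmed              # both checks must agree

print(detect_fall(3.1, 0.8))  # spike + drop: reported as a fall
print(detect_fall(3.1, 0.1))  # spike without drop: false alarm suppressed
```

The second check is what trades a few missed detections for fewer false alarms, which matches the sensitivity/specificity pattern discussed above.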
Table 5. Detection efficiency for each fall activity.

| Activity | CEWTA Se | CEWTA Sp | CEWTA Ac | CEWT Se | CEWT Sp | CEWT Ac | CET Se | CET Sp | CET Ac |
|----------|----------|----------|----------|---------|---------|---------|--------|--------|--------|
| F1 | 91.45% | 96.87% | 95.42% | 87.28% | 98.40% | 95.42% | 89.11% | 98.25% | 95.80% |
| F2 | 94.48% | 98.34% | 97.15% | 93.54% | 98.26% | 96.81% | 95.22% | 98.38% | 97.41% |
| F3 | 96.55% | 97.20% | 97.00% | 95.69% | 97.58% | 97.07% | 97.09% | 97.10% | 97.10% |
| F4 | 95.02% | 98.72% | 97.62% | 94.91% | 99.02% | 97.79% | 95.37% | 98.72% | 97.72% |
| F5 | 96.62% | 97.40% | 97.22% | 98.46% | 97.22% | 97.50% | 97.23% | 97.22% | 97.22% |
| F6 | 78.38% | 99.05% | 92.80% | 80.70% | 99.27% | 93.66% | 76.71% | 99.11% | 92.33% |
| FL1 | 0% | 90.31% | 90.31% | 0% | 90.06% | 90.06% | 0% | 89.38% | 89.38% |
| FL2 | 0% | 89.92% | 89.92% | 0% | 84.17% | 84.17% | 0% | 82.21% | 82.21% |
| FL3 | 0% | 87.79% | 87.79% | 0% | 89.87% | 89.87% | 0% | 87.60% | 87.60% |
| Total | 92.25% | 94.59% | 94.10% | 91.22% | 94.72% | 93.98% | 91.61% | 94.06% | 93.55% |

| Activity | CA Se | CA Sp | CA Ac | C Se | C Sp | C Ac | Average Se | Average Sp | Average Ac |
|----------|-------|-------|-------|------|------|------|------------|------------|------------|
| F1 | 91.25% | 94.31% | 93.49% | 90.84% | 95.31% | 94.11% | 89.99% | 96.63% | 94.85% |
| F2 | 81.84% | 95.89% | 91.58% | 97.66% | 95.81% | 96.38% | 92.55% | 97.34% | 95.87% |
| F3 | 93.64% | 97.44% | 96.27% | 99.68% | 95.80% | 97.00% | 96.53% | 97.02% | 96.89% |
| F4 | 94.61% | 98.08% | 94.07% | 84.14% | 96.61% | 92.90% | 92.81% | 98.23% | 96.02% |
| F5 | 98.46% | 96.32% | 96.80% | 96.61% | 97.04% | 96.94% | 97.48% | 97.04% | 97.14% |
| F6 | 78.38% | 97.60% | 91.79% | 88.55% | 97.88% | 95.06% | 80.54% | 98.58% | 93.13% |
| FL1 | 0% | 83.72% | 83.72% | 0% | 99.61% | 99.61% | 0% | 90.62% | 90.62% |
| FL2 | 0% | 86.88% | 86.88% | 0% | 80.88% | 80.88% | 0% | 84.81% | 84.81% |
| FL3 | 0% | 89.99% | 89.99% | 0% | 93.05% | 93.05% | 0% | 89.66% | 89.66% |
| Total | 90.23% | 93.97% | 93.18% | 92.82% | 92.43% | 92.51% | / | / | / |
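The per-activity metrics in Table 5 follow the standard confusion-matrix definitions; the sketch below illustrates them with invented counts (not the paper's data). It also reflects the convention noted above: when an activity contains no true falls, sensitivity is undefined and is reported as zero.

```python
# Sketch: sensitivity, specificity, and accuracy from confusion counts,
# matching the metric definitions behind Table 5. Counts are illustrative.
def se_sp_ac(tp: int, fn: int, tn: int, fp: int):
    se = tp / (tp + fn) if tp + fn else 0.0   # sensitivity: recall on fall windows
    sp = tn / (tn + fp) if tn + fp else 0.0   # specificity: recall on non-fall windows
    ac = (tp + tn) / (tp + fn + tn + fp)      # overall accuracy
    return se, sp, ac

se, sp, ac = se_sp_ac(tp=92, fn=8, tn=95, fp=5)
print(f"Se={se:.2%}  Sp={sp:.2%}  Ac={ac:.2%}")  # Se=92.00%  Sp=95.00%  Ac=93.50%

# A fall-like activity with no true falls: tp + fn == 0, so Se is reported as 0.
print(se_sp_ac(tp=0, fn=0, tn=90, fp=10)[0])  # 0.0
```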
Table 6. Comparison of fall-detection results between the proposed FDS and some previously developed FDSs, where SR, Se,
Sp, and Ac represent the sampling rate of the IMUs, sensitivity, specificity, and accuracy, respectively.

| Reference | Application | Methodology | Algorithm | SR | Se | Sp | Ac |
|-----------|-------------|-------------|-----------|----|----|----|----|
| Van Thanh et al. (2018) [27] | Firefighters | 1 3-DOF accelerometer and 1 barometer in the thigh pocket, and 1 CO sensor on the mask (one algorithm in [27], four improved algorithms in [28]) | Algorithm 1 | 100 Hz | 88.9% | 94.45% | 91.67% |
| Van Thanh et al. (2019) [28] | Firefighters | (as above) | Algorithm 1 | 100 Hz | 100% | 100% | 100% |
| | | | Algorithm 2 | 100 Hz | 100% | 94.44% | 95.83% |
| | | | Algorithm 3 | 100 Hz | 100% | 90.74% | 93.05% |
| | | | Algorithm 4 | 100 Hz | 100% | 91.67% | 93.75% |
| Shi et al. (2020) [42] | Elderly | 1 IMU on the waist | / | 100 Hz | 95.54% | 96.38% | 95.96% |
| AnkFall (2021) [43] | Elderly | 1 IMU on the ankle | / | 100 Hz | 76.8% | 92.8% | / |
| Kiprijanovska et al. (2020) [44] | Ordinary people | 2 IMUs in 2 smartwatches | / | 100 Hz | 90.6% | 86.2% | 88.9% |
| Proposed method | Firefighters | 9 9-DOF IMUs on the chest, wrists, elbows, thighs, and ankles | / | 15 Hz | 92.25% | 94.59% | 94.10% |
First, the proposed method used a low sampling rate (15 Hz), yet achieved sensitivity,
specificity, and accuracy comparable to or higher than those of the other
studies (100 Hz), which indicates a reduction in data-processing complexity and
power consumption. Compared with the results of the Van Thanh group, their improved
algorithms have higher sensitivities than the proposed method, while the one without the
double-check step is less efficient. Meanwhile, the algorithm presented in [27] also
indicated that fall detection with a single sensor performs worse than our
multisensory approach.
In comparison with the FDSs in other fields, the results illustrated that IMUs on
the ankles [43] and wrists [44] are less efficient than our proposed method in all metrics.
However, the study in [42] performed better than our method, which suggests that
a higher sampling rate (100 Hz) could further improve fall-detection accuracy. Moreover,
according to [42] and our results, the optimal placement of the IMU may be not only on
the chest but also elsewhere on the trunk. In general, the proposed approach delivers high
fall-detection performance by considering the motions of both the upper and lower parts of
the body, which increases the detection probability.
5. Conclusions
This paper proposed a novel wearable fall-detection system (FDS) for firefighters by
embedding motion sensors in the firefighting PPC. The study revealed that falls and fall-like activities can be distinguished with an accuracy of approximately
94% with all nine IMUs embedded. The study also revealed that an IMU placed on the
chest is critical for achieving the best fall-detection performance. Furthermore, the study
concluded that placing IMUs on only the chest, elbows, and thighs can also achieve
acceptable fall-detection performance with higher cost efficiency.
In this study, the simulated falling events were based on the experiences of, and feedback
collected from, firefighters. This preliminary study demonstrated the potential of wearable
embedded motion sensors for identifying the falling activities of firefighters. Furthermore,
the proposed study achieved results similar to those of existing studies, but with a lower
sampling rate, hence reducing the overall computation cost.
A detailed plan covering the related ethics and safety issues is currently in progress,
and more realistic falling events will be collected in the future, potentially during actual
firefighting rescue missions. Future studies include fall detection in actual firegrounds and
exploring alternative sensors to reduce false fall detections, such as utilizing firefighters'
physiological signals (heart rate, brainwave signals) and fireground environmental conditions (CO2
concentration level, etc.). The proposed smart wearable PPC can also be expanded to detect
the micro-activities of firefighters, which may further increase their safety. In addition, future
work will consider a new design of the wireless communication framework and infrastructure
that can meet the requirements of firefighting activities. Moreover, the sensor components
and circuits need further improvement in terms of heat resistance, water resistance, and
washability to withstand the harsh fireground environment.
Author Contributions: Conceptualization, X.C. and B.-G.L.; methodology, X.C., R.W. and B.-G.L.;
software, X.C.; validation, X.C. and R.W.; formal analysis, X.C., R.W., M.P. and B.-G.L.; investigation,
X.C.; resources, H.J. and B.-G.L.; data curation, X.C., R.W. and H.J.; writing—original draft preparation, X.C.; writing—review and editing, X.C., M.P. and B.-G.L.; visualization, X.C.; supervision,
M.P. and B.-G.L.; project administration, B.-G.L.; funding acquisition, W.-Y.C. and B.-G.L. All authors
have read and agreed to the published version of the manuscript.
Funding: This research was supported by the Zhejiang Provincial Natural Science Foundation
of China under Grant No. LQ21F020024. This research was also funded by a National Research
Foundation of Korea (NRF) grant (2019R1A2C1089139) funded by the Korean Government (MSIT).
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. China Fire Protection Yearbook; Yunnan People's Publishing House: Kunming, China, 2018.
2. Fahy, R.F.; Molis, J.L. Firefighter Fatalities in the US-2018. Available online: https://www.nfpa.org/-/media/Files/News-andResearch/Fire-statistics-and-reports/Emergency-responders/2019FFF.ashx (accessed on 19 May 2021).
3. Sun, Z. Research on safety safeguard measures for fire fighting and rescue. Fire Daily 2001, 3, 52–58. (In Chinese)
4. Brushlinsky, N.; Ahrens, M.; Sokolov, S.; Wagner, P. World Fire Statistics 2021. Available online: https://ctif.org/sites/default/files/2021-06/CTIF_Report26_0.pdf (accessed on 5 August 2021).
5. Fan, M.; Yang, Q.; Feng, S.; Zhao, C.; Pu, J. Research on Casualties of Chinese Firefighters in Various Firefighting and Rescue Tasks. Ind. Saf. Environ. Prot. 2015.
6. Zhu, L.; Zhou, P.; Pan, A.; Guo, J.; Sun, W.; Wang, L.; Chen, X.; Liu, Z. A Survey of Fall Detection Algorithm for Elderly Health Monitoring. In Proceedings of the 2015 IEEE Fifth International Conference on Big Data and Cloud Computing, Dalian, China, 26–28 August 2015; pp. 270–274. [CrossRef]
7. Mubashir, M.; Shao, L.; Seed, L. A Survey on Fall Detection: Principles and Approaches. Neurocomputing 2013, 100, 144–152. [CrossRef]
8. Noury, N.; Fleury, A.; Rumeau, P.; Bourke, A.; ÓLaighin, G.; Rialle, V.; Lundy, J.E. Fall detection – Principles and Methods. In Proceedings of the 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 1663–1666. [CrossRef]
9. Casilari, E.; Santoyo-Ramon, J.A.; Cano-Garcia, J.M. Analysis of Public Datasets for Wearable Fall Detection Systems. Sensors 2017, 17, 1513. [CrossRef]
10. Li, C.; Teng, G.; Zhang, Y. A survey of fall detection model based on wearable sensor. In Proceedings of the 2019 12th International Conference on Human System Interaction (HSI), Richmond, VA, USA, 25–27 June 2019; pp. 181–186. [CrossRef]
11. Ramachandran, A.; Ramesh, A.; Karuppiah, A. Evaluation of Feature Engineering on Wearable Sensor-based Fall Detection. In Proceedings of the 2020 International Conference on Information Networking (ICOIN), Barcelona, Spain, 7–10 January 2020; pp. 110–114. [CrossRef]
12. Iazzi, A.; Rziza, M.; Oulad Haj Thami, R. Fall Detection System-Based Posture-Recognition for Indoor Environments. J. Imaging 2021, 7, 42. [CrossRef]
13. Diraco, G.; Leone, A.; Siciliano, P. An active vision system for fall detection and posture recognition in elderly healthcare. In Proceedings of the 2010 Design, Automation Test in Europe Conference Exhibition (DATE 2010), Dresden, Germany, 8–12 March 2010; pp. 1536–1541. [CrossRef]
14. Rougier, C.; Meunier, J. Demo: Fall detection using 3D head trajectory extracted from a single camera video sequence. J. Telemed. Telecare 2005, 11, 7–9.
15. Jansen, B.; Deklerck, R. Context aware inactivity recognition for visual fall detection. In Proceedings of the 2006 Pervasive Health Conference and Workshops, Innsbruck, Austria, 29 November–1 December 2006; pp. 1–4. [CrossRef]
16. Lin, C.L.; Chiu, W.C.; Chu, T.C.; Ho, Y.H.; Chen, F.H.; Hsu, C.C.; Hsieh, P.H.; Chen, C.H.; Lin, C.C.K.; Sung, P.S.; et al. Innovative Head-Mounted System Based on Inertial Sensors and Magnetometer for Detecting Falling Movements. Sensors 2020, 20, 5774. [CrossRef]
17. Waheed, M.; Afzal, H.; Mehmood, K. NT-FDS—A Noise Tolerant Fall Detection System Using Deep Learning on Wearable Devices. Sensors 2021, 21, 2006. [CrossRef]
18. Vavoulas, G.; Pediaditis, M.; Spanakis, E.G.; Tsiknakis, M. The MobiFall dataset: An initial evaluation of fall detection algorithms using smartphones. In Proceedings of the 13th IEEE International Conference on BioInformatics and BioEngineering, Chania, Greece, 10–13 November 2013; pp. 1–4. [CrossRef]
19. Medrano, C.; Igual, R.; Plaza, I.; Castro, M. Detecting Falls as Novelties in Acceleration Patterns Acquired with Smartphones. PLoS ONE 2014, 9, e94811. [CrossRef]
20. Wertner, A.; Czech, P.; Pammer, V. An Open Labelled Dataset for Mobile Phone Sensing Based Fall Detection. In Proceedings of the ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), Gent, Belgium, 22–24 July 2015. [CrossRef]
21. Vavoulas, G.; Chatzaki, C.; Malliotakis, T.; Pediaditis, M.; Tsiknakis, M. The MobiAct Dataset: Recognition of Activities of Daily Living using Smartphones. In Proceedings of the International Conference on Information and Communication Technologies for Ageing Well and e-Health–ICT4AWE (ICT4AGEINGWELL 2016), Rome, Italy, 21–22 April 2016; pp. 143–151. [CrossRef]
22. Micucci, D.; Mobilio, M.; Napoletano, P. UniMiB SHAR: A Dataset for Human Activity Recognition Using Acceleration Data from Smartphones. Appl. Sci. 2017, 7, 1101. [CrossRef]
23. Martinez-Villasenor, L.; Ponce, H.; Brieva, J.; Moya-Albor, E.; Nunez-Martínez, J.; Penafort-Asturiano, C. UP-Fall Detection Dataset: A Multimodal Approach. Sensors 2019, 19, 1988. [CrossRef] [PubMed]
24. Lee, D.W.; Jun, K.; Naheem, K.; Kim, M.S. Deep Neural Network–Based Double-Check Method for Fall Detection Using IMU-L Sensor and RGB Camera Data. IEEE Access 2021, 9, 48064–48079. [CrossRef]
25. Kwolek, B.; Kepski, M. Improving fall detection by the use of depth sensor and accelerometer. Neurocomputing 2015, 168, 637–645. [CrossRef]
26. Chen, X.; Jiang, S.; Lo, B. Subject-Independent Slow Fall Detection with Wearable Sensors via Deep Learning. In Proceedings of the 2020 IEEE SENSORS, Online, 25–28 October 2020; pp. 1–4. [CrossRef]
27. Van Thanh, P.; Nguyen, T.; Nga, H.; Thi, L.; Ha, T.; Lam, D.; Chinh, N.; Tran, D.T. Development of a Real-time Supported System for Firefighters in Emergency Cases. In Proceedings of the International Conference on the Development of Biomedical Engineering in Vietnam, Ho Chi Minh City, Vietnam, 27–29 June 2017.
28. Van Thanh, P.; Le, Q.B.; Nguyen, D.A.; Dang, N.D.; Huynh, H.T.; Tran, D.T. Multi-Sensor Data Fusion in A Real-Time Support System for On-Duty Firefighters. Sensors 2019, 19, 4746. [CrossRef]
29. Geng, Y.; Chen, J.; Fu, R.; Bao, G.; Pahlavan, K. Enlighten Wearable Physiological Monitoring Systems: On-Body RF Characteristics Based Human Motion Classification Using a Support Vector Machine. IEEE Trans. Mob. Comput. 2016, 15, 656–671. [CrossRef]
30. Blecha, T.; Soukup, R.; Kaspar, P.; Hamacek, A.; Reboun, J. Smart firefighter protective suit - functional blocks and technologies. In Proceedings of the 2018 IEEE International Conference on Semiconductor Electronics (ICSE), Kuala Lumpur, Malaysia, 15–17 August 2018; p. C4. [CrossRef]
31. BNO055 Inertial Measurement Unit. Available online: https://item.taobao.com/item.htm?spm=a230r.1.14.16.282a69630O3V2r&id=541798409353&ns=1&abbucket=5#detail (accessed on 10 May 2021).
32. Seeeduino XIAO MCU. Available online: https://detail.tmall.com/item.htm?id=612336208350&spm=a1z09.2.0.0.48d12e8d1z1cxA&_u=ajtqea1c090 (accessed on 3 June 2021).
33. TCA9548A I2C Multiplexer. Available online: https://detail.tmall.com/item.htm?id=555889112029&spm=a1z09.2.0.0.48d12e8d1z1cxA&_u=ajtqea191ec (accessed on 19 May 2021).
34. JDY-18 Bluetooth Low Energy 4.2 Module. Available online: https://detail.tmall.com/item.htm?id=561783372873&spm=a1z09.2.0.0.48d12e8d1z1cxA&_u=ajtqea15f8b (accessed on 1 June 2021).
35. 3.7 V 400 mAh Lithium-Ion Battery. Available online: https://item.taobao.com/item.htm?id=619553965700&ali_refid=a3_430008_1006:1102265936:N:%2BblvRi4iO%2FgjtUw1Rz5DMnH2RFqSzBpj:1cecdd7aee090b757014f8c1916c98e1&ali_trackid=1_1cecdd7aee090b757014f8c1916c98e1&spm=a230r.1.0.0 (accessed on 22 May 2021).
36. Yan, Y.; Ou, Y. Accurate fall detection by nine-axis IMU sensor. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), Macao, China, 5–8 December 2017; pp. 854–859. [CrossRef]
37. O'Donovan, K.J.; Kamnik, R.; O'Keeffe, D.T.; Lyons, G.M. An inertial and magnetic sensor based technique for joint angle measurement. J. Biomech. 2007, 40, 2604–2611. [CrossRef]
38. Li, Q.; Stankovic, J.A.; Hanson, M.A.; Barth, A.T.; Lach, J.; Zhou, G. Accurate, Fast Fall Detection Using Gyroscopes and Accelerometer-Derived Posture Information. In Proceedings of the 2009 Sixth International Workshop on Wearable and Implantable Body Sensor Networks, Berkeley, CA, USA, 3–5 June 2009; pp. 138–143. [CrossRef]
39. Wu, F.; Zhao, H.; Zhao, Y.; Zhong, H. Development of a Wearable-Sensor-Based Fall Detection System. Int. J. Telemed. Appl. 2015, 2015, 576364. [CrossRef]
40. Ahn, S.; Kim, J.; Koo, B.; Kim, Y. Evaluation of Inertial Sensor-Based Pre-Impact Fall Detection Algorithms Using Public Dataset. Sensors 2019, 19, 774. [CrossRef]
41. Mahony, R.; Hamel, T.; Pflimlin, J.M. Nonlinear Complementary Filters on the Special Orthogonal Group. IEEE Trans. Autom. Control 2008, 53, 1203–1218. [CrossRef]
42. Shi, J.; Chen, D.; Wang, M. Pre-Impact Fall Detection with CNN-Based Class Activation Mapping Method. Sensors 2020, 20, 4750. [CrossRef]
43. Luna-Perejon, F.; Munoz-Saavedra, L.; Civit-Masot, J.; Civit, A.; Dominguez-Morales, M. AnkFall—Falls, Falling Risks and Daily-Life Activities Dataset with an Ankle-Placed Accelerometer and Training Using Recurrent Neural Networks. Sensors 2021, 21, 1889. [CrossRef] [PubMed]
44. Kiprijanovska, I.; Gjoreski, H.; Gams, M. Detection of Gait Abnormalities for Fall Risk Assessment Using Wrist-Worn Inertial Sensors and Deep Learning. Sensors 2020, 20, 5373. [CrossRef] [PubMed]