Article

A Method for Autonomous Multi-Motion Modes Recognition and Navigation Optimization for Indoor Pedestrian

1 College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
2 Navigation Research Center, School of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
3 School of Railway Transportation, Shanghai Institute of Technology, Shanghai 201418, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(13), 5022; https://doi.org/10.3390/s22135022
Submission received: 1 June 2022 / Revised: 22 June 2022 / Accepted: 1 July 2022 / Published: 3 July 2022

Abstract

The indoor navigation method based on a wearable foot-mounted inertial measurement unit and the zero-velocity update principle shows great application prospects. Traditional navigation methods mainly support two-dimensional, steady motion modes such as walking; special tasks such as disaster relief and medical search and rescue are usually accompanied not only by normal walking but also by running, going upstairs, going downstairs and other motion modes, which greatly degrade the dynamic performance of the traditional zero-velocity update algorithm. Based on a wearable multi-node inertial sensor network, this paper presents a multi-motion mode recognition method for indoor pedestrians based on gait segmentation and a long short-term memory (LSTM) neural network, which improves the accuracy of multi-motion mode recognition. Because the effective zero-velocity interval is short in fast motion modes such as running, different zero-velocity detection algorithms are designed, together with an integrated navigation method based on the change of waist/foot headings. The experimental results show that the overall recognition rate of the proposed method is 96.77% and the navigation error is 1.26% of the total distance, indicating good application prospects.

1. Introduction

In recent years, pedestrian navigation technology has received increasing attention in fields such as disaster relief and rescue, medical search and rescue, public safety, fitness, and counterterrorism. Traditional pedestrian navigation and positioning methods mainly rely on the global navigation satellite system (GNSS) [1]. In real life, however, pedestrians are not always in environments with good GNSS signals. When pedestrians are in buildings, on viaducts, or in forests with poor GNSS reception, or in environments without satellite signals such as underground parking lots, indoor spaces, and tunnels, GNSS cannot provide accurate navigation and positioning. In addition, indoor pedestrian motion is not limited to walking; running, going upstairs, and going downstairs are also common. Especially in special applications such as rescue, the command and dispatch center needs to monitor the movement status of operators in real time to perceive their motion modes and trajectories, so that decision makers can adjust task strategies promptly according to the operators' conditions and positions. Therefore, there is an urgent need for research on autonomous recognition and navigation optimization for the multiple motion modes of indoor pedestrians in GNSS-denied environments, to meet the needs of practical applications.
At present, indoor pedestrian navigation technology mainly includes active and passive navigation algorithms. Active algorithms mainly fuse multi-source information such as ultra-wide band (UWB) [2], Wi-Fi [3,4], visual sensors [5], Bluetooth, ZigBee, and radio frequency identification (RFID) for positioning. The navigation and positioning errors of these methods do not accumulate over time, and their accuracy is high. A passive navigation algorithm is a pedestrian navigation method that uses only an IMU. This kind of method needs neither reference facilities set up in advance nor signals sent to the outside world; it only requires fixing the IMU on the foot, leg, waist, or other parts of the human body and collecting IMU data for navigation and positioning. The IMU has the advantages of strong autonomy, convenient installation, and low cost. However, IMU accuracy is limited, and the navigation error diverges with time.
There are two common IMU-based passive pedestrian navigation methods. The first is pedestrian dead reckoning (PDR), which calculates the step length and heading of a pedestrian from acceleration and angular velocity and thereby computes the pedestrian's position. The second is the strapdown inertial navigation algorithm based on zero-velocity update (ZUPT). Much research has been carried out on PDR-based pedestrian navigation. In 2019, Xu, L. [6] proposed a PDR navigation algorithm for handheld mobile phones, which improves navigation accuracy through a zero-angular-velocity heading correction algorithm and a lateral velocity constraint algorithm. In 2020, Ding, Y. [7] proposed a navigation algorithm based on three IMU nodes on the legs and waist, and built an inertial map on the basis of PDR for secondary correction. However, the PDR algorithm needs to estimate accurate step length and heading to provide accurate navigation and positioning services for pedestrians.
Many researchers have therefore studied pedestrian navigation algorithms based on ZUPT. Eric Foxlin [8] first studied foot-mounted pedestrian navigation based on ZUPT, which can effectively suppress the error divergence of the inertial navigation system. In 2016, Tian, X. et al. [9] established the functional relationship between the zero-velocity detection threshold and step frequency, allowing accurate recognition of the zero-velocity interval of walking at different step frequencies. In 2020, Jesus D. Ceron et al. [10] proposed a foot-mounted navigation and positioning algorithm for walking, running and jogging; the positioning accuracy of the algorithm using complementary filtering is 8.8%. In 2021, Li, W. [11] studied a ZUPT navigation algorithm based on a biped maximum-distance inequality constraint, applicable to walking, running and their mixture; the pedestrian's position and attitude are calculated from acceleration, angular velocity, magnetic intensity, air pressure and other data, and the navigation parameters are corrected using the zero-velocity information of the foot during walking. In IMU-based passive pedestrian navigation, heading correction has always been a problem to be solved. In 2015, Xu, Y. et al. [12] studied a pedestrian navigation method based on dual IMUs on the shoulder and foot, which corrected the heading of the navigation system through the heading difference between shoulder and foot. In 2015, Min, L. et al. [13] studied a navigation algorithm in which the waist heading and magnetic data assist the feet. In 2018, Wang, H. et al. [14] studied the recognition of straight-line and turning motion, extracted the true heading from the accelerometer and magnetometer, and reduced the heading deviation in straight-line motion. In 2019, Shi, L. F. et al. [15] proposed an adaptive IMU error estimation method and a heading correction method using geomagnetic correction.
Even with a low-precision IMU, a certain positioning accuracy can be maintained, and the ZUPT method proposed in that paper is suitable for running. However, because of the many sources of magnetic interference indoors, the effect of using magnetism to correct indoor pedestrian navigation heading is greatly reduced. In 2019, Niu, X. et al. [16] proposed an equality-constraint method based on the pedestrian track, which constrains navigation information through the biped distance, but it only studied the walking state. Indoor pedestrian navigation research based on ZUPT has mainly addressed two-dimensional navigation. Researchers have also investigated different installation positions of the IMU, but there are few studies on navigation in common mixed motion modes, and some algorithms use geomagnetic data for heading correction, which is susceptible to interference.
Indoor pedestrians have many common modes of movement, and accurate detection of the motion mode is an important basis for improving indoor navigation accuracy, since different motion modes require different zero-velocity detection algorithms. Pedestrian motion mode recognition methods mainly include the traditional threshold method and machine learning methods. The parameters of the traditional threshold method are fixed, yet even in the same motion mode the motion parameters of pedestrians are not exactly the same, so the threshold method [17] has insufficient adaptability. In machine learning classification, a model is trained on the data eigenvalues of different motion modes to recognize pedestrian motion modes. Common machine learning methods for pedestrian mode recognition include the support vector machine (SVM) [18,19], k-nearest neighbor (KNN), artificial neural networks (ANN) [20], and decision tree (DT) [21]. The conventional SVM algorithm has high accuracy for binary classification, but for multi-class problems its computing speed and accuracy decrease; the KNN algorithm requires the K value to be set in advance and the sample classes to be balanced; ANN training requires a large number of parameters, takes a long time to learn, and may even fail to converge; the DT algorithm is prone to overfitting. In addition, different mounting positions affect the recognition of different motion modes differently. Since 2015, the German Aerospace Center (DLR) has studied the recognition of running, climbing, lying and other motion modes with multiple IMUs [22], and found that an inertial measurement unit (IMU) worn on the leg performs well in gait segmentation. In 2016, Liu, Y. et al. [17] fixed the IMU at the waist and studied a classification method based on a time-domain eigenvalue threshold, which can recognize 8 daily motion modes such as static, walking, running, and jumping. In 2017, Wagstaff, B. et al. [18] studied a motion classification method based on a single foot-mounted IMU, which recognized six motion modes (walking, jogging, running, sprinting, crouch-walking and climbing stairs) with SVM, and studied the navigation and positioning error algorithm for walking and running. In addition, the literature [19,20,21,23] has also studied motion mode recognition algorithms based on handheld IMUs and obtained useful conclusions. Since the threshold in the threshold method is fixed while machine learning methods are more adaptable, a machine learning method is selected here to recognize pedestrian motion modes. Pedestrian motion has time-series characteristics, but general machine learning methods are not sensitive to them, so when a machine learning algorithm is used to recognize motion modes, appropriate motion eigenvalues must be selected.
To meet the autonomous navigation needs of special tasks such as emergency rescue, disaster relief, and medical search and rescue, a multi-motion-mode pedestrian navigation method based on multiple inertial sensors is proposed for indoor operators with multiple motion modes. An inertial sensor network based on four nodes on the waist, legs, and right foot is designed. According to the characteristics of the four common indoor motion modes (walking, running, going upstairs, and going downstairs), feature values of the legs are extracted to segment each step of the movement; appropriate eigenvalues are selected for each step, and the four motion modes are recognized by a long short-term memory (LSTM) network. Because the zero-velocity interval in running is short, a single zero-velocity detection method cannot serve all four motion modes, so they are divided into two categories, with walking, going upstairs, and going downstairs as one category and running as the other; each category uses its own zero-velocity detection algorithm. For passive navigation, the heading of the foot inertial navigation system based on ZUPT still diverges. Between the beginning and end of a step, the change of the waist heading and the change of the foot heading should both be 0°; the difference between them is therefore constructed as a virtual observation to assist in correcting the pedestrian's heading. The experimental results show that the method proposed in this paper can effectively recognize normal walking, running, going upstairs, and going downstairs, and that the proposed navigation optimization method based on waist/foot angular velocity for correcting heading errors can meet the autonomous navigation needs of special tasks.
The remainder of the paper is organized as follows: Section 2 introduces the general framework of the algorithm; Section 3 introduces the autonomous multi-motion mode recognition method for indoor pedestrians; Section 4 describes the ZUPT/waist/foot navigation optimization algorithm model; Section 5 systematically analyzes the experimental results; Section 6 presents a discussion of the method.

2. Navigation Scheme Based on Multi-Node Inertial Sensor Network

The traditional motion mode recognition system uses a fixed threshold to recognize motion modes, and the recognition rate is high when the motion velocities are similar. However, when the pedestrian's motion changes, missed detections and false detections occur. In addition, the traditional indoor ZUPT algorithm can only correct the position and velocity of pedestrians; without suitable observations to correct the heading, the heading diverges over time, leading to excessive position error. IMUs are generally installed on the foot, leg, waist, shoulder, and other parts of the body. Because the motion characteristics of these body parts differ, IMUs installed at different positions have different advantages. Therefore, according to the characteristics of different body parts and motion modes, the advantages of IMUs at different positions are fully utilized to overcome the shortcomings of the traditional fixed-threshold method for recognizing pedestrian motion modes, as well as the lack of a heading observation for indoor pedestrians. A navigation implementation scheme based on a four-node inertial sensor network is proposed; the designed framework is shown in Figure 1.
The four nodes of the inertial sensor network are arranged on the right foot, left leg, right leg, and waist, respectively. The network is divided into two sub-inertial systems: the two IMUs installed on the legs form sub-inertial system 1, and the two IMUs installed on the right foot and waist form sub-inertial system 2. In Figure 1, IMU1, IMU2, IMU3, and IMU4 represent the inertial sensors installed on the left leg, right leg, right foot, and waist, respectively; $f_1, f_2, f_3, f_4$ and $\omega_1, \omega_2, \omega_3, \omega_4$ are their accelerations and angular velocities. The x, y, and z axes of the sensor body coordinate system point right, forward, and up, satisfying the right-hand rule. The navigation coordinate system is east, north, and up (ENU).
The algorithm proposed in this paper proceeds as follows: First, the Euclidean distance is used to distinguish static and dynamic modes; for dynamic modes, the inertial data of the legs are used for gait segmentation, and the motion mode of each step is recognized by the LSTM algorithm from its motion features; then, according to the motion mode, the zero-velocity interval of the foot is adaptively detected; finally, the pedestrian navigation system is corrected using ZUPT and the difference between the waist and foot heading changes in each step.

3. Autonomous Method of Pedestrian Indoor Multi-Motion Modes Recognition

3.1. Definition of Multi-Motion Modes

Indoor multi-motion modes of pedestrians are defined as shown in Figure 2. (a) Static: stand upright and perpendicular to the ground, keep your feet close to the ground and do not shake your body; (b) Walking: the body moves forward at a low speed; (c) Running: the body moves forward at a fast speed; (d) Going upstairs: walk up the stairs; (e) Going downstairs: walk down the stairs.

3.2. An Autonomous Method for Multi-Motion Modes Recognition

For motion mode recognition, traditional methods apply a sliding window to each data point. When there are many motion modes to be recognized, the recognition accuracy is insufficient.
For the four motion modes studied in this paper (walking, running, going upstairs, and going downstairs), it is necessary to exploit the characteristics of the different IMU installation positions to realize recognition. During movement, the legs move with strong regularity and the attitude angles change relatively smoothly, so leg angular velocity can be used for motion mode recognition. The recognition method first roughly divides pedestrian motion into static and dynamic, and then subdivides the dynamic state into the four modes. Since running is faster than walking, going upstairs, and going downstairs, and the zero-velocity interval of the foot in running is short, the motion modes are divided into two categories: walking, going upstairs, and going downstairs as steady motion, and running as non-steady motion. The two categories adopt different zero-velocity detection methods. The overall scheme of the mode recognition algorithm is shown in Figure 3.
There are two data selection methods for motion mode recognition, the first is based on each data point, and the second is based on the data within each step. The advantage of the first recognition method is that it is simple and fast, but the eigenvalues extracted from each moment may appear in several motion modes or several states have similar eigenvalues, so this method has too many limitations. The second recognition method is based on the inertial sensor information characteristics of various gaits. Firstly, each step in the moving process is accurately segmented to provide the basis for the eigenvalues of each step, and then the threshold method or machine learning method is used to classify the motion modes. This paper adopts the second method, which is based on gait segmentation for pedestrian motion mode recognition.

3.2.1. Data Preprocessing Algorithm

(1)
Projection of gravity component
Due to the installation method of the IMU, the x, y, and z axes of the IMU body coordinate system do not coincide with the east, north, and up directions of the navigation coordinate system, so gravity has components on all three axes. Before motion mode recognition, the gravity components are deducted from the three axes, and the resulting horizontal and vertical accelerations better reflect the motion characteristics of pedestrians. The acceleration measured by the IMU is denoted as $a = (f_x, f_y, f_z)$; the mean of the static three-axis acceleration is denoted as $\bar{a} = (\bar{f}_x, \bar{f}_y, \bar{f}_z)$; and the linear acceleration with the gravity component removed is denoted as $a_l$:
$$a_l = a - \bar{a} \quad (1)$$
Decompose $a_l$ into vertical and horizontal components. The vertical projection of the linear acceleration $a_l$ onto the reference gravity vector $\bar{a}$ is denoted as $a_v$:
$$a_v = \frac{a_l \cdot \bar{a}}{\bar{a} \cdot \bar{a}}\,\bar{a} \quad (2)$$
The horizontal projection of the linear acceleration $a_l$ with respect to the reference gravity vector $\bar{a}$ is denoted by $a_h$ and is obtained by vector subtraction; it represents the component of acceleration in the horizontal plane:
$$a_h = a_l - a_v \quad (3)$$
The leg acceleration data with the gravity component removed is denoted as $a = (a_{vx}, a_{vy}, a_h)$.
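Equations (1)-(3) can be sketched as follows, assuming the static-interval mean acceleration approximates the gravity vector in the body frame (the function and variable names are hypothetical, not from the paper):

```python
import numpy as np

def remove_gravity(a, a_static_mean):
    """Split a raw body-frame acceleration sample into linear,
    vertical, and horizontal components.

    a: (3,) raw acceleration sample
    a_static_mean: (3,) mean acceleration over a static interval,
                   taken as the gravity vector in the body frame
    """
    g = a_static_mean
    a_l = a - g                                # linear acceleration, Eq. (1)
    a_v = (np.dot(a_l, g) / np.dot(g, g)) * g  # vertical projection, Eq. (2)
    a_h = a_l - a_v                            # horizontal component, Eq. (3)
    return a_l, a_v, a_h
```

For example, with gravity along z and a forward push of 1 m/s², the vertical component isolates the extra 1 m/s² along z and the horizontal component isolates the push.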
(2)
Filter algorithm
In the process of pedestrian movement, the leg changes are periodic. The periodicity is mainly reflected in the acceleration and angular velocity of IMU. The raw data collected by IMU contains a lot of noise and errors, mainly including two types of sources: one is the noise of the IMU itself, and the other is the noise caused by the unstable movement of the experimenter. For angular velocity, it needs to be smoothed and denoised before gait segmentation. In this paper, a Savitzky–Golay (SG) filter is used for data preprocessing, which is a polynomial smoothing algorithm based on the local polynomial least squares principle in the time domain. The SG filter has the advantage of removing high-frequency noise while keeping the shape and width of the original signal unchanged. The basic principle of the SG filter is as follows:
The window length is set as $n = 2m+1$. The $n$ measurement points within the window are shown in Equation (4), indexed in sequence as
$$x = -m, -m+1, \ldots, -1, 0, 1, \ldots, m-1, m \quad (4)$$
Least squares fitting is performed on the data in the window using a polynomial of degree $k-1$, as shown in Equation (5):
$$y = a_0 + a_1 x + a_2 x^2 + \cdots + a_{k-1} x^{k-1} \quad (5)$$
In the equation, x is the variable, a i is the polynomial coefficient, and y is the central point data.
Using Equation (5), $n$ equations are established, forming a linear system in $k$ unknowns, written in matrix form as $Y = XA + \varepsilon$, where $Y$ is the window data vector, $X$ is the design matrix of the independent variable, $A$ is the polynomial coefficient vector, and $\varepsilon$ is the residual vector:
$$\begin{bmatrix} y_1 \\ \vdots \\ y_m \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{k-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{k-1} \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_{k-1} \end{bmatrix} + \begin{bmatrix} \varepsilon_1 \\ \vdots \\ \varepsilon_n \end{bmatrix} \quad (6)$$
For the system of equations to have a solution, $n \geq k$ is required, and $n > k$ is generally selected. The polynomial coefficient vector $\hat{A}$ and the filtered values $\hat{Y}$ are determined by least squares fitting:
$$\hat{A} = (X^T X)^{-1} X^T Y \quad (7)$$
$$\hat{Y} = X\hat{A} = X (X^T X)^{-1} X^T Y \quad (8)$$
In SG filtering, it is found that the larger the window size $n$, the more effective signal is lost; and the larger the fitting order $k$, the weaker the denoising ability.
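As a sketch of how such smoothing might be applied in practice, SciPy's `savgol_filter` implements exactly this local least-squares polynomial fit; the synthetic signal, window length $n=21$, and fitting order $k-1=3$ below are illustrative choices, not values from the paper:

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical example: smooth a noisy leg angular-velocity trace.
t = np.linspace(0, 2 * np.pi, 200)
rng = np.random.default_rng(0)
omega_noisy = np.sin(t) + 0.1 * rng.standard_normal(200)

# window_length corresponds to n = 2m + 1 (odd) and polyorder to k - 1
# in the paper's notation.
omega_smooth = savgol_filter(omega_noisy, window_length=21, polyorder=3)
```

The smoothed trace preserves the shape and width of the underlying waveform while attenuating the high-frequency noise, which is the stated motivation for choosing the SG filter over a plain moving average.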
(3)
Zero Bias Compensation of Gyroscope
Due to the characteristics of the IMU, the gyroscope has drift error. In this paper, the drift error is reduced by deducting the zero bias, which is assumed to remain constant over the duration of the experiment. The gyroscope zero bias is calculated from the angular velocity during a period of standstill before navigation begins, and after each navigation update the zero bias is deducted from the IMU angular velocity. The angular velocity measured by the IMU is denoted as $\omega = (\omega_x, \omega_y, \omega_z)$; the mean of the three-axis angular velocity over the static interval, i.e., the zero bias, is denoted as $\bar{\omega} = (\bar{\omega}_x, \bar{\omega}_y, \bar{\omega}_z)$; and the angular velocity after deducting the zero bias is $\tilde{\omega}$:
$$\tilde{\omega} = \omega - \bar{\omega} \quad (9)$$
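Equation (9) amounts to subtracting the static-interval mean from every gyro sample; a minimal sketch (function and variable names are hypothetical):

```python
import numpy as np

def compensate_gyro_bias(omega, omega_static):
    """Subtract the static-interval mean (zero bias) from gyro samples.

    omega: (N, 3) angular-velocity samples during motion
    omega_static: (M, 3) samples from the standstill before navigation
    """
    bias = omega_static.mean(axis=0)  # omega_bar in Eq. (9)
    return omega - bias               # omega_tilde
```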

3.2.2. Recognition Algorithms for Dynamic and Static Mode

In the process of pedestrian movement, the difference of the pitch angle of both legs is periodic, which is reflected in the difference of the right angular velocity of the two legs.
$$\Delta\omega_{leg} = \omega_{ll} - \omega_{rl} \quad (10)$$
In Equation (10), ω l l is the right angular velocity of the left leg, and ω r l is the right angular velocity of the right leg. The leg angular velocity and gait segmentation are shown in Figure 4. In Figure 4a, the angular velocity of both legs is zero when a pedestrian is in a static state. When the pedestrian is moving, Δ ω l e g is 0 only when the sensors of both legs swing to be perpendicular to the ground again. In Figure 4b, the red line is the difference between the right angular velocities of the two legs. When Δ ω l e g is near the zero-crossing point, that means it has moved one step.
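A minimal sketch of this zero-crossing step segmentation, assuming the right angular velocities of both legs are available as equal-length arrays (function and variable names are hypothetical):

```python
import numpy as np

def segment_steps(omega_left, omega_right):
    """Return the sample indices where the left/right leg angular-velocity
    difference changes sign, taken here as step boundaries.
    """
    d = np.asarray(omega_left) - np.asarray(omega_right)  # Eq. (10)
    sign = np.sign(d)
    # indices i where the sign differs between sample i-1 and i
    crossings = np.where(np.diff(sign) != 0)[0] + 1
    return crossings
```

In practice the trace would be SG-filtered first, and samples exactly equal to zero would need a tie-breaking rule; both are omitted here for brevity.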
Through research, it is found that the combined Euclidean distance of the acceleration and angular velocity of the legs can clearly distinguish the static mode from motion, even for small movements. The combined Euclidean distance $\rho$ of acceleration and angular velocity is calculated as
$$\rho = \sqrt{\rho_1 \rho_1^T + \rho_2 \rho_2^T}, \qquad \rho_1 = f_{ll} - f_{rl}, \qquad \rho_2 = \omega_{ll} - \omega_{rl} \quad (11)$$
In Equation (11), $f$ and $\omega$ are the acceleration and angular velocity, respectively, and the subscripts $ll$ and $rl$ denote the left and right legs. Static and dynamic states are discriminated according to $\rho$, as shown in Equation (12): when $\rho$ is less than the threshold $Th_1$, the pedestrian is static and the flag is set to 1; otherwise, the pedestrian is in motion and the flag is set to 0.
$$flag = \begin{cases} 1, & \rho < Th_1 \\ 0, & \text{otherwise} \end{cases} \quad (12)$$
The recognition effect of the static and dynamic states is shown in Figure 5.
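Equations (11) and (12) can be sketched as follows; the threshold value used in the example is an assumed placeholder that would be tuned experimentally:

```python
import numpy as np

def motion_flag(f_ll, f_rl, w_ll, w_rl, th1):
    """Combined Euclidean distance of the leg acceleration and
    angular-velocity differences (Eq. 11); flag = 1 means static (Eq. 12).
    """
    rho1 = np.asarray(f_ll) - np.asarray(f_rl)
    rho2 = np.asarray(w_ll) - np.asarray(w_rl)
    rho = np.sqrt(rho1 @ rho1 + rho2 @ rho2)
    return 1 if rho < th1 else 0
```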

3.2.3. Recognition Algorithms for Dynamic State

Traditional classification methods mainly rely on thresholds. When there are many motion modes, many logical judgments are needed, and the recognition algorithm is not robust when the features change slightly during movement. Therefore, this paper studies pedestrian motion mode recognition based on machine learning. Machine learning classification methods mainly include supervised, unsupervised, semi-supervised, and reinforcement learning. Supervised learning builds a model from labeled training data and then predicts labels for the test data; pedestrian motion mode recognition is well suited to supervised learning. Supervised learning algorithms mainly include KNN, SVM, DT, and ANN. Since pedestrian movement is a time-series process with sequential characteristics, the movement data of a single moment cannot truly reflect the motion mode, and traditional machine learning methods are insensitive to sequence data. A recurrent neural network (RNN) has memory and is good at learning nonlinear features of data, but RNNs suffer from vanishing and exploding gradients during training. In this paper, the LSTM algorithm, a special RNN, is selected for motion mode recognition. LSTM inherits most features of the RNN model while solving its gradient problem; it learns the time-series regularities of data more effectively, strengthens the temporal dependence between data features, and has a certain memory for sequence data.
For the training set, the original data is preprocessed, then the classification eigenvalues are calculated, and the eigenvalues are input into the LSTM network to generate the classification network. For the test set, the original data are similarly processed, and the eigenvalues are calculated. Then the eigenvalues are input into the trained network to obtain the motion modes.
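To illustrate how an LSTM consumes a per-step feature sequence, the following minimal sketch implements a single LSTM cell's forward pass in NumPy with untrained random weights; the weight shapes, feature dimension, and the omission of the softmax classification head are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x_seq, W, U, b):
    """Minimal LSTM forward pass over a feature sequence.

    x_seq: (T, d) sequence of per-step feature vectors
    W: (4h, d), U: (4h, h), b: (4h,) parameters stacked as [i, f, g, o]
    Returns the final hidden state, which a dense + softmax layer would
    map to the four motion-mode classes.
    """
    h_dim = U.shape[1]
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    for x in x_seq:
        z = W @ x + U @ h + b
        i = sigmoid(z[0:h_dim])                # input gate
        f = sigmoid(z[h_dim:2 * h_dim])        # forget gate
        g = np.tanh(z[2 * h_dim:3 * h_dim])    # candidate cell state
        o = sigmoid(z[3 * h_dim:4 * h_dim])    # output gate
        c = f * c + i * g                      # cell state carries memory
        h = o * np.tanh(c)
    return h
```

The forget/input gates are what give the LSTM its selective memory over the step sequence; in practice the network would be trained on labeled step sequences with a deep-learning framework rather than hand-rolled as here.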
For the four common pedestrian motion modes of indoor walking, running, going upstairs, and going downstairs, firstly extract the eigenvalues to segment each step of the motion. The traditional method is to extract eigenvalues from the data in each step in the form of a sliding window. Usually, the recognition performance is improved by increasing the window length. However, there is a critical point in the window length. After the critical point is exceeded, the operation speed becomes slower, and the recognition accuracy does not improve much. For pedestrian motion mode recognition based on machine learning, the key technology is to find the eigenvalues with obvious features.
In this paper, the time-domain eigenvalues, such as mean value and mean square deviation, are extracted from the IMU data of two legs, and the calculated eigenvalues are coupled to generate new eigenvalues. The specific formula is as follows:
The mean value calculation expression is as Equation (13).
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i \quad (13)$$
In the equation, N is the number of samples, and x i is the sample value.
The mean square error is calculated as Equation (14).
$$\sigma = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}(x_i - \bar{x})^2} \quad (14)$$
The norm is calculated as Equation (15).
$$norm = \sqrt{\sum_{i=1}^{N} x_i^2} \quad (15)$$
The calculation formula of kurtosis is shown in Equation (16).
$$Kurtosis = \frac{\sum_{i=1}^{N}(x_i - \bar{x})^4}{(N-1)\,\sigma^4} - 3 \quad (16)$$
The calculation formula of skewness is shown in Equation (17).
$$SK = \frac{\sum_{i=1}^{N}(x_i - \bar{x})^3}{(N-1)\,\sigma^3} \quad (17)$$
The product of maximum and minimum values of forward acceleration of leg is calculated as Equation (18).
$$Para_1 = Acc_{y\max} \cdot Acc_{y\min} \quad (18)$$
The product of maximum and minimum values of right angular velocity of one leg is calculated as Equation (19).
$$Para_2 = \omega_{x\max} \cdot \omega_{x\min} \quad (19)$$
The maximum leg forward acceleration $Acc_{y\max}$ and the maximum leg right angular velocity $\omega_{x\max}$ are also used as eigenvalues.
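The per-step features of Equations (13)-(19) might be computed as in the following sketch; the function name, input layout, and the choice to compute the statistical features over the right angular velocity are illustrative assumptions:

```python
import numpy as np

def step_features(acc_y, omega_x):
    """Time-domain features of one gait step (Eqs. 13-19)."""
    x = np.asarray(omega_x, dtype=float)
    n = len(x)
    mean = x.mean()                                            # Eq. (13)
    sigma = np.sqrt(((x - mean) ** 2).sum() / (n - 1))         # Eq. (14)
    norm = np.sqrt((x ** 2).sum())                             # Eq. (15)
    kurt = ((x - mean) ** 4).sum() / ((n - 1) * sigma ** 4) - 3  # Eq. (16)
    skew = ((x - mean) ** 3).sum() / ((n - 1) * sigma ** 3)      # Eq. (17)
    ay = np.asarray(acc_y, dtype=float)
    para1 = ay.max() * ay.min()                                # Eq. (18)
    para2 = x.max() * x.min()                                  # Eq. (19)
    return np.array([mean, sigma, norm, kurt, skew,
                     para1, para2, ay.max(), x.max()])
```

One feature vector per segmented step would then be fed to the LSTM classifier.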

4. Integrated Navigation Optimization Algorithm Based on ZUPT/Waist/Foot Heading Change

4.1. Navigation State and Measurement Modeling Based on ZUPT

The position, velocity, and attitude calculated by the pure SINS recursive algorithm will diverge with time, so other auxiliary methods need to be introduced to suppress the accumulation of errors. In this paper, the ZUPT method and the combination of waist/foot heading changes are used to periodically correct the navigation error, which can effectively improve the accuracy of the navigation system.
During pedestrian movement, the motion of the foot can be roughly divided into four stages: standstill, heel-off, swing in the air, and landing. When the heel leaves the ground, the foot undergoes an obvious rotation; after lift-off, the foot swings in the air as the body moves forward; after the swing is completed, the heel lands and then the forefoot touches the ground, producing a small-amplitude pulse oscillation in the acceleration. In the zero-velocity stage, the velocity is close to zero and the modulus of the acceleration is approximately the gravitational acceleration. The process of foot movement is shown in Figure 6. Experiments show that an inertial sensor installed on the instep senses foot movement changes most sensitively and reflects the movement process of the foot most realistically.
The 18-dimensional linear Kalman filter state vector is constructed as shown in Equation (20), where $\varphi_e, \varphi_n, \varphi_u$ are the platform angle errors; $\delta V_e, \delta V_n, \delta V_u$ are the velocity errors; $\delta L, \delta\lambda, \delta h$ are the position errors; $\varepsilon_{bx}, \varepsilon_{by}, \varepsilon_{bz}$ are the gyro random constant drifts; $\varepsilon_{rx}, \varepsilon_{ry}, \varepsilon_{rz}$ are the gyro first-order Markov process random errors; and $\nabla_{rx}, \nabla_{ry}, \nabla_{rz}$ are the accelerometer first-order Markov process random errors.
X = [φ_e, φ_n, φ_u, δV_e, δV_n, δV_u, δL, δλ, δh, ε_bx, ε_by, ε_bz, ε_rx, ε_ry, ε_rz, ∇_rx, ∇_ry, ∇_rz]^T
The error state equation of the established system is shown in Equation (21).
Ẋ(t)_(18×1) = A(t)_(18×18) X(t) + G(t)_(18×9) W(t)_(9×1)
Among them, A is the system matrix, X is the system state vector, G is the system noise matrix, and W is the system noise.
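Inside the Kalman filter, the continuous error model of Equation (21) is propagated in discrete time. As a minimal sketch (a first-order discretization, not the authors' implementation; variable names are illustrative), the prediction step can be written as:

```python
import numpy as np

def kf_predict(x, P, A, G, Q, dt):
    """One discrete prediction step of the 18-state error filter.

    x : (18,) error-state vector, P : (18,18) covariance,
    A : (18,18) continuous system matrix, G : (18,9) noise matrix,
    Q : (9,9) process-noise density, dt : sample period (s).
    """
    Phi = np.eye(18) + A * dt     # first-order transition matrix
    Qd = G @ Q @ G.T * dt         # discretized process noise
    x = Phi @ x                   # state propagation
    P = Phi @ P @ Phi.T + Qd      # covariance propagation
    return x, P
```

In practice, the transition matrix Φ would be recomputed each sample from the current attitude and specific force.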
In the inertial/zero-velocity integrated navigation system, the differences between the velocity v_INS and position p_INS output by the inertial navigation at the current time and the corresponding velocity v_Last and position p_Last at the previous sampling time are taken as the measurement. In the zero-velocity interval, Equation (22) theoretically holds.
Δv = v_INS − v_Last = 0
Δp = p_INS − p_Last = 0
When the zero-velocity interval is detected, Z Z U P T ( t ) is the measurement information, and the measurement equation is shown in Equation (23).
Z_ZUPT(t) = [v_e^INS − v_e^Last, v_n^INS − v_n^Last, v_u^INS − v_u^Last, (L^INS − L^Last)·R_n, (λ^INS − λ^Last)·R_e·cos L, h^INS − h^Last]^T = H_ZUPT(t)·X(t) + N_ZUPT(t)
In Equation (23), H is the measurement matrix, X the state vector, and N the measurement noise; e, n, and u denote the east, north, and up directions, respectively. v_e^Last, v_n^Last, v_u^Last are the zero velocities, namely 0, 0, and 0. L^INS, λ^INS, h^INS are the position solved by inertial navigation at the current moment, and L^Last, λ^Last, h^Last the position solved by inertial navigation at the previous moment. R_n and R_e are the meridian and prime-vertical radii of curvature, and L is the latitude.
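To make the measurement construction concrete, the following sketch builds the six-dimensional ZUPT measurement vector of Equation (23) from the INS outputs; the function name and interfaces are illustrative assumptions, not the authors' code:

```python
import numpy as np

def zupt_measurement(v_ins, p_ins, p_last, Rn, Re):
    """Build the 6-dim ZUPT measurement of Eq. (23).

    v_ins  : (3,) INS velocity (east, north, up), m/s
    p_ins  : (3,) INS position (lat rad, lon rad, height m)
    p_last : (3,) position at the previous sampling time
    Rn, Re : meridian / prime-vertical radii of curvature (m)
    """
    L_ins, lam_ins, h_ins = p_ins
    L_last, lam_last, h_last = p_last
    return np.array([
        v_ins[0], v_ins[1], v_ins[2],               # velocity residuals
        (L_ins - L_last) * Rn,                      # latitude difference (m)
        (lam_ins - lam_last) * Re * np.cos(L_ins),  # longitude difference (m)
        h_ins - h_last,                             # height difference (m)
    ])
```

During a zero-velocity interval the reference velocity is zero, so the velocity residual reduces to the INS velocity itself.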

4.2. Measurement Modeling Based on Waist/Foot Heading Changes

The waist of a pedestrian also moves periodically. The decomposition diagram of waist motion is shown in Figure 7. From the motion characteristics of the waist angular velocity, the z-axis acceleration and z-axis angular velocity change the most, that is, the waist heading changes most obviously. The orientation of the waist-mounted IMU returns to its original orientation twice for every two steps.
This paper studies pedestrian navigation based only on the pedestrian's own inertial sensors. The position and velocity errors can be corrected by the ZUPT algorithm, but there is no external observation available to correct the heading. During movement, the heading is calculated from the pure inertial data of the waist and foot.
Figure 8 compares the waist and foot headings while walking a closed rectangular path. For ease of observation, the initial waist and foot headings are set to the same value during parameter initialization. Figure 8 shows that the waist heading is more stable than the foot heading. Figure 8c does not present a straight line as in Figure 8a because the foot swings in the air. The enlarged detail in Figure 8d shows that when the foot is in the zero-velocity interval, its heading tends to be stable.
The movement of the waist is smoother than that of the foot, and its heading error is smaller. However, in actual movement the directions of the waist and foot are not completely consistent, so the waist heading cannot be used directly as an observation. Theoretically, the change of waist heading Δψ_waist and the change of foot heading Δψ_foot should both be 0° for each step of walking, so the difference Δψ between Δψ_waist and Δψ_foot can be constructed as a virtual observation. The observation information is given in Equation (24).
Δψ = Δψ̄_waist − Δψ̄_foot
Δψ̄_waist = ψ̄_waist^k − ψ̄_waist^(k−1)
Δψ̄_foot = ψ̄_foot^k − ψ̄_foot^(k−1)
In the equation, ψ̄_waist^k and ψ̄_waist^(k−1) are the mean waist headings of the current and previous step, respectively, and ψ̄_foot^k and ψ̄_foot^(k−1) the mean foot headings of the current and previous step.
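The per-step virtual observation of Equation (24) can be sketched as follows, assuming each argument holds the heading samples of one step (the function name and interfaces are illustrative assumptions):

```python
import numpy as np

def heading_change_observation(psi_waist_k, psi_waist_km1,
                               psi_foot_k, psi_foot_km1):
    """Virtual observation of Eq. (24): difference between the per-step
    waist and foot heading changes. Each argument is an array of heading
    samples (deg) collected within one step."""
    d_waist = np.mean(psi_waist_k) - np.mean(psi_waist_km1)  # waist change
    d_foot = np.mean(psi_foot_k) - np.mean(psi_foot_km1)     # foot change
    return d_waist - d_foot
```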
The measurement equation is shown in Equation (25).
Z_ψ(t) = Δψ̄_waist − Δψ̄_foot = H_ψ(t)·X(t) + N_ψ(t)
The measurement matrix is given in Equation (26).
H_ψ(t) = [sin ψ*·sin θ / cos θ, cos ψ*·sin θ / cos θ, 1, 0_(1×15)]
where θ is the pitch angle.
To sum up, the measurement equation of the combined system is shown in Equation (27).
Z(t) = [v_e^INS − v_e^Last, v_n^INS − v_n^Last, v_u^INS − v_u^Last, (L^INS − L^Last)·R_n, (λ^INS − λ^Last)·R_e·cos L, h^INS − h^Last, Δψ̄_waist − Δψ̄_foot]^T = H(t)·X(t) + N(t) = [H_ZUPT(t); H_ψ(t)]·X(t) + [N_ZUPT(t); N_ψ(t)]

5. Experiment and Result Analysis

5.1. Experimental Conditions

The MTW inertial sensor suit from the Xsens company (Netherlands) is used in the experiment. The appearance and orientation of the sensor are shown in Figure 9a, and the wearing positions of the inertial sensor network in Figure 9b. The sampling frequency of the accelerometer and gyroscope is 100 Hz, and that of the barometer is 50 Hz. A single sensor has a mass of 27 g and a size of 34.5 × 57.8 × 14.58 mm. The performance parameters of the MTW are given in Table 1.
Comprehensive tests were conducted in an indoor scenario, from the third floor to the fifth floor of the No. 1 Building of the College of Automation Engineering. The indoor corridor scene is shown in Figure 10.

5.2. Analysis of Experimental Results

In order to verify the validity of the motion-mode recognition algorithm of the inertial sensor network and the ZUPT + WF navigation algorithm proposed in this paper, two groups of repeated experiments were conducted, with walking as the main motion mode. The three-dimensional path of the experiment is shown in Figure 11a: the pedestrian starts on the third floor, walks and goes upstairs to the fifth floor, then runs, walks, goes downstairs to the third floor, and walks back to the origin. To evaluate the positioning error, the motion trajectory is projected onto a two-dimensional plane and four reference points are set on the experimental path, as shown in Figure 11b; ①, ②, ③, and ④ indicate the sequence of the walking path. The coordinates of the reference points are (0, 0), (54.6, 0), (54.6, 47.7), and (0, 47.7), in meters. The total length of the four straight corridor segments is 204.6 m, and the vertical distance from the third floor to the fifth floor is 7.59 m.

5.2.1. The Effect of Data Preprocessing

Every time the IMU is powered on to record motion data, the body is kept still for 5 s to deduct the gravity component. Comparisons before and after deducting the gravity component by projection are shown in Figure 12.
An SG (Savitzky-Golay) filter is applied to the angular velocity of the legs; comparisons before and after filtering are shown in Figure 13. After filtering, the angular-velocity curve is smooth, with obvious periodicity and a sinusoidal shape.
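For illustration, a minimal NumPy-only Savitzky-Golay smoother in the spirit of the SG filtering used here is sketched below. In practice a library implementation such as scipy.signal.savgol_filter would normally be used; the window size and polynomial order are assumptions to be tuned:

```python
import numpy as np

def savitzky_golay(y, window, order):
    """Minimal Savitzky-Golay smoother: fit a polynomial of the given
    order in each centered window of odd length `window` and take its
    value at the window center. Edges are left unsmoothed for brevity."""
    half = window // 2
    out = np.asarray(y, dtype=float).copy()
    t = np.arange(-half, half + 1)            # local abscissa of the window
    for i in range(half, len(out) - half):
        seg = out[i - half:i + half + 1]
        coeffs = np.polyfit(t, seg, order)    # least-squares polynomial fit
        out[i] = np.polyval(coeffs, 0.0)      # smoothed value at the center
    return out
```

Because the fit is exact for any signal that is itself a polynomial of the chosen order, the smoother preserves the peaks of the roughly sinusoidal leg angular-rate curve better than a plain moving average.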

5.2.2. Algorithm Verification of Pedestrian Gait Segmentation and Motion Mode Recognition

Of the two groups of experiments, the first is taken as an example for detailed illustration. In motion-mode recognition, one single-leg swing is taken as one step; according to the experimental data, the actual total number of steps is 434. Misrecognized gait segmentation occurs mainly in the start and stop phases of movement, where the motion of a single leg is small and the eigenvalues are close, making misjudgment likely, as shown in Figure 14.
After statistics, the gait segmentation recognition rates of the two groups of experiments are shown in Table 2.
Since the motion amplitude of one leg is small and the eigenvalues are close at the start and stop stages, the motion modes are also easily confused there. During continuous motion, however, eigenvalues close to those of other modes rarely appear, so the mode recognition rate is high. In running mode, the pedestrian's velocity increases gradually at the start and decreases before stopping; the two steps before and after running can be regarded as walking mode and are therefore not counted as errors. Going upstairs from the third to the fifth floor and going downstairs from the fifth to the third floor are performed continuously, including movement across the platforms at the corners; the modes in continuous motion are corrected by filtering and smoothing algorithms. Table 3 compares the accuracy of the motion-mode recognition methods, and Figure 15 shows the recognition results of multi-motion modes based on the LSTM algorithm. The KNN algorithm has the lowest recognition rate for going downstairs; the DT algorithm is best for walking but poor for going downstairs; the overall accuracy of the SVM algorithm exceeds 90%; overall, the LSTM algorithm has the highest accuracy. Among the misrecognized points, no running is misrecognized as another mode, nor is any other mode misrecognized as running, so the zero-velocity detection in the subsequent navigation algorithm is not affected.

5.2.3. ZUPT Algorithm and Verification

Based on the accuracy requirement of pedestrian navigation, the zero-velocity interval must be recognized accurately. Aiming at the insufficient accuracy and weak robustness of zero-velocity detection with a single threshold, a multi-threshold combined zero-velocity detection method is constructed. The discriminant eigenvalues are the acceleration mean square error, acceleration modulus, angular-velocity mean square error, and angular-velocity modulus within a sliding window. The specific sub-discriminants are as follows:
a_i = √(f_x² + f_y² + f_z²)
σ_a = √( (1/(k+1)) · Σ_{i=n}^{n+k} (a_i − ā)² )
λ_1(k) = { 1, σ_a < ε_a1; 0, others }
λ_2(k) = { 1, |a_i − g| < ε_a2; 0, others }
ω_i = √(ω_x² + ω_y² + ω_z²), σ_ω = √( (1/(k+1)) · Σ_{i=n}^{n+k} (ω_i − ω̄)² ), λ_3(k) = { 1, σ_ω < ε_ω1; 0, others }
λ_4(k) = { 1, ω_i < ε_ω2; 0, others }
In Equations (28)–(31), k is the sliding-window size, f the specific force, and ω the angular velocity; x, y, and z denote the three axes of the accelerometer and gyroscope. ε_a1 and ε_a2 are the set acceleration mean-square-error and acceleration-modulus thresholds, respectively, and ε_ω1 and ε_ω2 the set gyro mean-square-error and gyro-modulus thresholds. The zero-velocity interval is detected by combining these four discriminants, which improves the detection accuracy.
Because the velocity in the running state is fast, the zero-velocity interval is shorter than in the other three motion modes and the eigenvalues change greatly. The characteristics of smooth motion (walking, going upstairs, going downstairs) and non-smooth motion (running) are therefore analyzed, and different zero-velocity detection criteria are set for each. When the motion mode is walking, going upstairs, or going downstairs, the comprehensive zero-velocity judgment is given by Equation (32).
Z_1(k) = { 1, λ_1(k) && λ_2(k) && λ_3(k) && λ_4(k); 0, others }
When the motion mode is running,
Z_2(k) = { 1, (λ_1(k) || λ_3(k)) && (λ_2(k) || λ_4(k)); 0, others }
The zero-velocity detection results are shown in Figure 16. For ease of viewing, the value of the zero-velocity flag is scaled up in the figures. The zero-velocity detection method studied in this paper accurately recognizes the zero-velocity interval and provides a basis for the subsequent ZUPT-based pedestrian navigation.
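A compact sketch of the multi-threshold detector of Equations (28)–(33), with the strict AND combination for smooth motion and the relaxed combination for running, might look as follows (thresholds, interfaces, and the per-sample evaluation are illustrative assumptions, not the authors' code):

```python
import numpy as np

def zero_velocity_flags(acc, gyr, k, eps_a1, eps_a2, eps_w1, eps_w2,
                        g=9.81, running=False):
    """Sliding-window multi-threshold zero-velocity detector sketch.

    acc, gyr : (N,3) specific force (m/s^2) and angular rate (rad/s)
    k        : sliding-window size (samples)
    eps_*    : thresholds, tuned per motion mode
    """
    a = np.linalg.norm(acc, axis=1)         # acceleration modulus a_i
    w = np.linalg.norm(gyr, axis=1)         # angular-rate modulus w_i
    flags = np.zeros(len(a), dtype=bool)
    for n in range(len(a) - k):
        l1 = np.std(a[n:n + k + 1]) < eps_a1   # accel. MSE discriminant
        l2 = abs(a[n] - g) < eps_a2            # modulus close to gravity
        l3 = np.std(w[n:n + k + 1]) < eps_w1   # gyro MSE discriminant
        l4 = w[n] < eps_w2                     # gyro modulus discriminant
        if running:                            # relaxed combination, Eq. (33)
            flags[n] = (l1 or l3) and (l2 or l4)
        else:                                  # strict AND, Eq. (32)
            flags[n] = l1 and l2 and l3 and l4
    return flags
```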
The statistics of the number of movement steps for the two comparative experiments are shown in Table 4. For the same experimenter, the stride and velocity of the movement are similar, so the step counts of the two groups are almost the same. In these statistics, one step is one swing of the right foot, with a zero-velocity interval between consecutive steps. Compared with the step count used in motion-mode recognition, the number of zero-velocity intervals is therefore halved; the detection rates of the two groups are 99.5% and 100%, respectively.

5.2.4. Verification of Combined Algorithm Based on Zero-Velocity/Waist Heading Change Constraints

To ensure the reliability of the experimental results, the coordinate points (0, 0), (54.6, 0), (54.6, 47.7), and (0, 47.7) at the four corners of the experimental path are used as reference points to calculate the error between the position solved by the algorithm and the actual position. The Root Mean Squared Error (RMSE) is used as the error metric, calculated by Equation (34):
RMSE = √( (1/n) · Σ_{i=1}^{n} (X_obs,i − X_ref,i)² )
Among them, n is 4, X o b s , i is the estimated position of the algorithm, X r e f , i is the reference point position, and RMSE is the final positioning error.
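The RMSE of Equation (34) over the reference points can be computed as in the following short sketch (the planar-coordinate interface is an assumption for illustration):

```python
import numpy as np

def rmse(estimated, reference):
    """RMSE of Eq. (34): estimated and reference are (n,2) arrays of
    planar coordinates in meters; returns the positioning error in m."""
    estimated = np.asarray(estimated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    err = np.linalg.norm(estimated - reference, axis=1)  # per-point error
    return float(np.sqrt(np.mean(err ** 2)))
```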
The LSTM + ZUPT + WF navigation algorithm proposed in this paper is compared with the traditional LSTM + ZUPT algorithm. The positioning results are shown in Figure 17, and the error comparison is given in Table 5. In Figure 17a, the red line represents the proposed LSTM + ZUPT + WF algorithm, the blue line the traditional LSTM + ZUPT algorithm, and the green line the SVM + ZUPT + WF algorithm. The proposed LSTM + ZUPT + WF algorithm achieves higher precision and a lower RMSE than both the traditional LSTM + ZUPT algorithm and SVM + ZUPT + WF. In Figure 17b, the blue line represents the 3D trajectory of the LSTM + ZUPT + WF navigation algorithm.

6. Discussion

The purpose of this paper is to develop an autonomous multi-motion-mode recognition and navigation optimization method for indoor pedestrians. A mode recognition algorithm based on gait segmentation is studied, which accurately segments the four motion modes of walking, running, going upstairs, and going downstairs. According to the motion characteristics within each step of the four modes, highly differentiated characteristic values are selected to train an LSTM network for motion mode recognition. Because traditional zero-velocity interval detection considers only the walking state, its judgment criterion is not applicable to multiple motion modes. Exploiting the fact that the zero-velocity interval in the running state is shorter than in the other three modes, the motion modes are divided into two categories on the basis of mode recognition (walking, going upstairs, and going downstairs in one category, running in the other), and a hierarchical zero-velocity interval recognition framework is constructed. In the traditional LSTM + ZUPT pedestrian navigation algorithm, the heading drifts gradually over time because there is no external heading correction. Therefore, a navigation method based on waist/foot heading information is studied, and a pedestrian navigation method based on LSTM + ZUPT + WF is proposed: a virtual observation is constructed from the difference between the pedestrian's own waist and foot heading changes to correct the navigation heading. To verify the effectiveness of the proposed method, two sets of indoor comprehensive experiments are conducted, covering walking, running, going upstairs, and going downstairs, with walking accounting for the highest proportion.
The experimental data verify that the accuracy of the proposed gait segmentation algorithm exceeds 98.60%, the comprehensive recognition rate of the four motion modes is 96.77%, and the detection rate of the zero-velocity interval is above 99.00%. Compared with the traditional LSTM + ZUPT navigation algorithm, the RMSE of the LSTM + ZUPT + WF algorithm is reduced by 59.70%. For the horizontal projection of the pedestrian trajectory, the difference between the start point and the end point is 2.59 m, an error of 1.26% of the total distance. Reference [18] uses the SVM algorithm for recognition; compared with SVM + ZUPT + WF, LSTM + ZUPT + WF achieves both better classification accuracy and better navigation accuracy. Without relying on external sensors, the proposed method provides high-precision navigation and positioning information for indoor pedestrians with multiple motion modes, which is of great significance for the practical application of pedestrian inertial navigation.
Although the algorithm proposed in this paper provides high-precision navigation and positioning information, it still has some limitations. First, the method does not use external sensor assistance, so the pedestrian's initial heading must be set manually. Second, only four common motion modes have been studied so far; special indoor tasks may also involve jumping, retreating, sideways movement, or even crawling and climbing, which can be investigated according to actual needs in the future. Third, if the heading cannot be corrected over a longer period of movement, UWB anchors can be deployed in advance at key positions along the route, such as corridor corners. Finally, only single-pedestrian navigation is studied; for special indoor tasks, indoor features could be fully utilized, or information could be shared among task team members, to further improve navigation and positioning accuracy. The proposed algorithm achieves high accuracy in practical navigation and positioning, has good research value, and can support multi-person cooperative navigation research in the future.

Author Contributions

Conceptualization, Z.W., Z.X. and L.X.; methodology, Z.W.; validation, Z.W., Y.D. and Y.S.; formal analysis, Z.W.; software, Z.W.; data curation, Y.D. and Y.S.; supervision, Z.X. and L.X., writing—original draft, Z.W., Z.X. and L.X.; writing—review and editing, Z.W., Z.X., Y.D., Y.S. and L.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 61873125; The National Natural Science Foundation of China, grant number 62073163; The National Natural Science Foundation of China, grant number 62103285; Support for projects in special zones for national defense science and technology innovation; The advanced research project of the equipment development grant number 30102080101; National Basic Research Program, grant number JCKY2020605C009; The Natural Science Fund of Jiangsu Province, grant number BK20181291; The Aeronautic Science Foundation of China, grant number ASFC-2020Z071052001; The Aeronautic Science Foundation of China, grant number 202055052003; The Fundamental Research Funds for the Central Universities, grant number NZ2020004; Shanghai Aerospace Science and Technology Innovation Fund, grant number SAST2019-085; Introduction plan of high end experts, grant number G20200010142. Foundation of Key Laboratory of Navigation, Guidance and Health-Management Technologies of Advanced Aerocraft (Nanjing Univ. of Aeronautics and Astronautics), Ministry of Industry and Information Technology, Jiangsu Key Laboratory “Internet of Things and Control Technologies” & the Priority Academic Program Development of Jiangsu Higher Education Institutions, Science and Technology on Avionics Integration Laboratory. Supported by the 111 Project, grant number B20007.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Liu, Y.; Cai, T.; Yang, H.; Liu, C.; Song, J.; Yu, M. The Pedestrian Integrated Navigation System with Micro IMU/GPS/Magnetometer/Barometric Altimeter. Gyroscopy Navig. 2016, 7, 29–38.
2. Fan, Q.; Sun, B.; Sun, Y.; Zhuang, X. Performance Enhancement of MEMS-Based INS/UWB Integration for Indoor Navigation Applications. IEEE Sens. J. 2017, 17, 3116–3130.
3. Chen, J.; Ou, G.; Peng, A.; Zheng, L.; Shi, J. An INS/WiFi Indoor Localization System Based on the Weighted Least Squares. Sensors 2018, 18, 1458.
4. Zhu, N.; Zhao, H.; Feng, W.; Wang, Z. A Novel Particle Filter Approach for Indoor Positioning by Fusing WiFi and Inertial Sensors. Chin. J. Aeronaut. 2015, 28, 1725–1734.
5. Zhao, Q.; Zhang, B.X.; Lyu, S.C.; Zhang, H.; Sun, D.; Li, G.; Feng, W. A CNN-SIFT Hybrid Pedestrian Navigation Method Based on First-Person Vision. Remote Sens. 2018, 10, 1229.
6. Xu, L.; Xiong, Z.; Liu Wang, Z.; Ding, Y. A Novel Pedestrian Dead Reckoning Algorithm for Multi-Mode Recognition Based on Smartphones. Remote Sens. 2019, 11, 294.
7. Ding, Y.; Xiong, Z.; Li, W.; Cao, Z.; Wang, Z. Pedestrian Navigation System with Trinal-IMUs for Drastic Motions. Sensors 2020, 20, 5570.
8. Foxlin, E. Pedestrian Tracking with Shoe-Mounted Inertial Sensors. IEEE Comput. Graph. Appl. 2005, 25, 38–46.
9. Tian, X.; Chen, J.; Han, Y.; Shang, J.; Li, N. A Novel Zero Velocity Interval Detection Algorithm for Self-Contained Pedestrian Navigation System with Inertial Sensors. Sensors 2016, 16, 1578.
10. Ceron, J.D.; Martindale, C.F.; Diego, M.; López, D.M.; Kluge, F.; Eskofier, B.M. Indoor Trajectory Reconstruction of Walking, Jogging, and Running Activities Based on a Foot-Mounted Inertial Pedestrian Dead-Reckoning System. Sensors 2020, 20, 651.
11. Li, W.; Xiong, Z.; Ding, Y.; Cao, Z.; Wang, Z. Lower Limb Model Based Inertial Indoor Pedestrian Navigation System for Walking and Running. IEEE Access 2021, 9, 42059–42070.
12. Xu, Y.; Xi-yuan, C. Indoor Pedestrian Navigation Based on Double-IMU Framework. J. Chin. Inert. Technol. 2015, 23, 71.
13. Lee, M.S.; Ju, H.; Song, J.W.; Park, C.G. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking. Sensors 2015, 15, 28129–28153.
14. Wang, H.; Cong, L.; Qin, H. A Real-Time Pedestrian Dead Reckoning System With FM-Aided Motion Mode Recognition. IEEE Sens. J. 2018, 19, 3020–3032.
15. Shi, L.F.; Zhao, Y.L.; Liu, G.X.; Chen, S.; Wang, Y.; Shi, Y.F. A Robust Pedestrian Dead Reckoning System Using Low-Cost Magnetic and Inertial Sensors. IEEE Trans. Instrum. Meas. 2019, 68, 2996–3003.
16. Niu, X.; Li, Y.; Kuang, J.; Zhang, P. Data Fusion of Dual Foot-Mounted IMU for Pedestrian Navigation. IEEE Sens. J. 2019, 19, 4577–4584.
17. Liu, Y.; Yu, Y.; Lu, Y.L.; Guo, J.Q.; Di, K.; Chen, Y.W. Real-Time Human Activity Recognition Based on Time-Domain Features of Multi-Sensor. J. Chin. Inert. Technol. 2017, 25, 455–460.
18. Wagstaff, B.; Peretroukhin, V.; Kelly, J. Improving Foot-Mounted Inertial Navigation Through Real-Time Motion Classification. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017.
19. Wang, B.; Liu, X.; Yu, B.; Jia, R. Posture Recognition and Adaptive Step Detection Based on Hand-held Terminal. In Proceedings of the 2018 Ubiquitous Positioning, Indoor Navigation and Location-Based Services (UPINLBS), Wuhan, China, 22–23 March 2018.
20. Kasebzadeh, P.; Radnosrati, K.; Hendeby, G.; Gustafsson, F. Joint Pedestrian Motion State and Device Pose Classification. IEEE Trans. Instrum. Meas. 2019, 69, 5862–5874.
21. Elhoushi, M.; Georgy, J.; Noureldin, A.; Korenberg, M.J. Motion Mode Recognition for Indoor Pedestrian Navigation Using Portable Devices. IEEE Trans. Instrum. Meas. 2016, 65, 208–221.
22. Ahmed, D.B.; Frank, K.; Heirich, O. Recognition of Professional Activities with Displaceable Sensors. In Proceedings of the 2015 IEEE 82nd Vehicular Technology Conference (VTC2015-Fall), Boston, MA, USA, 6–9 September 2015; pp. 1–5.
23. Yiyan, L.; Fang, Z.; Wenhua, S.; Haiyong, L. A Hidden Markov Model Based Complex Walking Pattern Recognition Algorithm. In Proceedings of the 2016 Fourth International Conference on Ubiquitous Positioning, Indoor Navigation and Location Based Services (UPINLBS), Shanghai, China, 9 January 2017.
Figure 1. Four-node inertial sensor network and pedestrian navigation system.
Figure 2. Schematic diagram of multi-motion modes. (a) Static; (b) Walking; (c) Running; (d) Going upstairs; (e) Going downstairs.
Figure 3. Scheme of the autonomous method for multi-motion modes recognition and zero-velocity detection for indoor pedestrian.
Figure 4. Diagram of angular velocity of legs and gait segmentation: (a) X-axis angular velocity of legs; and (b) gait segmentation.
Figure 5. Recognition diagram of dynamic and static state.
Figure 6. Decomposition diagram of foot motion: (a) static; (b) heel off the ground; (c) swing in the air; (d) toe to the ground; and (e) static.
Figure 7. Decomposition diagram of waist motion.
Figure 8. Comparison of heading of waist and heading of foot: (a) heading of waist; (b) enlarged view of heading of waist; (c) heading of foot; and (d) enlarged view of heading of foot.
Figure 9. Appearance and installation configuration of MTW suit: (a) sensor appearance and object coordinate system; and (b) four-node inertial sensor network configuration.
Figure 10. Indoor experimental scene.
Figure 11. Trajectory of the experiment: (a) three-dimensional trajectory; and (b) two-dimensional trajectory projection.
Figure 12. Comparison diagrams before and after deducting gravity component. (a) Y-axis acceleration before and after deducting gravity component; (b) Z-axis acceleration before and after deducting gravity component.
Figure 13. Comparison diagrams before and after SG filter: (a) X-axis angular velocity before and after SG filter; and (b) Y-axis angular velocity before and after SG filter.
Figure 14. Diagram of gait segmentation.
Figure 15. Recognition of multi-motion modes with LSTM for pedestrians.
Figure 16. Detections of zero-velocity interval: (a) walking; (b) running; (c) going upstairs; and (d) going downstairs.
Figure 17. Pedestrian experimental trajectory: (a) two-dimensional trajectory projection; and (b) three-dimensional trajectory.
Table 1. Parameters of MTW.
 | Full Scale | Bias Stability
Acceleration | ±160 m/s² | /
Angular velocity | ±1200 deg/s | 20 deg/h
Pressure | 300–1100 mBar | 100 Pa/year
Table 2. Recognition rate of gait segmentation.
 | Number of Real Steps | Number of Steps Detected | Accuracy
Experiment 1 | 434 | 437 | 99.31%
Experiment 2 | 430 | 436 | 98.60%
Table 3. Recognition results of multi-motion modes.
Motion Mode | Real Steps | KNN (%) | DT (%) | SVM (%) | Recognized Steps with LSTM | LSTM (%)
Walking | 270 | 95.56 | 98.52 | 93.33 | 260 | 96.30
Running | 52 | 96.15 | 98.08 | 96.15 | 52 | 100.00
Going upstairs | 56 | 94.23 | 91.07 | 92.86 | 56 | 100.00
Going downstairs | 56 | 82.14 | 26.79 | 89.29 | 52 | 92.86
Subtotal | 434 | 93.78 | 88.25 | 93.55 | 420 | 96.77
Table 4. Detection of zero-velocity interval.
 | Motion Modes | Real Steps | Detected Steps | Accuracy
Experiment 1 | Walking + going upstairs + going downstairs | 182 | 183 | 99.85%
Experiment 1 | Running | 25 | 25 |
Experiment 2 | Walking + going upstairs + going downstairs | 182 | 182 | 100%
Experiment 2 | Running | 25 | 25 |
Table 5. Error comparison of positioning results.
 | LSTM + ZUPT: Start–End Distance (m) | LSTM + ZUPT: RMSE (m) | SVM + ZUPT + WF: Start–End Distance (m) | SVM + ZUPT + WF: RMSE (m) | LSTM + ZUPT + WF: Start–End Distance (m) | LSTM + ZUPT + WF: RMSE (m)
Experiment 1 | 4.21 | 7.71 | 3.21 | 4.88 | 2.59 | 2.67
Experiment 2 | 4.89 | 10.23 | 4.72 | 5.09 | 2.76 | 4.12

Share and Cite

MDPI and ACS Style

Wang, Z.; Xiong, Z.; Xing, L.; Ding, Y.; Sun, Y. A Method for Autonomous Multi-Motion Modes Recognition and Navigation Optimization for Indoor Pedestrian. Sensors 2022, 22, 5022. https://doi.org/10.3390/s22135022
