Article

SINS/Landmark Integrated Navigation Based on Landmark Attitude Determination

by Shuqing Xu, Haiyin Zhou, Jiongqi Wang, Zhangming He and Dayi Wang
1 College of Liberal Arts and Sciences, National University of Defense Technology, Changsha 410073, China
2 Beijing Institute of Spacecraft System Engineering, China Academy of Space Technology, Beijing 100094, China
* Author to whom correspondence should be addressed.
Sensors 2019, 19(13), 2917; https://doi.org/10.3390/s19132917
Submission received: 30 May 2019 / Revised: 22 June 2019 / Accepted: 25 June 2019 / Published: 1 July 2019
(This article belongs to the Section Remote Sensors)

Abstract

Because the traditional SINS (strapdown inertial navigation system)/CNS (celestial navigation system) integrated navigation system cannot provide all-day, all-weather navigation, this paper proposes a SINS/Landmark integrated navigation method based on landmark attitude determination. The integrated navigation system takes SINS as the basic scheme and uses landmark navigation to correct the errors of SINS. Attitude determination is carried out by feature matching of the landmark information photographed by the landmark camera. The principle of landmark navigation and the attitude determination process are discussed, and the feasibility of landmark attitude determination is analyzed, including the orthogonality of the attitude transformation matrix and the influence of factors such as the number and geometric distribution of landmarks. On this basis, the paper constructs the equations of the SINS/Landmark integrated navigation system, verifies the effectiveness of landmark attitude determination in the integrated navigation by Kalman filtering, and improves the navigation precision of the system.

1. Introduction

Owing to their strong autonomy, anti-interference capability and good concealment, SINS and CNS are widely used [1,2]. Inertial devices provide attitude, velocity and position data, while star sensors measure information about astronomical objects. Combining this information yields more precise navigation results [3,4,5,6]. However, in real applications, CNS cannot work continuously throughout the day and is affected by bad weather, so SINS/CNS integrated navigation cannot provide continuous navigation over long periods [7].
Recently, landmark navigation has received wide attention globally as a new navigation technology. Landmark navigation is technically different from CNS: it is based on the visibility of landmarks, which overcomes the drawbacks of star sensors, and it navigates the aircraft using the geographical features of the planet's surface [8,9]. To compensate for this application limitation, this paper combines SINS with landmark navigation to construct a high-precision navigation scheme that meets the navigation requirements.
Currently, some studies have been conducted on the theory and application of landmark navigation. For single-landmark navigation, Costello and Castro [10] extracted landmark characteristics from 2D sensor images and combined them with a landmark database to estimate the latitude, longitude and attitude of vehicles on the ground or targets in space. Cesetti et al. [11] proposed a vision-based method for UAV (unmanned aerial vehicle) landing. The basic principle of the guidance method is to define the target area from satellite or high-resolution aerial images, and the key of the navigation strategy is to locate suitable natural landmarks with a feature matching algorithm.
For practical applications of landmark navigation, He et al. [12] proposed a method to obtain the information of a space target emitter from landmarks; the purpose is to obtain the azimuth angle of the target and then calculate the attitude by a planarization algorithm and a linearization technique. Khuller et al. [13] applied landmark navigation to robotics: robots determine their position from landmarks based on the direction information provided by visual inspection. They proposed the minimum set of landmarks required for robot position determination, whose cardinality is called the "metric dimension".
However, landmark navigation is also limited by weather and environmental conditions, so it should be integrated with other navigation methods to achieve more precise navigation. Regarding the integration of landmark navigation with other autonomous navigation methods, Kim and Hwang [14] proposed the integration of SINS and landmark navigation. Their navigation mode is roughly the same as the one in this paper and is designed to solve the problem of visual navigation under poor visibility. When the environment makes it hard to observe landmarks, landmark visual navigation is interrupted, while the inertial devices continue working and providing position, velocity and attitude information. This method makes up for the shortfalls of single landmark navigation and realizes continuous navigation.
When landmark navigation is combined with non-autonomous navigation methods, Babel [15] proposed an algorithm to determine the shortest route of a UAV, in which landmark navigation compensates for interruptions of GPS. Meanwhile, the method regularly updates the landmark database to realize long-term landmark navigation.
The above schemes all use landmarks for position determination, and the precision of their navigation results is thereby greatly improved. In the practical scenario considered here, both the aircraft and the astronomical object rotate relative to the inertial system, so it is hard to identify the true attitude of the aircraft with respect to the coordinate system fixed to the astronomical object. To achieve the best navigation result, the landmark information should also be fully used for attitude determination. Therefore, the key of landmark navigation is to determine the attitude from landmark characteristics.
For autonomous navigation around the Earth, this paper proposes a high-precision SINS/Landmark integrated navigation method based on landmark attitude determination to meet the demand for all-day and all-weather autonomous navigation. During landmark navigation, the landmark camera photographs landmarks at its sampling frequency to capture landmark characteristics. The characteristics are then recognized and matched against the landmark database to obtain the position and attitude of the aircraft. The position and attitude information from landmark navigation is combined with the measurements of the inertial devices to obtain more precise navigation results. This paper also introduces the principle of landmark navigation and the attitude determination process, constructs the equations of the SINS/Landmark integrated navigation system, compares the navigation results with those obtained without attitude determination in simulations, and verifies the effectiveness of landmark attitude determination for integrated navigation.
The main contributions of this paper are as follows. (1) Combining landmark navigation with SINS: the proposed SINS/Landmark integrated navigation mode can continue navigating and realize all-day and all-weather navigation when the SINS/CNS mode does not work. (2) The method uses the landmark features obtained by the landmark camera to accomplish attitude determination and combines the attitude information with the SINS measurements to realize high-precision integrated navigation. (3) The method applies a Kalman filter to the linear navigation system and illustrates the influence of attitude determination on navigation accuracy by comparison in the simulations.
The remainder of the paper is structured as follows. Section 2 introduces the principle of landmark navigation, including the acquisition and matching of landmark information. Section 3 introduces the process of determining the attitude by using the landmarks. In this process, the computability of the rotation matrix is verified, and the feasibility of the landmark attitude determination is theoretically explained. Section 4 introduces the SINS/Landmark integrated navigation and gives the system equations. In Section 5, the effectiveness of the SINS/Landmark integrated navigation with landmark attitude determination is verified by experimental simulation. Section 6 is the conclusion.

2. Principles of Landmark Navigation

2.1. Acquisition of Landmark Information

The landmark information can be obtained from the landmark camera, and the operation process is mainly divided into two parts: the shooting process of the landmark camera and the matching process of the landmark feature database.
The imaging principle of the camera is pinhole imaging, and the landmark position vector obtained from the landmark information is actually the mapping point of the camera imaging focus, similar to a star sensor. However, in practical applications, the feature points of the mapped area may not be obvious or easy to extract. Therefore, the camera chooses landmarks with clear features as the selection set, which replaces the virtual landmark of the mapping point. As for the equations of position and attitude determination, there is no difference between the two selections.
As shown in Figure 1, when the aircraft (navigation target) orbits the Earth, the landmark camera first captures landmark images, which are transmitted to the information processing module. The features are then compared and analyzed, including the position information of the landmarks on the Earth. To keep the navigation going, the duration of the matching process should be less than the shooting interval of the camera.
Figure 1 shows the processes of shooting, landmark capture and landmark information acquisition, respectively.
(a) Firstly, the landmarks are shot by the landmark camera. Note that only one photo can be taken at a time, depending on the shooting frequency of the camera.
(b) Secondly, the landmark camera obtains the images of the landmarks. Multiple landmarks and landmark feature points are obtained from the photos, and these feature points participate in the next matching step.
(c) Finally, the landmark feature points in the photo are extracted and matched with the feature point database. When the matching is completed, the aircraft's space model is established for navigation, as shown in the third picture of Figure 1. All vectors in the figure are referenced to the Earth-centered-Earth-fixed system, denoted by the superscript $e$: $\rho_1^e$ and $\rho_2^e$ are the position coordinates of the landmarks; $\rho_0^e$ is the vector from Landmark 1 to Landmark 2; $r^e$ is the position coordinate of the aircraft; and $r_1^e$ and $r_2^e$ are the vectors from Landmark 1 and Landmark 2 to the aircraft, respectively, i.e., the relative position model of the aircraft and landmarks in space.
When the aircraft is not parallel to the landmark plane, the captured image of the landmark is prone to rotation and deformation, as shown in Figure 2:
When the landmark camera is not parallel to the landmark plane, the segment $ab$ of the landmark is imaged with length $g$ on the imaging plane of the camera. According to the principle that the proportions of similar triangles are constant, the image of the segment $ab$ in the virtual plane should have length $h$; the image is therefore compressed in the O direction, and the compression ratio $\kappa$ is given by Equation (1):
$$\kappa = \frac{g}{h} \quad (1)$$
When the landmark camera is fixed on the aircraft, the transformation matrix between the landmark camera coordinate system and the aircraft coordinate system is a constant matrix. When the landmark camera changes direction with the flight of the aircraft, the transformation matrix will also be changed. In practice, the transformation matrix needs to be selected and calculated according to the situation.

2.2. Matching Process of the Landmark Images

Image rotation and deformation should be considered when matching the landmarks, which takes more time. In practice, this process can be carried out using a sparse representation method or a feature point detection method.
The sparse representation method expresses the whole signal as a linear combination of a few original signals [16,17,18], but it takes a long time and cannot meet the real-time requirement. The feature point detection method is widely used in vision-based celestial landing navigation. SIFT (Scale-Invariant Feature Transform) is one of the popular methods with good application results [19,20]; it identifies and matches landmark images through key point detection, description, matching and elimination of mismatched points. The algorithm has good discriminability, strong scalability and robustness to target rotation and deformation, and it can extract multiple features of the target.
For the matching process of landmarks, the operational process of the SIFT is shown in Figure 3:
Scale space theory is used to search for feature points [21], and the Gaussian pyramid [22] and the Gaussian difference pyramid need to be established for pixel comparison. The gradients of the neighborhood feature points are calculated, and the dominant orientation is taken from the peak of the gradient orientation histogram. A KD tree [23] is used for the matching process, and the threshold should be set according to the circumstances.
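As an illustration only, the following sketch (assuming OpenCV is available and using two hypothetical image files, `landmark_photo.png` and `landmark_template.png`) shows how SIFT key points can be detected and matched with a KD-tree (FLANN) matcher and a ratio-test threshold; it is not the exact pipeline used in this paper.

```python
import cv2

# Load the camera photo and the reference landmark image (hypothetical file names).
photo = cv2.imread("landmark_photo.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("landmark_template.png", cv2.IMREAD_GRAYSCALE)

# Detect SIFT key points and compute descriptors (scale-space / DoG pyramid internally).
sift = cv2.SIFT_create()
kp_photo, des_photo = sift.detectAndCompute(photo, None)
kp_tmpl, des_tmpl = sift.detectAndCompute(template, None)

# FLANN matcher with a KD-tree index, as suggested by [23].
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
matches = flann.knnMatch(des_photo, des_tmpl, k=2)

# Ratio test to eliminate mismatched points; the 0.7 threshold is an assumption.
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
        good.append(pair[0])
print(f"{len(good)} reliable landmark feature matches")
```

The ratio-test threshold is the tunable quantity mentioned above: a lower value keeps fewer but more reliable matches.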

3. Position and Attitude Determination of the Landmark Navigation

3.1. Attitude Determination of the Landmark Navigation

3.1.1. Calculation for Attitude Angle

The coordinate systems used in this paper are as follows: i is the inertial coordinate system; e is the Earth-centered-Earth-fixed system; b is the aircraft body coordinate system; and l i is the launching inertial system.
Assume that the position of the aircraft in the Earth-centered-Earth-fixed system is $r^e = (x^e, y^e, z^e)$ and that the positions of the observed landmarks are $\rho_i^e = (a_i^e, b_i^e, c_i^e)$, $i = 1, \ldots, \tau$, meaning that $\tau$ landmarks can be observed. Let $r_i$ be the distance between the aircraft and the $i$th landmark, and let $(\alpha_i, \beta_i, \eta_i)$ be the angles between the axes of the aircraft body coordinate system and the vector from the aircraft to the landmark. $C_e^b$ is the transformation matrix from the Earth-centered-Earth-fixed system to the aircraft body coordinate system; then:
$$C_e^b \begin{bmatrix} \dfrac{x^e - a_i^e}{r_i} & \dfrac{y^e - b_i^e}{r_i} & \dfrac{z^e - c_i^e}{r_i} \end{bmatrix}^T = \begin{bmatrix} \cos\alpha_i & \cos\beta_i & \cos\eta_i \end{bmatrix}^T \quad (2)$$
Let
$$X_i = \begin{bmatrix} \dfrac{x^e - a_i^e}{r_i} & \dfrac{y^e - b_i^e}{r_i} & \dfrac{z^e - c_i^e}{r_i} \end{bmatrix}^T, \quad Z_i = \begin{bmatrix} \cos\alpha_i & \cos\beta_i & \cos\eta_i \end{bmatrix}^T \quad (3)$$
Taking the measurement error into account, Equation (2) can be written as Equation (4):
$$Z = C_e^b X + \varepsilon \quad (4)$$
where $X = [X_1, \ldots, X_\tau]$, $Z = [Z_1, \ldots, Z_\tau]$, and $\varepsilon = [\varepsilon_1, \ldots, \varepsilon_\tau]$ with $\varepsilon_i \sim N(0, \sigma^2)$. If $\tau \geq 3$, the transformation matrix can be calculated by the least squares (LS) method as Equation (5):
$$C_e^b = Z X^T \left(X X^T\right)^{-1} \quad (5)$$
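As a minimal numerical sketch of Equation (5) (NumPy assumed; the function name and inputs are hypothetical), the unit line-of-sight vectors $X_i$ in the Earth-centered-Earth-fixed frame and the measured direction cosines $Z_i$ in the body frame are stacked as columns and the transformation matrix is solved by least squares:

```python
import numpy as np

def estimate_Ceb(landmarks_ecef, r_ecef, los_body):
    """Least-squares estimate of C_e^b from tau >= 3 landmarks (Equation (5)).

    landmarks_ecef : (tau, 3) landmark positions rho_i^e
    r_ecef         : (3,)     aircraft position r^e
    los_body       : (tau, 3) measured direction cosines Z_i = (cos alpha_i, cos beta_i, cos eta_i)
    """
    diff = r_ecef - landmarks_ecef                                # numerators of Equation (3)
    X = (diff / np.linalg.norm(diff, axis=1, keepdims=True)).T    # 3 x tau unit vectors in ECEF
    Z = los_body.T                                                # 3 x tau unit vectors in the body frame
    return Z @ X.T @ np.linalg.inv(X @ X.T)                       # C_e^b = Z X^T (X X^T)^(-1)
```

When more than three landmarks are visible, the same formula acts as an over-determined least-squares fit, which is what the covariance analysis in Section 3.1.3 exploits.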
The coordinates of the launch point in the geodetic coordinate system are $(B, L, H)$, and the launch azimuth is $A$; then the transformation matrix between the launch inertial system and the geocentric inertial system can be calculated as Equation (6):
$$C_{l_i}^{i} = \begin{bmatrix} \sin L & \cos L & 0 \\ \cos L & \sin L & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos B & \sin B \\ 0 & \sin B & \cos B \end{bmatrix} \begin{bmatrix} \sin A & 0 & \cos A \\ 0 & 1 & 0 \\ \cos A & 0 & \sin A \end{bmatrix} \quad (6)$$
The transformation matrix between the geocentric inertial system and the Earth-centered-Earth-fixed system is
$$C_i^e = \begin{bmatrix} \cos\omega T & \sin\omega T & 0 \\ -\sin\omega T & \cos\omega T & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (7)$$
where $\omega$ is the angular rate of the Earth's rotation and $T$ is the elapsed rotation time of the Earth. The transformation matrix between the launch inertial system and the aircraft body coordinate system can then be obtained as Equation (8):
$$C_{l_i}^{b} = C_e^b\, C_i^e\, C_{l_i}^{i} \quad (8)$$
Then, the pitch angle, yaw angle and roll angle of the aircraft can be obtained, separately.
$$\varphi = \arctan\frac{C_b^{l_i}(2,1)}{C_b^{l_i}(1,1)}, \qquad \phi = \arcsin C_b^{l_i}(3,1), \qquad \gamma = \arctan\frac{C_b^{l_i}(3,2)}{C_b^{l_i}(3,3)} \quad (9)$$
$C_b^{l_i}$ is the transpose of $C_{l_i}^b$, representing the transformation matrix from the aircraft body coordinate system to the launch inertial system.
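The sketch below (NumPy assumed, hypothetical function and argument names) illustrates Equations (8) and (9): compose $C_{l_i}^b = C_e^b C_i^e C_{l_i}^i$, transpose it, and read the three angles from the matrix entries, with 0-based indices replacing the 1-based indices of Equation (9). `np.arctan2` is used instead of a plain arctangent to keep the correct quadrant.

```python
import numpy as np

def attitude_from_landmarks(C_eb, C_ie, C_lii):
    """Attitude angles from the landmark-derived C_e^b (Equations (8) and (9))."""
    C_lib = C_eb @ C_ie @ C_lii        # launch-inertial -> body, Equation (8)
    C_bli = C_lib.T                    # body -> launch-inertial (orthogonal, so transpose)
    # Equation (9) with 0-based indices: (2,1) -> [1,0], (1,1) -> [0,0], and so on.
    pitch = np.arctan2(C_bli[1, 0], C_bli[0, 0])
    yaw = np.arcsin(C_bli[2, 0])
    roll = np.arctan2(C_bli[2, 1], C_bli[2, 2])
    return pitch, yaw, roll
```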

3.1.2. Computability of the Transformation Matrix

It is necessary to prove that the transformation matrix is computable, and its orthogonality should be proved first. In theory, the transformation matrix is obviously orthogonal, since it represents the rotation relationship between two coordinate systems. However, in this paper the transformation matrix $C_e^b$ is obtained from the distances $r_i$ between the aircraft and the landmark points and the angles $(\alpha_i, \beta_i, \eta_i)$ between the axes of the aircraft body coordinate system and the vectors from the aircraft to the landmarks, so it is important to prove its orthogonality.
The computability of the transformation matrix between the launch inertial system and the aircraft body coordinate system is equivalent to that of the transformation matrix $C_e^b$ from the Earth-centered-Earth-fixed system to the aircraft body coordinate system; that is, Equation (5) can be calculated.
According to Equation (5), the matrix X must have full row rank; that is, the vectors from the aircraft to the landmark points must span the entire space.
Proof of the orthogonality of the matrix $C_e^b$:
(1) Invariance of the vector norm: $r_i$ is the norm of $[x^e - a_i^e, y^e - b_i^e, z^e - c_i^e]^T$, so the column vectors of the matrix X all have norm 1; since $(\alpha_i, \beta_i, \eta_i)$ are the angles between the three axes of the aircraft body coordinate system and the unit vector from the aircraft to the landmark, the column vectors of Z also have norm 1.
(2) Invariance of the angles: For any two landmarks (say the $i$th and the $j$th), let $r_i^e$ and $r_j^e$ be the (unit) vectors from the aircraft to the two landmarks in the Earth-centered-Earth-fixed system, and let $r_i^b$ and $r_j^b$ be the corresponding vectors in the aircraft body coordinate system, satisfying
$$r_i^b = C_e^b r_i^e, \qquad r_j^b = C_e^b r_j^e \quad (10)$$
The angle between two vectors is the same in different coordinate systems, so
$$r_i^e \cdot r_j^e = \left\|r_i^e\right\|\left\|r_j^e\right\|\cos\langle r_i^e, r_j^e\rangle = \cos\langle r_i^e, r_j^e\rangle, \qquad r_i^b \cdot r_j^b = \left\|r_i^b\right\|\left\|r_j^b\right\|\cos\langle r_i^b, r_j^b\rangle = \cos\langle r_i^b, r_j^b\rangle \quad (11)$$
Then,
$$r_i^e \cdot r_j^e = r_i^b \cdot r_j^b \quad (12)$$
For convenience, taking three linearly independent column vectors in the two coordinate systems, Equation (4) can be rewritten as Equation (13):
$$C_e^b \tilde{X} = \tilde{Z} \quad (13)$$
where $\tilde{X} = [r_{k_1}^e, r_{k_2}^e, r_{k_3}^e]$, $\tilde{Z} = [r_{k_1}^b, r_{k_2}^b, r_{k_3}^b]$, and $k_1, k_2, k_3$ are three randomly selected landmarks. According to Equation (12),
$$\tilde{X}^T\tilde{X} = \left[r_{k_i}^e \cdot r_{k_j}^e\right]_{i,j=1,2,3} = \left[r_{k_i}^b \cdot r_{k_j}^b\right]_{i,j=1,2,3} = \tilde{Z}^T\tilde{Z} \quad (14)$$
Therefore,
$$\tilde{Z}\tilde{X}^{-1} = \left(\tilde{Z}^{-1}\right)^T \tilde{X}^T \quad (15)$$
Calculating the inverse matrix of $C_e^b$, Equation (16) can be obtained after transposition:
$$\left[\left(C_e^b\right)^{-1}\right]^T = \left(\tilde{Z}^{-1}\right)^T \tilde{X}^T \quad (16)$$
Considering Equation (13),
$$C_e^b = \left(\tilde{Z}^{-1}\right)^T \tilde{X}^T \quad (17)$$
According to Equations (16) and (17),
$$\left(C_e^b\right)^T = \left(C_e^b\right)^{-1} \quad (18)$$
Therefore, the matrix $C_e^b$ is orthogonal and hence computable.
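A quick numerical check of this property, under the same assumptions as the `estimate_Ceb` sketch above, is to generate noise-free synthetic measurements from a known rotation and verify that the recovered matrix is numerically orthogonal; all coordinates below are made up for illustration.

```python
import numpy as np

# Hypothetical test data: a known rotation of 30 degrees about the z-axis.
c, s = np.cos(np.radians(30.0)), np.sin(np.radians(30.0))
C_true = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

landmarks = np.array([[1.0e6, 2.0e6, 3.0e6],
                      [-2.0e6, 1.0e6, 0.5e6],
                      [0.3e6, -1.5e6, 2.5e6],
                      [2.2e6, 0.4e6, -1.0e6]])      # rho_i^e (made-up ECEF coordinates, m)
aircraft = np.array([0.5e6, 0.5e6, 7.0e6])           # r^e (made up)

diff = aircraft - landmarks
X = (diff / np.linalg.norm(diff, axis=1, keepdims=True)).T
Z = C_true @ X                                        # noise-free direction cosines in the body frame
C_est = Z @ X.T @ np.linalg.inv(X @ X.T)              # Equation (5)

# Orthogonality (Equation (18)): C^T C should be the identity up to rounding error.
print(np.allclose(C_est.T @ C_est, np.eye(3), atol=1e-10))   # True
```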

3.1.3. The Relationship between the Number, Relevance and Accuracy of Landmarks

The selection of landmarks affects the accuracy of navigation. The influencing factors mainly include the number of landmarks and the correlation of the landmark locations (i.e., whether they are nearly collinear). These two aspects are discussed below.
From Equation (4), the estimation error covariance of the transformation matrix $C_e^b$ is as in Equation (19):
$$\mathrm{Cov}\left(C_e^b\right) = \left(X X^T\right)^{-1} \sigma^2 \quad (19)$$
The relationship between the number of landmarks and the accuracy:
According to Gorbenko and Popov [24], for a set of landmarks there is usually a minimum subset that still satisfies completeness. In the position determination process, the purpose is to obtain the coordinates of the aircraft. In general, a three-dimensional coordinate has three unknown elements, which means the number of observable landmarks should be three or more. The detailed analysis is given in Section 3.2.
However, for the process of attitude determination, since the solution of the transformation matrix is calculated by the LS method using multiple samples, the relationship between the selection of the landmarks and the accuracy of the attitude needs to be determined according to Equation (19).
The effect of the number of selected landmarks on the attitude determination is analyzed below.
Assuming that a new landmark is added on the basis of several original landmarks, Equation (19) can be rewritten as
$$\mathrm{Cov}\left(C_e^b\right) = \left(\begin{bmatrix} X & X_{\tau+1} \end{bmatrix}\begin{bmatrix} X & X_{\tau+1} \end{bmatrix}^T\right)^{-1}\sigma^2 = \left(X X^T + X_{\tau+1} X_{\tau+1}^T\right)^{-1}\sigma^2 \quad (20)$$
Since the eigenvalues of the matrix $\left(X X^T + X_{\tau+1} X_{\tau+1}^T\right)^{-1}$ are smaller than those of the matrix $\left(X X^T\right)^{-1}$, a larger number of landmarks means higher accuracy, within the limit of the computational load. Since the landmark camera can only take one picture at a time according to its shooting frequency, and the number of landmarks in a picture is limited, it can be concluded that the more landmarks there are, the higher the attainable attitude accuracy.
The relationship between the landmark correlation and the accuracy:
If there is a strong correlation between the landmarks, for example when the selected landmarks lie near a single line, the minimum eigenvalue of the matrix $X X^T$ tends to zero. Perform an eigenvalue decomposition (ED) of the matrix:
$$\left(X X^T\right)^{-1} = P \Lambda^{-1} P^{-1} \quad (21)$$
Then, the mean square error (MSE) of the transformation matrix C e b is as Equation (22):
$$\mathrm{MSE}\left(C_e^b\right) = \sigma^2\,\mathrm{tr}\left[\left(X X^T\right)^{-1}\right] = \sigma^2\,\mathrm{tr}\left(P\Lambda^{-1}P^{-1}\right) = \sigma^2\,\mathrm{tr}\left(\Lambda^{-1}P^{-1}P\right) = \sigma^2\,\mathrm{tr}\left(\Lambda^{-1}\right) = \sigma^2\sum_{i=1}^{3}\lambda_i^{-1} \quad (22)$$
Since the landmarks are considered in three-dimensional space, the eigenvalue matrix is $\Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \lambda_3)$ and $P$ is the eigenvector matrix.
When $\lambda_3$ approaches 0, the inversion makes $\mathrm{MSE}(C_e^b)$ tend to infinity. In this case the calculation error is particularly large, which degrades the calculation accuracy.
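The following hypothetical Monte Carlo sketch (NumPy assumed, made-up geometry, noise added directly to the direction cosines) illustrates both effects described by Equations (19)-(22): the estimation error of $C_e^b$ shrinks as landmarks are added with good geometry and blows up when the landmarks are nearly collinear. It is only an illustration, not the simulation used later in this section.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_attitude_error(landmarks, aircraft, sigma=1e-4, runs=100):
    """Mean Frobenius error of the LS estimate of C_e^b over Monte Carlo runs (C_true = I)."""
    diff = aircraft - landmarks
    X = (diff / np.linalg.norm(diff, axis=1, keepdims=True)).T
    errs = []
    for _ in range(runs):
        Z = X + sigma * rng.standard_normal(X.shape)     # noisy body-frame direction cosines
        C_est = Z @ X.T @ np.linalg.inv(X @ X.T)         # Equation (5)
        errs.append(np.linalg.norm(C_est - np.eye(3)))
    return np.mean(errs)

aircraft = np.array([0.0, 0.0, 7.0e6])
spread = np.array([[1, 2, 0], [-2, 1, 0], [0, -2, 1], [2, 0, -1]], float) * 1e6      # well-spread landmarks
near_line = np.array([[1, 1, 0], [2, 2.01, 0], [3, 3.02, 0], [4, 4.03, 0]], float) * 1e6  # nearly collinear

print(mc_attitude_error(spread, aircraft))      # small error
print(mc_attitude_error(near_line, aircraft))   # much larger error (ill-conditioned X X^T)
```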
The simulation data are used to give an illustration. Assuming that the measurement angle error follows a normal distribution with zero mean and a standard deviation of 30″, the attitude errors can be calculated for different numbers of landmarks and different correlations. In each case, 100 Monte Carlo runs were performed to obtain the mean error. The accuracy results are shown in Table 1.
The following can be seen in Table 1:
(a) When there is a strong correlation between the landmarks, the angle error becomes larger, especially the pitch angle error.
(b) Without considering the correlation of the landmarks, more landmarks make the angle error smaller; that is, the more accurate the transformation matrix, the higher the accuracy of the attitude determination. This shows that, in the actual navigation process, more landmarks lead to higher accuracy.
(c) As the number of landmarks gradually increases, whether there is a correlation between the landmarks has less and less influence on the accuracy of the transformation matrix. This shows that, when the number increases, the collinearity problem inevitably appears; how to balance the number and the locations of the landmarks is the key to improving the navigation accuracy.

3.2. Position Determination of Landmark Navigation

As shown in Section 3.1, the position of the $i$th navigation landmark in the Earth-centered-Earth-fixed system is $\rho_i^e = (a_i^e, b_i^e, c_i^e)$, $i = 1, \ldots, \tau$, the position vector of the aircraft in the Earth-centered-Earth-fixed system is $r^e = (x^e, y^e, z^e)$, and the relative position of the aircraft with respect to the landmark is $r_i^e$. Let the transformation matrix from the aircraft body coordinate system to the Earth-centered-Earth-fixed system be $C_b^e = (C_e^b)^T$, where $C_e^b$ is obtained as in Section 3.1.1. Then,
$$r_i^e = \begin{bmatrix} x^e - a_i^e & y^e - b_i^e & z^e - c_i^e \end{bmatrix}^T \quad (23)$$
Suppose that $\rho_i^b$ is the position vector of the $i$th landmark in the aircraft body coordinate system, which can be obtained from the landmark camera. Then Equation (24) is obtained:
$$r_i^e = C_b^e \rho_i^b, \quad \text{i.e.,} \quad \begin{bmatrix} x^e - a_i^e \\ y^e - b_i^e \\ z^e - c_i^e \end{bmatrix} = C_b^e \rho_i^b \quad (24)$$
Treating $x^e, y^e, z^e$ as the three unknowns, the coordinates of the aircraft in the Earth-centered-Earth-fixed system can be obtained when three or more landmarks are observed, that is, $\tau \geq 3$.
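As a sketch of this step (NumPy assumed; `C_be` and the camera-measured landmark vectors are hypothetical inputs), each landmark gives one instance of Equation (24); stacking the equations and solving for $(x^e, y^e, z^e)$ in the least-squares sense also handles the over-determined case $\tau > 3$:

```python
import numpy as np

def position_from_landmarks(C_be, landmarks_ecef, landmarks_body):
    """Aircraft ECEF position from Equation (24), for tau >= 3 landmarks.

    C_be           : (3, 3) body -> ECEF transformation matrix (from Section 3.1.1)
    landmarks_ecef : (tau, 3) landmark positions rho_i^e
    landmarks_body : (tau, 3) camera-measured landmark vectors rho_i^b in the body frame
    """
    tau = landmarks_ecef.shape[0]
    # Each landmark gives  r^e = rho_i^e + C_b^e rho_i^b ; stack all tau equations.
    A = np.tile(np.eye(3), (tau, 1))                              # (3*tau, 3)
    b = (landmarks_ecef + landmarks_body @ C_be.T).reshape(-1)    # (3*tau,)
    r_e, *_ = np.linalg.lstsq(A, b, rcond=None)
    return r_e
```

With identical identity blocks, this least-squares solution reduces to averaging the per-landmark position estimates; a weighted version could account for differing landmark measurement quality.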

4. SINS/Landmark Integrated Navigation Model

The operation process of SINS/Landmark integrated navigation is based on SINS, and the landmark information is used to correct the error of the SINS. The measurement equation is established according to the position and attitude information obtained by SINS and landmark navigation. The specific navigation process is shown in Figure 4.

4.1. State Equation of Integrated Navigation

Combining the SINS mathematical platform angle error, velocity error, position error, gyroscope error and accelerometer error models, the state equation of the integrated navigation system can be obtained as Equation (25):
$$\dot{X}(t) = F(t) X(t) + G(t) W(t) \quad (25)$$
The system state vector is taken to be 15-dimensional and recorded as:
$$X(t) = \begin{bmatrix} \phi_x, \phi_y, \phi_z, \delta v_x, \delta v_y, \delta v_z, \delta x, \delta y, \delta z, \varepsilon_x, \varepsilon_y, \varepsilon_z, \nabla_x, \nabla_y, \nabla_z \end{bmatrix}^T \quad (26)$$
where $[\phi_x, \phi_y, \phi_z]^T$ denotes the mathematical platform angle errors on the three axes; $[\delta v_x, \delta v_y, \delta v_z]^T$ denotes the velocity errors; $[\delta x, \delta y, \delta z]^T$ denotes the position errors; $[\varepsilon_x, \varepsilon_y, \varepsilon_z]^T$ denotes the random constant drifts of the gyroscope; $[\nabla_x, \nabla_y, \nabla_z]^T$ denotes the random constant biases of the accelerometer; $F(t)$ denotes the system matrix; and $G(t)$ denotes the process noise matrix.
$$F(t) = \begin{bmatrix} 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & C_b^{l_i} & 0_{3\times3} \\ F_b & 0_{3\times3} & F_a & 0_{3\times3} & C_b^{l_i} \\ 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \end{bmatrix}_{15\times15} \quad (27)$$
$$G(t) = \begin{bmatrix} C_b^{l_i} & 0_{3\times3} \\ 0_{3\times3} & C_b^{l_i} \\ 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} \end{bmatrix}_{15\times6} \quad (28)$$
where C b l i is the transformation matrix from the aircraft body coordinate system to the launch inertial system, and
$$C_b^{l_i} = \begin{bmatrix} \cos\theta\cos\varphi & \cos\theta\sin\varphi\sin\gamma - \sin\theta\cos\gamma & \cos\theta\sin\varphi\cos\gamma + \sin\theta\sin\gamma \\ \sin\theta\cos\varphi & \sin\theta\sin\varphi\sin\gamma + \cos\theta\cos\gamma & \sin\theta\sin\varphi\cos\gamma - \cos\theta\sin\gamma \\ -\sin\varphi & \sin\gamma\cos\varphi & \cos\gamma\cos\varphi \end{bmatrix} \quad (29)$$
where φ , θ , γ are the yaw angle, pitch angle and roll angle of the aircraft measured by the gyroscope, respectively.
$$F_a = \begin{bmatrix} f_{14} & f_{15} & f_{16} \\ f_{24} & f_{25} & f_{26} \\ f_{34} & f_{35} & f_{36} \end{bmatrix}$$
The parameters $f_{14}$–$f_{36}$ are the partial derivatives of the gravitational acceleration with respect to the position coordinates, and they vary with the position of the aircraft.
$$F_b = \begin{bmatrix} 0 & -a_z & a_y \\ a_z & 0 & -a_x \\ -a_y & a_x & 0 \end{bmatrix}$$
$a_x, a_y, a_z$ are the components of the apparent acceleration along the three axes measured by the accelerometer. The process noise of the navigation system is as Equation (30):
$$W(t) = \begin{bmatrix} w_{\varepsilon x} & w_{\varepsilon y} & w_{\varepsilon z} & w_{\nabla x} & w_{\nabla y} & w_{\nabla z} \end{bmatrix}^T \quad (30)$$
where $[w_{\varepsilon x}, w_{\varepsilon y}, w_{\varepsilon z}]^T$ is the random noise of the gyroscope and $[w_{\nabla x}, w_{\nabla y}, w_{\nabla z}]^T$ is the random noise of the accelerometer. The noise covariance matrix of $W(t)$ is
$$Q(t) = \mathrm{diag}\left(\sigma_{\varepsilon x}^2,\; \sigma_{\varepsilon y}^2,\; \sigma_{\varepsilon z}^2,\; \sigma_{\nabla x}^2,\; \sigma_{\nabla y}^2,\; \sigma_{\nabla z}^2\right) \quad (31)$$
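The block structure of Equations (27), (28) and (31) can be assembled directly. The sketch below (NumPy assumed; `C_bli`, `F_a`, `F_b` and the per-axis noise standard deviation arrays are hypothetical inputs) is one way to build the continuous-time model matrices:

```python
import numpy as np

def build_state_model(C_bli, F_a, F_b, sigma_gyro, sigma_accel):
    """Continuous-time F (15x15), G (15x6) and Q (6x6) of Equations (27), (28) and (31)."""
    Z3 = np.zeros((3, 3))
    I3 = np.eye(3)
    # State order: [phi, delta_v, delta_p, eps (gyro drift), nabla (accel bias)].
    F = np.block([
        [Z3,  Z3, Z3,  C_bli, Z3   ],   # attitude error driven by gyro drift
        [F_b, Z3, F_a, Z3,    C_bli],   # velocity error driven by attitude error, gravity gradient, accel bias
        [Z3,  I3, Z3,  Z3,    Z3   ],   # position error is the integral of velocity error
        [Z3,  Z3, Z3,  Z3,    Z3   ],   # gyro drift modeled as a random constant
        [Z3,  Z3, Z3,  Z3,    Z3   ],   # accel bias modeled as a random constant
    ])
    G = np.block([
        [C_bli, Z3   ],
        [Z3,    C_bli],
        [Z3,    Z3   ],
        [Z3,    Z3   ],
        [Z3,    Z3   ],
    ])
    Q = np.diag(np.concatenate([sigma_gyro**2, sigma_accel**2]))   # Equation (31)
    return F, G, Q
```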

4.2. Measurement Equation of Integrated Navigation

According to the landmark navigation process, the outputs are the attitude and position of the aircraft; subtracting the attitude and position measured by the SINS yields the measurement equation. Therefore, the observations of the measurement equation are the transformed platform angle errors (the navigation frame misalignment angles) $\phi_x, \phi_y, \phi_z$ and the position errors $\delta x, \delta y, \delta z$.
Solution process of the mathematical platform misalignment angle ϕ x , ϕ y , ϕ z :
In the SINS/Landmark integrated navigation process, SINS obtains the pitch angle $\theta_0$, yaw angle $\varphi_0$ and roll angle $\gamma_0$ of the aircraft through the strapdown solution, and the landmark camera obtains the pitch angle $\theta$, yaw angle $\varphi$ and roll angle $\gamma$; subtracting the two sets of angles gives the three-axis attitude error as Equation (32):
$$\Delta a = \begin{bmatrix} a_\theta \\ a_\varphi \\ a_\gamma \end{bmatrix} = \begin{bmatrix} \theta - \theta_0 \\ \varphi - \varphi_0 \\ \gamma - \gamma_0 \end{bmatrix} \quad (32)$$
Since the attitude error equation of SINS is written in terms of the platform angle error (the navigation frame misalignment angle), the attitude error angles of Equation (32) must be converted into platform angle errors to establish the measurement equation. The conversion relationship is as Equation (33):
$$\Delta a' = M_1 \Delta a \quad (33)$$
where $M_1$ is the attitude angle error transformation matrix, and $\Delta a' = [\phi_x, \phi_y, \phi_z]^T$ is the platform angle error.
Solution process of the position error δ x , δ y , δ z :
The position coordinate $r^e = [x^e, y^e, z^e]^T$ of the aircraft in the Earth-centered-Earth-fixed system can be obtained by the position determination process of landmark navigation described in Section 3.2. The coordinate of the aircraft in the geocentric inertial system, $r^i = [x^i, y^i, z^i]^T$, can be calculated from the inertial devices. Then,
$$\tilde{r}^i = C_e^i r^e \quad (34)$$
The position error under the inertial system is as Equation (35):
$$\delta r^i = \begin{bmatrix} \delta x^i \\ \delta y^i \\ \delta z^i \end{bmatrix} = r^i - \tilde{r}^i \quad (35)$$
Converting it to the mathematical platform coordinate system, the conversion relationship is as Equation (36):
$$\delta r = M_2\, \delta r^i \quad (36)$$
where $M_2$ is the position error conversion matrix, and $\delta r = [\delta x, \delta y, \delta z]^T$ is the position error.
Therefore, the measurement equation is as Equation (37):
$$Z(t) = \begin{bmatrix} \phi_x & \phi_y & \phi_z & \delta x & \delta y & \delta z \end{bmatrix}^T = H X(t) + V(t) \quad (37)$$
where $H = \begin{bmatrix} I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \end{bmatrix}$ and $V = [\delta\Delta_x, \delta\Delta_y, \delta\Delta_z, \delta\xi_x, \delta\xi_y, \delta\xi_z]^T$. Here $[\delta\Delta_x, \delta\Delta_y, \delta\Delta_z]^T$ denotes the difference between the attitude measurement noise of the landmark camera and the constant drift error of the gyroscope, and $[\delta\xi_x, \delta\xi_y, \delta\xi_z]^T$ is the difference between the position measurement noise of the landmark camera and the constant error of the accelerometer.
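A small sketch of this measurement model (NumPy assumed; the angle and position inputs and the conversion matrices `M1` and `M2` are hypothetical) builds the 6×15 matrix $H$ and forms the measurement vector from the landmark-derived and SINS-derived quantities:

```python
import numpy as np

def build_H():
    """6x15 measurement matrix of Equation (37): selects platform angle and position errors."""
    H = np.zeros((6, 15))
    H[0:3, 0:3] = np.eye(3)    # phi_x, phi_y, phi_z
    H[3:6, 6:9] = np.eye(3)    # delta_x, delta_y, delta_z
    return H

def form_measurement(att_landmark, att_sins, M1, r_e_landmark, r_i_sins, C_ei, M2):
    """Measurement Z = [platform angle error; position error], Equations (32)-(36)."""
    delta_a = att_landmark - att_sins          # (theta, phi, gamma) differences, Equation (32)
    ang_err = M1 @ delta_a                     # platform angle error, Equation (33)
    r_i_landmark = C_ei @ r_e_landmark         # landmark position mapped to the inertial frame, Eq. (34)
    pos_err = M2 @ (r_i_sins - r_i_landmark)   # position error in the platform frame, Eqs. (35)-(36)
    return np.concatenate([ang_err, pos_err])
```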

4.3. Integrated Navigation Filtering Algorithm

Consider the integrated navigation system given by Equations (25) and (37); discretize it and estimate the state with a Kalman filter. The system equations can be transformed as Equation (38):
$$X_{k+1} = \Phi_k X_k + \Gamma_k W_k, \qquad Z_k = H X_k + V_k \quad (38)$$
Assuming that T is the sampling interval, then
$$\Phi_k = I + F_{k-1} T + \frac{1}{2!} F_{k-1}^2 T^2, \qquad \Gamma_{k-1} = T\left(I + \frac{1}{2!} F_{k-1} T + \frac{1}{3!} F_{k-1}^2 T^2\right) G_{k-1} \quad (39)$$
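These truncated series are easy to evaluate directly; the sketch below (NumPy assumed) discretizes the continuous-time matrices produced by the earlier `build_state_model` sketch with the sampling interval T:

```python
import numpy as np

def discretize(F, G, T):
    """Second-order truncated discretization of Equation (39)."""
    I = np.eye(F.shape[0])
    F2 = F @ F
    Phi = I + F * T + F2 * (T**2) / 2.0            # 1/2! = 0.5
    Gamma = T * (I + F * T / 2.0 + F2 * (T**2) / 6.0) @ G   # 1/3! = 1/6
    return Phi, Gamma
```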
The Kalman filter algorithm is expressed as follows:
The one-step prediction equation:
$$\hat{X}_{k+1|k} = \Phi_k \hat{X}_k \quad (40)$$
The state estimation:
$$\hat{X}_{k+1} = \hat{X}_{k+1|k} + K_{k+1}\left(Z_{k+1} - H\hat{X}_{k+1|k}\right) \quad (41)$$
The gain of the filter:
$$K_{k+1} = P_{k+1|k} H^T \left(H P_{k+1|k} H^T + R_{k+1}\right)^{-1} \quad (42)$$
The MSE of the one-step prediction:
$$P_{k+1|k} = \Phi_k P_k \Phi_k^T + \Gamma_k Q_{k+1} \Gamma_k^T \quad (43)$$
The MSE of the estimation:
$$P_{k+1} = \left(I - K_{k+1} H\right) P_{k+1|k} \quad (44)$$
The above are the basic equations of the discrete Kalman filter. Given the initial values $\hat{X}_0$ and $P_0$, and combining the measurement $Z_k$ at each moment $k$, the state estimate $\hat{X}_k$ at moment $k$ can be obtained through iteration.
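Putting the pieces together, the following hypothetical loop (NumPy assumed, reusing the `build_H` and `discretize` sketches above) runs the discrete filter of Equations (38)-(44) over a sequence of measurements:

```python
import numpy as np

def kalman_filter(x0, P0, Phi, Gamma, H, Q, R, measurements):
    """Discrete Kalman filter iteration for the SINS/Landmark error-state model (sketch)."""
    x, P = x0.copy(), P0.copy()
    estimates = []
    for z in measurements:
        # One-step prediction (Equations (40) and (43)).
        x_pred = Phi @ x
        P_pred = Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T
        # Gain, state update and covariance update (Equations (42), (41), (44)).
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (z - H @ x_pred)
        P = (np.eye(len(x)) - K @ H) @ P_pred
        estimates.append(x.copy())
    return np.array(estimates)
```

In the integrated scheme, the estimated error state would then be fed back to correct the SINS output at each combination interval, as described in the simulation of Section 5.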

5. Simulation and Analysis

The Shandong Peninsula, the Liaodong Peninsula and the Yangtze River Delta were selected as the navigation landmarks. The landmark cameras were used to photograph the landmarks in the flight segment of the aircraft, and then the landmarks were compared to obtain the position and attitude of the aircraft. The matching process is shown in Figure 5, Figure 6 and Figure 7.
The trajectory of the spacecraft was drawn based on the captured information, as shown in Figure 8, Figure 9 and Figure 10.
The spacecraft was analyzed in the simulation, and the error parameters of SINS and landmark navigation were set as follows.
① The initial position errors in the three directions were all 0 m. ② The initial velocity errors were 0 m/s. ③ The initial attitude angle error was 0. ④ The gyro random constant drift was 0.1°/h. ⑤ The gyroscope random noise drift satisfied a normal distribution with a mean of 0 and a standard deviation of 0.5. ⑥ The accelerometer random constant bias was $10^{-4} g$, where g is the Earth gravity constant. ⑦ The accelerometer random noise offset satisfied a normal distribution with a mean of 0 and a standard deviation of $5\times10^{-5} g$. ⑧ The standard deviation of the landmark navigation output position error was 50 m. ⑨ The standard deviation of the landmark navigation output attitude error was 3.
The sampling interval of SINS was 0.01 s, that of the landmark navigation was 0.1 s, the combination interval was 0.1 s, and the simulation time was 1110 s; the Monte Carlo method was used for the simulation. Due to the requirements of the launch inertial system, the initial longitude and latitude were 116.34° and 39.98°, respectively, and the height was 0 m. The platform angle error, gyro drift, position error and velocity error were corrected by feedback during the navigation process.
To verify the importance of landmark attitude determination in the integrated navigation, the two methods (with and without attitude determination) were simulated, and the trajectories of the aircraft were obtained, as shown in Figure 11.
It can be seen in Figure 11 that, during the whole navigation process, the orbit determined with attitude determination is almost the same as the real orbit, while the trajectory without attitude determination is far from the real orbit, which indicates that the attitude determination is critical. Comparisons of the position and velocity determination in the three directions are given below.
In Figure 12, we can see that, in the X direction, without attitude determination the position error gradually becomes larger before 900 s and shows a convergence trend after 900 s, while the velocity error grows with time. With attitude determination, the position and velocity errors remain near 0. In Figure 13, for the Y direction, without attitude determination the position and velocity errors diverge over time; with attitude determination, the position error is very small, almost at the 0 level, and the velocity has a slight error but is still clearly improved. In Figure 14, for the Z direction, without attitude determination the position error diverges with time, and the velocity error diverges before 160 s and then gradually converges after 160 s; with attitude determination, the position and velocity errors behave similarly to the Y direction.
Comparing the three directions, it can be seen that the position errors gradually increase with time when the landmarks only provide positioning information, and large offsets appear in all three directions, so the navigation accuracy gradually decreases. When the landmarks are also used for attitude determination, the navigation result remains stable for a long time. The velocity determination behaves similarly to the position determination.
Figure 12, Figure 13 and Figure 14 give a qualitative comparison of the accuracy with and without attitude determination in the three directions, while Table 2 and Table 3 give the quantitative accuracy analysis.
It can be seen in Table 2 and Table 3 that the results differ greatly depending on whether attitude determination is performed, so the attitude determination has a large impact on the navigation accuracy. When the attitude determination process is included, the position and velocity of the integrated navigation are maintained at high precision, and the navigation result approximates the truth.

6. Conclusions

Faced with the limitation that the traditional SINS/CNS integrated navigation system cannot navigate all day and in all weather, this paper proposes a SINS/Landmark integrated navigation method based on landmark attitude determination. The theoretical analysis of the landmark navigation principle and the simulation of the SINS/Landmark integrated navigation system show that the landmark information should be used to determine the attitude. When the landmark navigation only extracts position information, the accuracy of the integrated navigation fluctuates over a large range. With attitude determination, the accuracy of the integrated navigation reaches a high level and does not decrease with time. This shows that the SINS/Landmark integrated navigation method based on landmark attitude determination is indeed effective and can improve the navigation accuracy to a certain extent.

Author Contributions

S.X. proposed the main idea and finished the draft manuscript; J.W. conceived of the experiments and drew the figures and tables; Z.H. conducted the simulations; and H.Z. and D.W. analyzed the data.

Funding

This research was supported by the National Natural Science Foundation of China (Grant No. 61773021), the Natural Science Foundation for Distinguished Young Scholars of Hunan Province (Grant No. 2019JJ20018), the Natural Science Foundation of Hunan Province (Grant No. 2019JJ50745), and the Civil Aerospace Advance Research Project (Grant No. D020213).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

SINS  strapdown inertial navigation system
CNS   celestial navigation system
UAV   unmanned aerial vehicle
GPS   global positioning system
SIFT  scale-invariant feature transform
KD    k-dimensional
ED    eigenvalue decomposition
MSE   mean square error

References

  1. Ning, X.; Zhang, J.; Gui, M.; Fang, J. A Fast Calibration Method of the Star Sensor Installation Error Based on Observability Analysis for the Tightly Coupled SINS/CNS-Integrated Navigation System. IEEE Sens. J. 2018, 18, 6794–6803. [Google Scholar] [CrossRef]
  2. Ning, X.; Liu, L. A two-mode INS/CNS navigation method for lunar rovers. IEEE Trans. Instrum. Meas. 2014, 63, 2170–2179. [Google Scholar] [CrossRef]
  3. Han, J.; Changhong, W.; Li, B. Novel all-time initial alignment method for INS/CNS integrated navigation system. In Proceedings of the IEEE 13th International Conference on Signal Processing (ICSP), Chengdu, China, 6–10 November 2016; pp. 1766–1770. [Google Scholar]
  4. Lijun, S.; Wanliang, Z.; Yuxiang, C.; Xiaozhen, C. Based on Grid Reference Frame for SINS/CNS Integrated Navigation System in the Polar Regions. Complexity 2019, 2019, 1–8. [Google Scholar] [CrossRef]
  5. Pimenta, F. Astronomy and navigation. In Handbook of Archaeoastronomy and Ethnoastronomy; Ruggles, C.L., Ed.; Springer: New York, NY, USA, 2015; pp. 43–65. [Google Scholar]
  6. Katz-Bassett, E.; Sherry, J.; Huang, T.Y.; Kazandjieva, M.; Partridge, C.; Dogar, F. Helping conference attendees better understand research presentations. Commun. ACM 2016, 59, 32–34. [Google Scholar] [CrossRef]
  7. Wang, Q.; Li, Y.; Ming, D.; Wei, G.; Zhao, Q. Performance enhancement of INS/CNS integration navigation system based on particle swarm optimization back propagation neural network. Ocean Eng. 2015, 108, 33–45. [Google Scholar] [CrossRef]
  8. Delaune, J.; Besnerais, G.L.; Voirin, T.; Farges, J.L.; Bourdarias, C. Visual–inertial navigation for pinpoint planetary landing using scale-based landmark matching. Robot. Autom. Syst. 2016, 78, 63–82. [Google Scholar] [CrossRef]
  9. Gupta, S.; Fouhey, D.; Levine, S.; Malik, J. Unifying map and landmark based representations for visual navigation. arXiv 2017, arXiv:1712.08125. [Google Scholar]
  10. Costello, M.J.; Castro, R. Precision Landmark-Aided Navigation. US Patent 7,191,056, 13 March 2007. [Google Scholar]
  11. Cesetti, A.; Frontoni, E.; Mancini, A.; Zingaretti, P.; Longhi, S. A Vision-Based Guidance System for UAV Navigation and Safe Landing using Natural Landmarks. J. Intell. Robot. Syst. 2010, 57, 233. [Google Scholar] [CrossRef]
  12. He, Y.; Li, S.; Guo, Q. Landmark based position and orientation method with tilt compensation for missile launcher. In Proceedings of the 35th Chinese Control Conference (CCC), Chengdu, China, 27–29 July 2016; pp. 5585–5589. [Google Scholar]
  13. Khuller, S.; Raghavachari, B.; Rosenfeld, A. Landmarks in graphs. Discrete Appl. Math. 1996, 70, 217–229. [Google Scholar] [CrossRef] [Green Version]
  14. Kim, Y.; Hwang, D.H. Vision/INS integrated navigation system for poor vision navigation environments. Sensors 2016, 16, 1672. [Google Scholar] [CrossRef] [PubMed]
  15. Babel, L. Flight path planning for unmanned aerial vehicles with landmark-based visual navigation. Robot. Autom. Syst. 2014, 62, 142–150. [Google Scholar] [CrossRef]
  16. Malioutov, D.; Cetin, M.; Willsky, A.S. A sparse signal reconstruction perspective for source localization with sensor arrays. IEEE Trans. Signal Process. 2005, 53, 3010–3022. [Google Scholar] [CrossRef] [Green Version]
  17. Cai, T.T.; Wang, L. Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise. IEEE Trans. Inf. Theory 2011, 57, 4680–4688. [Google Scholar] [CrossRef]
  18. Carmi, A.; Gurfil, P.; Kanevsky, D. Methods for Sparse Signal Recovery Using Kalman Filtering With Embedded Pseudo-Measurement Norms and Quasi-Norms. IEEE Trans. Signal Process. 2010, 58, 2405–2409. [Google Scholar] [CrossRef]
  19. Cheung, W.; Hamarneh, G. n-SIFT: n-Dimensional Scale Invariant Feature Transform. IEEE Trans. Image Process. 2009, 18, 2012–2021. [Google Scholar] [CrossRef] [PubMed]
  20. Cruz-Mota, J. Scale Invariant Feature Transform on the Sphere: Theory and Applications. Int. J. Comput. Vision 2012, 98, 217–241. [Google Scholar] [CrossRef]
  21. Smirnov, F.; Zamolodchikov, A. On space of integrable quantum field theories. Nucl. Phys. B 2017, 915, 363–383. [Google Scholar] [CrossRef]
  22. Lan, Z.; Lin, M.; Li, X.; Hauptmann, A.G.; Raj, B. Beyond gaussian pyramid: Multi-skip feature stacking for action recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 8–10 June 2015; pp. 204–212. [Google Scholar]
  23. Li, M.; Wang, L.; Hao, Y. Image matching based on SIFT features and kd-tree. In Proceedings of the 2nd International Conference on Computer Engineering and Technology, Chengdu, China, 16–18 April 2010; pp. 218–222. [Google Scholar]
  24. Gorbenko, A.; Popov, V. The problem of selection of a minimal set of visual landmarks. Appl. Math. Sci. 2012, 6, 4729–4732. [Google Scholar]
Figure 1. Acquisition process of landmarks.
Figure 2. The principle of image compression during the landmark matching.
Figure 3. Flow of SIFT.
Figure 4. Process of the SINS/Landmark integrated navigation.
Figure 5. Schematic diagram of the matching process for the Shandong Peninsula.
Figure 6. Schematic diagram of the matching process for the Liaodong Peninsula.
Figure 7. Schematic diagram of the matching process for the Yangtze River Delta.
Figure 8. 3D trajectory of the aircraft on the Earth.
Figure 9. Partial 3D trajectory of the aircraft in the flight area.
Figure 10. Trajectory of the aircraft under latitude and longitude reference.
Figure 11. The influence of the attitude determination on the accuracy of the aircraft orbit.
Figure 12. Precision comparison of the X direction.
Figure 13. Precision comparison of the Y direction.
Figure 14. Precision comparison of the Z direction.
Table 1. The effect of the number and correlation of landmarks on the accuracy of the transformation matrix.

                                          Number of landmarks
                                          3          4          5          6          7
Non-correlation      Pitch angle error    3.4146″    2.0300″    2.3630″    1.6049″    1.8437″
                     Yaw angle error      4.5328″    1.4792″    1.9401″    1.8308″    0.6509″
                     Roll angle error     1.5083″    2.2644″    0.8618″    0.7915″    0.2807″
Strong correlation   Pitch angle error    28.4667″   24.9020″   13.8017″   11.8040″   3.3160″
                     Yaw angle error      7.9635″    2.8863″    4.4863″    4.6482″    4.0131″
                     Roll angle error     3.1885″    5.7244″    3.5935″    2.2859″    2.3123″
Table 2. Position accuracy data statistics (unit: m).

                                                    X          Y          Z
With attitude determination     Average error       25.6573    60.1919    175.0093
                                Maximum error       40.5812    116.1984   240.7109
Without attitude determination  Average error       1004.2     2606.1     7664.0
                                Maximum error       1601.2     6251.0     13615
Table 3. Velocity accuracy data statistics (unit: m/s).

                                                    V_X        V_Y        V_Z
With attitude determination     Average error       0.5100     1.1333     3.4702
                                Maximum error       1.2567     2.0557     6.2684
Without attitude determination  Average error       1.6633     5.5829     12.1824
                                Maximum error       3.1103     7.2776     19.0498
