
IMTC 2007 - IEEE Instrumentation and Measurement Technology Conference
Warsaw, Poland, May 1-3, 2007
Autonomous Dead-Reckoning Mobile Robot Navigation System With Intelligent
Precision Calibration
Md. Suruz Miah¹, Wail Gueaieb¹, Md. Abdur Rahman², Abdulmotaleb El Saddik²
School of Information Technology and Engineering,
University of Ottawa,
Ottawa, Ontario, K1N 6N5, Canada
¹ Email: {smiah069, wgueaieb}@site.uOttawa.ca
² Email: {rahman, abed}@mcrlab.uOttawa.ca
Abstract: An open hardware architecture for goal-oriented indoor mobile robot navigation based on odometry information corrected using a fuzzy logic controller is presented in this paper. The proposed approach exploits the ability of mobile robots to navigate in unstructured, cluttered, and potentially hostile environments using calibrated odometry information, proximity sensor data, and a fuzzy logic engine. A key advantage of this technique is its simplicity over similar navigation systems discussed in the literature. The simplification stems from the fact that only the previously traveled position of the robot needs to be tracked to reach the target position. The superiority of this technique is supported by the experimental results collected for this purpose.
Keywords: Fuzzy logic, mobile robot, dead-reckoning navigation.
I. INTRODUCTION
Navigation is a central component in most mobile robot ap-
plications. Numerous methods have been suggested in this
context over the past several years, such as landmark-based,
vision-based, and behavior-based navigation systems, to name
a few. Adopting a conventional navigation approach for systems incorporating a large number of sensors often results in a complicated and inefficient navigation algorithm. In this paper, we present an alternative dead-reckoning navigation technique based on a fuzzy logic control engine. The model uses a set of infrared proximity sensors to detect objects in the environment that obstruct the motion of the robot along its path. The path must be recomputed when unknown obstacles are detected and significant deviations are estimated during navigation. The fuzzy rules control the orientation of the robot according to the information about the distances from the obstacles around it. The motivation for using fuzzy logic is that distances from the obstacles are usually not precisely known. The proposed method is cost-effective and can be used in semi-structured and unstructured environments. It is cost-effective because only simple, low-cost sensors are required.
The rest of the manuscript is organized as follows. Related work is discussed in Section II. Section III describes the overall system architecture. The proposed approach is detailed in Section IV. The implementation and experimental results with a real mobile robot are presented in Section V. Finally, we conclude the paper with several remarks and future research directions in Section VI.
II. RELATED WORK
Significant research has been conducted in the field of mobile robotics incorporating intelligent sensors. The authors in [1] described a strategy and control architecture that allow a mobile robot to navigate an indoor environment on a planned path. The navigation system of the mobile robot integrates the position estimate obtained by a vision system with the position estimated through an odometric technique using a Kalman filter. Obstacle detection is performed by several ultrasonic sensors. Their system is suitable for a structured or quasi-structured environment and needs a priori knowledge of the world model. This a priori knowledge is designed by means of a CAD system; the CAD description is provided to define an initial setup of the world in order to plan the path in advance. Indoor autonomous goal-based mobile robot navigation using a cooperative strategy between odometry and a novel visual self-localization method is described in [2]. The odometer used in their research is a dead-reckoning sensor, which estimates only relative motion, from which the absolute position of the vehicle is obtained. The wheels of the vehicle are equipped with encoders to measure, as well as to calibrate, its relative position and orientation. The visual self-localization method capitalizes on the excellent angular resolution of a CCD TV camera, which further calibrates the position and orientation of the vehicle. The experimental setup is composed of three artificial landmarks whose coordinates are known and which are imaged by the CCD camera; the orientation of the camera is estimated through the image coordinates of the landmarks. The authors in [3] sketched a dead-reckoning navigation algorithm for an autonomous mobile robot based on a differential encoder and a gyroscope. An indirect Kalman filter is used to feed back the error estimates to the main navigation algorithm and to estimate the systematic errors of the encoder and gyroscope. The outdoor navigation system suggested in [4] uses vision and DGPS information for the position and orientation estimation of a mobile robot. While moving, the robot uses conventional dead-reckoning and matches the landmarks with the environmental model for position estimation in order to reduce localization errors. The robot also uses vision to detect landmarks and DGPS information to determine the robot's initial position and orientation.
The problem of autonomous guided vehicle (AGV) navigation has been addressed in [5]. The approach proposed therein is to fuse odometry with the information provided by a vision system. A pointable camera constitutes the vision system; its function is to be moved in different directions so as to fixate a point in the environment while the AGV is moving. The coordinates of the landmark are not assumed to be known a priori; the camera is responsible for finding them. The main difference of the work in [5] over previous navigation methods is that any point in the scene observed by the camera can be selected as a landmark, not just pre-measured points. The technique in [6] focuses on integrating encoders and the Global Positioning System (GPS) for the precise navigation of mobile robots.
Computational intelligence tools have been adopted by several researchers for the mobile robot navigation problem in both indoor and outdoor environments. In [7], the authors designed a framework for a mobile robot that uses a genetic algorithm-based navigation system. Behavior-based robot navigation has been adopted by several researchers, such as the fuzzy logic approach in [8]. The authors in [9] describe a fuzzy logic controller for the navigation of multiple mobile robots with collision avoidance in a dynamic environment.
The work presented in this manuscript differs significantly from previous studies in the following: 1) it deals with the physical parameters of the sensors, such as their placement in space and time; 2) the proposed navigation approach considers two modes of operation of the mobile robot: a target-seeking mode and an obstacle-avoidance mode. In the target-seeking mode, the robot calculates the desired orientation between its own position and the target location, called the heading angle, and then moves towards the target using its current orientation. The obstacle-avoidance mode is controlled by a fuzzy logic controller to avoid obstacles that may be lying around in the environment (more details in Section III-A).
III. SYSTEM ARCHITECTURE
The high-level architecture of the proposed navigation system is shown in Figure 1. The environment considered in this paper is a floor of a building consisting of four rooms (R1, R2, R3, and R4) with hallways between the rooms. As a proof of concept, a scenario is defined where the mobile robot is used to carry files and/or smaller loads from one room to another. The robot starts from an initial location (room R1 in this case) and ends at the target location (room R3 in this case) specified by the user. During navigation, the robot may face obstacles on its left, front, and right sides.

Fig. 1. High level architecture.
Finding the target is the main objective of the robot. A target can be any 2-D coordinate in the environment, and its location is specified to the robot by the user. The robot keeps track of the target and proceeds towards it according to the decisions generated by the navigation controller. The mobile robot uses several sensors that are able to detect obstacles and to measure its position and orientation at any given instant of time. The next subsections describe the fuzzy logic controller used to avoid obstacles and the localization technique of the mobile robot.
A. Fuzzy Logic Controller
Fuzzy logic control is well suited for controlling a robot because it is capable of making inferences in uncertain environments [10]. This paper shows how fuzzy sets and fuzzy rules can be used to represent a real system or process. The fuzzy logic controller is responsible for changing the orientation of the robot when it faces an obstacle, and it is activated when the robot is in obstacle-avoidance mode. The controller is composed of several fuzzy rules, some of which are activated based on the information acquired by the robot through its sensors. The output of the fuzzy rules controls the orientation of the wheels. This change of orientation is denoted by Δθ.
Figure 2(a) shows the Fuzzy Inference System (FIS) with three inputs and one output. The three inputs are the obstacle distances provided by the left, right, and front infrared proximity sensors of the robot. The FIS makes the decision about Δθ. The membership variables Near, Med, and High are the linguistic terms used for the input variables LeftOD (Left Obstacle Distance), FrontOD (Front Obstacle Distance), and RightOD (Right Obstacle Distance). The terms NH (Negative High), NL (Negative Low), Zero, PL (Positive Low), and PH (Positive High) are the membership variables for the output variable DeltaTheta, which is the change of orientation of the robot. The membership functions are triangular or trapezoidal, and the values of their parameters were found by analyzing the experimental navigational results for different environmental scenarios. The fuzzy rules are defined in the following general form:
rules are dened as follows in a general form:
If (LeftOD is LeftOD
i
and FrontOD is FrontOD
j
and
RightOD is RightOD
k
) Then is
ijk
where i=1 to 3, j=1 to 3 and k=1 to 3 and the values of
the LeftOD
i
, FrontOD
j
, RightOD
k
can be Near, Med or
High respectively. According to the Mamdani Fuzzy Inference
System [11], a factor ring strength for the rules is dened as
equation 1.
w_{ijk} = \mu_{LeftOD_i}(dis_i) \cdot \mu_{FrontOD_j}(dis_j) \cdot \mu_{RightOD_k}(dis_k)    (1)

where dis_i, dis_j, and dis_k are the distance values provided by the left, front, and right infrared proximity sensors, respectively.

Fig. 2. Fuzzy model used by the mobile robot.

By applying the compositional rule of inference [11], the membership value of the change in orientation can be computed as in Equation (2):

\mu'_{ijk}(\Delta\theta) = w_{ijk} \cdot \mu_{\Delta\theta_{ijk}}(\Delta\theta)    (2)

The overall conclusion is obtained by combining the outputs of all the fuzzy rules as in Equation (3):

\mu(\Delta\theta) = \mu'_{111}(\Delta\theta) \vee \ldots \vee \mu'_{ijk}(\Delta\theta) \vee \ldots \vee \mu'_{333}(\Delta\theta)    (3)

Finally, the corresponding crisp value of the change in orientation is computed using the centroid-of-area defuzzification method [11], as shown in Equation (4):

\Delta\theta = \frac{\int x\,\mu(x)\,dx}{\int \mu(x)\,dx}    (4)

where x spans the universe of discourse of the change in orientation.
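As an illustration of Equations (1) to (4), the following Python sketch implements a Mamdani-style inference with triangular membership functions and centroid defuzzification. The membership-function parameters and the handful of rules shown are illustrative assumptions; the paper derives its actual parameters experimentally and does not list its 27 rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Input terms over obstacle distance in cm (assumed ranges, not from the paper).
IN_TERMS = {
    "Near": (-5.0, 5.0, 15.0),
    "Med":  (10.0, 20.0, 30.0),
    "High": (25.0, 35.0, 45.0),
}

# Output terms over the change of orientation DeltaTheta in degrees (assumed ranges).
OUT_TERMS = {
    "NH": (-90.0, -60.0, -30.0),
    "NL": (-45.0, -22.5, 0.0),
    "Zero": (-10.0, 0.0, 10.0),
    "PL": (0.0, 22.5, 45.0),
    "PH": (30.0, 60.0, 90.0),
}

# A few hypothetical rules (LeftOD, FrontOD, RightOD) -> DeltaTheta term.
# Positive output means a clockwise turn, following the paper's sign convention.
RULES = {
    ("Near", "Near", "High"): "PH",   # blocked left/front: turn right
    ("High", "Near", "Near"): "NH",   # blocked front/right: turn left
    ("Near", "High", "High"): "PL",
    ("High", "High", "Near"): "NL",
    ("High", "High", "High"): "Zero",
}

def delta_theta(left_d, front_d, right_d):
    """Crisp change of orientation (degrees) for three obstacle distances (cm)."""
    x = np.linspace(-90.0, 90.0, 721)          # output universe of discourse
    aggregated = np.zeros_like(x)
    for (lt, ft, rt), out in RULES.items():
        # Firing strength w_ijk as the product of the input memberships (Eq. 1).
        w = (tri(left_d, *IN_TERMS[lt]) * tri(front_d, *IN_TERMS[ft])
             * tri(right_d, *IN_TERMS[rt]))
        # Scaled rule output (Eq. 2), combined by max over all rules (Eq. 3).
        aggregated = np.maximum(aggregated, w * tri(x, *OUT_TERMS[out]))
    if aggregated.sum() == 0.0:
        return 0.0
    # Discrete centroid-of-area defuzzification (Eq. 4).
    return float(np.sum(x * aggregated) / np.sum(aggregated))

print(delta_theta(left_d=8.0, front_d=12.0, right_d=35.0))  # expect a positive (rightward) turn
```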
For navigation in target-seeking mode, the robot simply calculates the heading angle between its own position and the target location using the coordinate system defined in it, then re-orients itself and starts moving towards the target.
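The heading-angle computation amounts to a standard bearing calculation. The sketch below is one plausible formulation, not taken from the paper, and it uses the counter-clockwise-positive convention of atan2, which may differ from the paper's clockwise-positive sign for Δθ.

```python
import math

def target_seeking_turn(x, y, theta, x_t, y_t):
    """Heading angle toward the target (x_t, y_t) and the turn needed from the
    current pose (x, y, theta), all angles in radians."""
    heading = math.atan2(y_t - y, x_t - x)
    turn = heading - theta
    # Wrap to (-pi, pi] so the robot takes the shorter rotation.
    return math.atan2(math.sin(turn), math.cos(turn))
```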
B. Robot Localization Mechanism
Localization is an important concern during the navigation of a mobile robot. After a pre-defined time the robot localizes itself with respect to the environment. This section describes the odometry-based localization mechanism of the mobile robot during navigation in the environment. The accuracy of odometry measurements for dead-reckoning is, to a great extent, a direct function of the kinematic design of a vehicle. Because of this close relationship between kinematic design and positioning accuracy, one must consider the kinematic design closely before attempting to improve the dead-reckoning accuracy. For this reason, a differential-drive mobile robot is considered for the positioning system [12]. The robot can perform dead-reckoning by using simple geometric equations to compute the momentary position of the vehicle relative to a known starting position.
The physical movement of the robot falls into two categories: one is when the robot moves without any hindrance, i.e., operating in target-seeking mode, and the other is when it faces an obstacle on its path. If the robot faces an obstacle, it avoids it without regard to the desired target and may therefore deviate from its path toward the target. In that case, the robot has to localize itself and orient towards the target again. Once the robot detects an obstacle, it updates its position (x, y) and orientation based on the previous position, the previous orientation, and the traveled distance U_i using Equation (5):
x_{i+1} = x_i + U_i \cos\theta_i
y_{i+1} = y_i + U_i \sin\theta_i
\theta_{i+1} = \theta_i + \Delta\theta    (5)

where θ_i is the previous orientation of the robot and x_i is its previous x coordinate.
Equation (5) is valid for both clockwise and anti-clockwise rotation of the robot. Note that if the robot turns clockwise then Δθ is positive, and for anti-clockwise rotation Δθ is negative. If the robot finds an obstacle around it after traveling a distance U_i, it will try to re-orient itself by Δθ, which is provided by the FIS [9] described in Section III-A.
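For concreteness, Equation (5) translates directly into the following update, written here in Python with angles in radians; the units and angle conventions are assumptions, since the paper does not state them.

```python
import math

def update_pose(x_i, y_i, theta_i, U_i, d_theta):
    """Dead-reckoning update of Eq. (5): advance by the traveled distance U_i
    along the previous orientation theta_i, then apply the re-orientation
    d_theta returned by the FIS."""
    x_next = x_i + U_i * math.cos(theta_i)
    y_next = y_i + U_i * math.sin(theta_i)
    theta_next = theta_i + d_theta
    return x_next, y_next, theta_next
```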
IV. PROPOSED APPROACH
A flowchart of the proposed navigation system is shown in Figure 3. The description of each step of the mobile robot is given below.
Fig. 3. Flowchart of the mobile robot navigation system.
Step 1: The robot is initialized with the starting position and orientation. The robot is also given the target position it should reach after navigating the environment. The starting and target positions are (x, y) coordinates in the given coordinate system of the environment, and the orientation is the angle between the x-axis and the robot's current direction.
Step 2: After the initialization step, the robot measures the distances to obstacles in the left, front, and right directions using its infrared proximity sensors. The distance measurements are used for further processing during navigation.
Step 3: This step checks whether the obstacles at the robot's left, front, and right sides are far enough away. If they are (Yes), the robot goes into target-seeking mode; otherwise (No), it switches to obstacle-avoidance mode.
Step 4: There are two states in this step. One is the calculation of the heading angle based on the robot's current position and orientation; the other is the extraction of the change of orientation Δθ provided by the fuzzy model. The first case is for target-seeking mode and the second is for obstacle-avoidance mode.
Step 5: The robot is rotated by the angle provided by Step 4, either counter-clockwise or clockwise depending on the target or the obstacles around it.
Steps 6 to 8: After rotation, the robot first updates its final orientation and then moves straight for a predefined amount of time. The robot then updates its position (x, y) based on the previous position and the new orientation after traveling for that amount of time.
Steps 9 to 10: The robot checks whether it has reached the specified target. If not, it jumps back to Step 2 and continues executing the above steps. If the robot finds the target, it stops navigating and exits.
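Tying the steps together, the following sketch shows one way the flowchart could be realized in software. It reuses the delta_theta() and target_seeking_turn() sketches above; the robot interface, thresholds, and units (cm, seconds, radians) are illustrative assumptions rather than the paper's implementation, which runs on the Nios II board.

```python
import math

def navigate(robot, target, far_threshold=30.0, step_time=1.0, tolerance=5.0):
    """Sketch of the loop in Figure 3 against a hypothetical robot interface
    exposing initial_pose(), read_proximity(), rotate(), move_forward(), stop()."""
    x, y, theta = robot.initial_pose()                          # Step 1: initialize pose
    while True:
        left_d, front_d, right_d = robot.read_proximity()       # Step 2: read IR distances
        if min(left_d, front_d, right_d) > far_threshold:       # Step 3: obstacles far enough?
            turn = target_seeking_turn(x, y, theta, *target)    # Step 4: target-seeking mode
        else:
            turn = math.radians(delta_theta(left_d, front_d, right_d))  # Step 4: obstacle avoidance
        robot.rotate(turn)                                       # Step 5: rotate in place
        theta += turn                                            # Step 6: update orientation
        traveled = robot.move_forward(step_time)                 # Step 7: move straight, return distance
        x += traveled * math.cos(theta)                          # Step 8: update position
        y += traveled * math.sin(theta)
        if math.hypot(target[0] - x, target[1] - y) < tolerance: # Steps 9-10: target reached?
            robot.stop()
            return x, y, theta
```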
V. IMPLEMENTATION AND RESULTS
The mobile robot used for the experiments consists of an Altera Stratix Nios II FPGA board, two Futaba FPS-148 servo motors for mobility using differential steering, both equipped with optical incremental encoders, and infrared proximity sensors for obstacle recognition. The two servo motors independently control two wheels on a common axis. The distance between the wheels is 20 cm, the wheel diameter is 5.5 cm, and the wheel thickness is 2.5 cm.
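With this geometry, the incremental encoders can be converted into the traveled distance U_i and the rotation of the base using standard differential-drive odometry. The sketch below uses the wheel dimensions reported above; the encoder resolution is an assumption, since the paper does not report it.

```python
import math

WHEELBASE_CM = 20.0        # distance between the wheels (from the paper)
WHEEL_DIAMETER_CM = 5.5    # wheel diameter (from the paper)
TICKS_PER_REV = 512        # encoder resolution: an assumption, not reported in the paper

def wheel_odometry(left_ticks, right_ticks):
    """Convert incremental encoder counts to traveled distance (cm) and rotation
    (radians, counter-clockwise positive) for the differential-drive base."""
    circumference = math.pi * WHEEL_DIAMETER_CM
    d_left = circumference * left_ticks / TICKS_PER_REV
    d_right = circumference * right_ticks / TICKS_PER_REV
    distance = (d_left + d_right) / 2.0            # U_i in Eq. (5)
    rotation = (d_right - d_left) / WHEELBASE_CM   # change of orientation of the base
    return distance, rotation
```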
It is essential that an autonomous mobile robot be able to recover its orientation if it becomes disoriented. In such a case, the strategy is to realign itself and proceed until it finds the specified target. A typical test run of the mobile robot based on the proposed navigation method is shown in the series of photographs in Figure 4. The sequence shows the robot starting from one corner of the MIRaM laboratory (Figure 4(a)), re-orienting itself when it faces an obstacle in front of it (Figure 4(b)), and then traveling until it finds the target (the door in this case) (Figure 4(c)). The robot's traveling velocity was 0.02 m/s. The total traveled distance was 15 m and the traveling time from the starting point to the target was 13 minutes. The equivalent path through our navigation environment is shown in Figure 5.
VI. CONCLUSION
In this paper, an efficient mobile robot navigation algorithm using a fuzzy logic controller is designed and implemented. First, the necessary hardware modules are designed and implemented. Then, a fuzzy logic inference engine is integrated into the system to calibrate and improve its positional precision. Finally, the algorithm is tested in a real-world environment. The algorithm's performance was demonstrated through a number of experiments conducted in our laboratories. A possible future research avenue to extend this work is to consider the collaboration of multiple robots navigating autonomously in a shared common workspace.
Fig. 4. Test run of the mobile robot.
Fig. 5. Simple trajectory of the traveled path.
REFERENCES
[1] T. D'Orazio, M. Ianigro, E. Stella, F. P. Lovergine, and A. Distante, "Mobile robot navigation by multi-sensory integration," in IEEE International Conference on Robotics and Automation, May 1993, vol. 2, pp. 373-379.
[2] E. Stella, F. P. Lovergine, L. Caponetti, and A. Distante, "Mobile robot navigation using vision and odometry," in Proceedings of the Intelligent Vehicles '94 Symposium, October 1994, pp. 417-422.
[3] KyuCheol Park, Hakyoung Chung, Jongbin Choi, and Jang Gyu Lee, "Dead reckoning navigation for an autonomous mobile robot using a differential encoder and a gyroscope," in 8th International Conference on Advanced Robotics, ICAR '97, July 1997, pp. 441-446.
[4] S. Kotani, K. Kaneko, T. Shinoda, and H. Mori, "Mobile robot navigation based on vision and DGPS information," in IEEE International Conference on Robotics and Automation, May 1998, vol. 3, pp. 2524-2529.
[5] A. Adam, E. Rivlin, and H. Rotstein, "Fusion of fixation and odometry for vehicle navigation," IEEE Transactions on Systems, Man and Cybernetics, Part A, vol. 29, no. 6, pp. 593-603, November 1999.
[6] He Bo, Wang Danwei, Pham Minhtuan, and Yu Tieniu, "GPS/encoder based precise navigation for a 4WS mobile robot," in Proceedings of the 7th International Conference on Control, Automation, Robotics and Vision, ICARCV 2002, 2002, pp. 1256-1261.
[7] Dong-Ying Ju and Satoshi Kushida, "Intelligent control of mobile robot during autonomous inspection of welding damage based on genetic algorithm," in Proceedings of the 14th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems: Engineering of Intelligent Systems, Budapest, Hungary, June 4-7, 2001, Springer-Verlag, London, UK.
[8] Petru Rusu, Emil M. Petriu, Thom E. Whalen, Aurel Cornel, and Hans J. W. Spoelder, "Behavior-based neuro-fuzzy controller for mobile robot navigation," IEEE Transactions on Instrumentation and Measurement, vol. 52, no. 4, pp. 1335-1340, August 2003.
[9] Dayal R. Parhi, "Navigation of mobile robot using a fuzzy logic controller," Journal of Intelligent and Robotic Systems, vol. 42, no. 3-5, pp. 253-273, March 2005.
[10] Alessandro Saffiotti, "The uses of fuzzy logic in autonomous robot navigation: a catalogue raisonné," Soft Computing, vol. 1, no. 4, pp. 180-197, 1997.
[11] F. Karray and C. de Silva, Soft Computing and Intelligent Systems Design: Theory, Tools and Applications, Pearson Education, UK, 2004.
[12] Johan Borenstein and Liqiang Feng, "Correction of systematic odometry errors in mobile robots," in International Conference on Intelligent Robots and Systems, Pittsburgh, Pennsylvania, August 1995, pp. 569-574.