Infrared Range Sensor Array For 3D Sensing in Robotic Applications
1. Introduction
When the environment of a robot is not strictly
controlled, sensing the external world is an important
part of the work of the robot. Using external sensors, a
robot can interact with its surroundings in a flexible
manner. For example, a robot can manipulate a target
object flexibly based on sensor data without intervention
by a human operator. It is a common task in
manufacturing industry for a robot manipulator to pick
up objects on a moving conveyor belt by sensing the
location and orientation of the objects [1]. For a mobile
robot, sensing obstacles and any hazards that lie ahead is
of great importance in autonomous navigation [2].
Figure 1. Nine IR range detectors in a matrix array (all units are mm).
Figure 2. GP2Y0A02YK infrared distance measurement sensor [16]: (a) Outline dimensions (all units are in mm), (b) Operational function diagram.
The sensor element was calibrated by modelling the relationship between distance and sensor output voltage using a fractional function as follows:

L = k / (V + m) + n,  (1)

where L (20 ≤ L ≤ 150) is the distance in cm, V is the output voltage and k, m and n are the coefficients to be determined. This equation can be rewritten as:

LV = nV - mL + h,  (2)

where h = nm + k. The three unknown coefficients are determined using the least squares method with N (N ≥ 3) known L and V pairs from:

\[
\begin{bmatrix} L_1 V_1 \\ L_2 V_2 \\ \vdots \\ L_N V_N \end{bmatrix}
=
\begin{bmatrix} V_1 & -L_1 & 1 \\ V_2 & -L_2 & 1 \\ \vdots & \vdots & \vdots \\ V_N & -L_N & 1 \end{bmatrix}
\begin{bmatrix} n \\ m \\ h \end{bmatrix}.  (3)
\]
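The fit defined by Eqs. (1)-(3) can be carried out with an ordinary least-squares solve. Below is a minimal pure-Python sketch; the (L, V) pairs are synthetic, generated from assumed coefficients, not the paper's measured calibration data:

```python
# Least-squares calibration of L = k/(V + m) + n via the linearized
# system of Eq. (3).  Data below are synthetic, not measurements.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    A = [row[:] for row in A]
    b = b[:]
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, 3))) / A[r][r]
    return x

def fit_ir_model(pairs):
    """Fit k, m, n from (L, V) pairs using the normal equations of
    rows [V_i, -L_i, 1] against L_i*V_i (Eq. 3), then k = h - n*m."""
    ATA = [[0.0] * 3 for _ in range(3)]
    ATb = [0.0] * 3
    for L, V in pairs:
        row = (V, -L, 1.0)
        y = L * V
        for i in range(3):
            ATb[i] += row[i] * y
            for j in range(3):
                ATA[i][j] += row[i] * row[j]
    n, m, h = solve3(ATA, ATb)
    return h - n * m, m, n   # k, m, n

# Synthetic data from assumed coefficients k=140, m=0.1, n=2.
pairs = [(140.0 / (V + 0.1) + 2.0, V) for V in (0.5, 1.0, 1.5, 2.0, 2.5)]
k, m, n = fit_ir_model(pairs)
```

With exact synthetic data the solver recovers the assumed coefficients; with real noisy samples the same code returns the least-squares estimate.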
Figure 3. Distance and output voltage relationship of GP2Y0A02YK [16].
Another method of calibrating the distance-voltage characteristics of the sensor is to use a lookup table. The idea is to construct a table of samples at regular intervals and to look for the two closest values to the input value in order to make an estimate based on their table values [17]. A gradient-based interpolation method is simple to use. With N samples (i.e., sensor output voltage values measured for some known distances) s_k, k = 1, 2, …, N, the distance L is calculated from a sensor reading V in two steps:

- Step 1: Find the index k of the sample interval containing V:

k = ⌊(V - s_1) / Δs⌋,  Δs = (s_N - s_1) / N,  for s_1 ≤ V < s_N.  (4)

Here, s_1 and s_N are the sample voltages for the largest and smallest distances, respectively. Note that table[s_1] > table[s_N] for s_1 < s_N, referring to the sensor characteristics shown in Figure 3.

- Step 2: Make an estimate by linear interpolation between adjacent samples as follows:
3 Yongtae Do and Jongman Kim: Infrared Range Sensor Array for 3D Sensing in Robotic Applications
www.intechopen.com
f(V) = table[k] + (V - s_k) (table[k+1] - table[k]) / (s_{k+1} - s_k).  (5)
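The two-step lookup above can be sketched in a few lines. The voltage samples and distance table below are illustrative values only, not the sensor's calibration data:

```python
# Lookup-table range estimation following Eqs. (4)-(5).

def lookup_distance(V, s, table):
    """s: regularly spaced sample voltages, ascending (s[0] = s_1, the
    voltage of the largest distance).  table: corresponding distances,
    so table[0] > table[-1].  Step 1 picks the interval index; Step 2
    interpolates linearly between the adjacent samples."""
    N = len(s)
    step = (s[-1] - s[0]) / (N - 1)   # regular sample spacing
    k = int((V - s[0]) / step)        # Step 1: interval index
    k = max(0, min(k, N - 2))         # clamp to a valid interval
    # Step 2: linear interpolation between s[k] and s[k+1]
    return table[k] + (V - s[k]) * (table[k + 1] - table[k]) / (s[k + 1] - s[k])

s = [0.4, 0.8, 1.2, 1.6, 2.0]              # sample voltages [V]
table = [150.0, 110.0, 70.0, 45.0, 25.0]   # distances [cm], decreasing
# lookup_distance(1.0, s, table) → 90.0 (halfway between 110 and 70)
```

Note that with N samples there are N−1 intervals, so the sketch divides by N−1 when computing the spacing.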
The pros and cons of these two approaches are evident.
The method using Eq. (1) requires the determination of
coefficients, but it is in a closed form. This approach is
also less sensitive to inaccuracies in the calibration data
because the function globally fits the data. The table
lookup method, on the other hand, is simple and model-free. However, it needs many samples to achieve high accuracy. In addition, any error in a sample value cannot be smoothed out.
2.2 System structure and interfacing
The sensor elements were interfaced to microcontrollers (ATmega128 from Atmel [18]), which were connected to a PC via a serial port, as shown in Figure 4(a). Since the ATmega128 has an eight-channel analogue-to-digital converter (ADC) but our system design employs nine sensors, two microcontrollers were used. Although each sensor requires 33 to 50 mA to operate, the microcontroller provides only 20 mA. Therefore, an on/off control circuit was built using power MOSFETs, as shown in Figure 4(b). Figure 5 shows the sensor control pulse and the voltage output from the sensor.
Figure 4. System structure: (a) Overall structure, (b) Sensor control.
Figure 5. Control signal and sensor output.
3. Applications and results
The detection of obstacles and hazards is one of the most important issues for the autonomous navigation of a mobile robot. Often, banks of ultrasonic range finders are attached around the body of a robot for detection, but the gathered data are only in 2D and the resolution is usually quite low. In this section, we show that the developed IR array can be effectively used to gather 3D information about obstacles and hazards. In addition, the sensor array is arranged to find the shape and position of an object moving on a conveyor belt.
3.1 Sensing obstacles and hazards for mobile robots
The developed IR sensor array can be effectively used to
detect obstacles and hazards around a mobile robot. The
prototype built as described in Section 2 is shown in
Figure 6. The system was calibrated based on Eq. (1) with data collected using a linear sliding base from Intelitek [19], as shown in Figure 7. After calibration, we made ten measurements at 20 cm intervals within the range of sensing so as to test the system accuracy. Table 1 shows the average measurement error at each distance. Note that the error is approximately 4 cm or less, which could be good enough for many robotic applications, including those to be described in this section.
Figure 6. Prototype sensor system.
4 Int J Adv Robotic Sy, 2013, Vol. 10, 193:2013 www.intechopen.com
Figure 7. Setup for calibrating the developed system.
Distance [cm]        40   60   80   100  120  140
Absolute error [cm]  0.8  2.4  4.2  2.8  3.0  3.5
Relative error [%]   2.0  4.0  5.3  2.8  2.5  2.5
Table 1. Range measurement error.
Figure 8. Obstacle detection: (a) Horizontal bar at 30 cm distance and IR sensor measurements, (b) Case of a rectangular object, (c) Case of a vertical object.
The calibrated system was tested to determine if it could effectively perceive 3D information about obstacles for application to mobile robots. We used obstacles in various shapes for the test. For example, Figure 8(a) shows a test involving a horizontal bar at a distance of 30 cm from the sensor panel we built. The right part of the figure shows measurement results displayed on a graphic user interface (GUI). The simple GUI was developed to display sensor measurements and communicate with microcontrollers from the control PC. On the display of the GUI, the left window displays the grey-level image of sensor values presented in grid cells. Note that the grey level of the image corresponds to the range measured by each sensor (i.e., a lower grey level indicates a closer distance). The display size of the image is enlarged for easier recognition, although its actual resolution is only 3×3 pixels. The figure shows that the object's shape was well detected. Experimental results for other objects are also presented in Figure 8(b) and (c).
θ_n = tan^{-1}( w / (L_{n-1} - L_n) ),  n = 3 or 2,  (6)

where w is the width between the sensor rows (4 cm as shown in Figure 1). If there is a down-step object ahead, L will be much larger than the normal value, as shown in Figure 10. Note that the normal value from the sensors in each row can be easily computed by:

L_n = h_n / sin θ_n,  n = 1, 2, 3,  (7)

where h_n is the known height of the sensor row. It is also possible that the sensors in the top row fail to make measurements if the step height is large. Sensors in the middle row and then those in the bottom row will fail in sequence as the robot proceeds.
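The flat-ground check described above can be sketched as follows. The row height, tilt angle and the 20% detection margin are assumed illustrative values, not the prototype's actual geometry:

```python
import math

# Flat-ground "normal" reading per Eq. (7) and a simple down-step flag.

def normal_range(h, theta_deg):
    """Expected reading over flat ground for a row at height h [cm]
    whose beam points theta degrees below the horizontal: L = h/sin."""
    return h / math.sin(math.radians(theta_deg))

def is_down_step(L_measured, h, theta_deg, margin=1.2):
    """Flag a down-step when the reading exceeds the flat-ground value
    by the given margin, as in Figure 10.  The 20% margin is assumed."""
    return L_measured > margin * normal_range(h, theta_deg)

# A row 30 cm high looking 30 degrees down expects L = 30/sin(30°) = 60 cm.
```

A reading of, say, 90 cm from that row would then be flagged as a down-step, while the expected 60 cm reading would not.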
Figure 9. Diagram for ground surface sensing using multiple rows of IR range detectors.
Figure 10. Case of a down-step hazard.
3.2 Sensing a moving object
In many factory automation systems, parts are
transferred by a conveyor belt. An automatic mechanism
such as a robot manipulator then identifies the parts and picks them up while they are moving. This task is usually performed using vision sensing [1, 20, 21]. However, visual sensing requires a well-controlled system arrangement, including structured illumination and high-performance computers, to process image data reliably in real time. The cost required to set up a vision system is usually quite high. Instead of using a machine vision system, we considered an arrangement of IR sensors for reconstructing the 3D shape of a moving object and finding its position. Our proposed technique for sensing moving objects has three stages: (i) the arrangement of the IR sensors, (ii) sensor control for data acquisition and (iii) data interpretation for reconstructing the 3D shape and pose of the target object.
Figure 11. System structure for sensing objects moving on a conveyor belt.
Figure 12. Arrangement of IR sensors above the conveyor belt.
Stage 2 (sensor control): By adjusting the relative speed between the belt and the sensor operation, it is possible to obtain nine sensor readings along a virtual scan line across the moving object so that a 3D profile along the line can be obtained. In our experimental system, sensors were separated into three groups according to their positions. The first group of sensors in the front line (S1, S4 and S7) was first triggered to take measurements. Then, after a 300 ms delay, the second group of sensors in the middle line (S2, S5 and S8) was triggered. The third group of sensors in the rear line was triggered after another 300 ms interval. Note that the GP2Y0A02YK IR sensor needs 38.3 ± 9.6 ms to make a single measurement and there is a maximum of a 5 ms delay for producing the output voltage [16]. We thus allowed 50 ms for the operation of a sensor.
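The inter-group delay follows directly from the spacing between the sensor lines and the belt speed. A minimal sketch, where the belt speed is an assumption chosen to reproduce the 300 ms delay (the 4 cm spacing is from Figure 1):

```python
# Trigger delay between successive sensor lines so that all nine
# readings fall on one virtual scan line across the moving object.

def group_trigger_delay(line_spacing_cm, belt_speed_cm_s):
    """Seconds to wait between triggering successive sensor lines."""
    return line_spacing_cm / belt_speed_cm_s

# With 4 cm line spacing, a 300 ms delay corresponds to a belt moving
# at 4/0.3 ≈ 13.3 cm/s.  The delay must also exceed the ~50 ms a single
# sensor needs to produce a stable reading.
delay = group_trigger_delay(4.0, 4.0 / 0.3)
```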
Figure 14. Filling the gap between sparse sensor readings: (a) Sensor measurement example, (b) Filling the gap by averaging, (c) Filling the gap by the next value assuming sub-pixel accuracy.
Stage 3 (sensor data processing): Due to the emitter-detector sensor configuration (see Figure 2(b)) and the distance between sensor locations, the sensor measurements are not dense. For instance, the obtained range map of the object shown in Figure 13(a) is that shown in Figure 13(b). In this figure, the rectangular object moved under only three sensors and there are three sparse lines of measurements. Thus, it is required to fill the gap between the sensor readings. We tested two simple methods of gap filling. One method uses the mean value of the left and right sensor readings and the other method uses the adjacent sensor reading assuming sub-pixel resolution within the gap. Figure 14 illustrates both methods and Figure 13(c) and (d) show the results, respectively. When tested using various objects, the second method showed better results and we therefore used it in our experiment. The pose of the object was then estimated simply by computing the moments [22] of the binary image obtained by thresholding the range map. Table 2 gives the errors of position and orientation measurements for the test objects. In the table, the real value data were estimated using a calibrated vision system.
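The two gap-filling methods of Figure 14 can be sketched on a single row of a sparse range map. The measured columns and values below are illustrative:

```python
# Gap filling between sparse sensor readings (Figure 14).

def fill_gaps_average(row, measured):
    """Fill each unmeasured cell with the mean of the nearest measured
    readings to its left and right (method of Figure 14(b))."""
    out = list(row)
    for i in range(len(out)):
        if i in measured:
            continue
        left = max((c for c in measured if c < i), default=None)
        right = min((c for c in measured if c > i), default=None)
        if left is not None and right is not None:
            out[i] = (row[left] + row[right]) / 2.0
        else:
            out[i] = row[left if right is None else right]
    return out

def fill_gaps_next(row, measured):
    """Copy the next measured reading into the gap, assuming sub-pixel
    resolution within it (method of Figure 14(c))."""
    out = list(row)
    for i in range(len(out)):
        if i not in measured:
            right = min((c for c in measured if c > i), default=None)
            left = max((c for c in measured if c < i), default=None)
            out[i] = row[left] if right is None else row[right]
    return out

row = [10.0, 0.0, 0.0, 30.0, 0.0, 0.0, 20.0]  # readings at columns 0, 3, 6
measured = {0, 3, 6}
# fill_gaps_average → [10, 20, 20, 30, 25, 25, 20]
# fill_gaps_next    → [10, 30, 30, 30, 20, 20, 20]
```

The second function reproduces the behaviour that the paper found to give better results on its test objects.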
4. Conclusions
We described techniques for using IR sensors in robotic applications. When we calibrated a GP2Y0A02YK range measurement sensor using a simple distance-voltage relation, we obtained a measurement error of about 4 cm or less in the range between 20 and 150 cm. Nine calibrated IR sensors were arranged in a matrix array and tested in their ability to detect obstacles. The test results were very promising because we could obtain the 3D shape information of obstacles using the simple sensor array. Although the obtained 3D data are sparse, the perceived information is sufficient for a mobile robot to avoid the obstacle. If the sensor panel is tilted toward the ground, it can also be employed to detect hazardous negative obstructions on the ground, such as a down-step. In addition, an arrangement of IR range finders for sensing objects moving on a conveyor belt was proposed. The results of experiments with various objects proved the validity of the proposed system. We believe that the proposed methods of using IR range finders could be quite useful for the applications described herein and for other robotic tasks that require the simple and low-cost acquisition of approximate 3D information of a robot's surroundings.
Object    Method   x [mm]  y [mm]  ex [mm]  ey [mm]  |e| [mm]  Orient. [deg]  Orient. err [deg]  Height [mm]  Height err [mm]
Object 1  Sensing   50.0    55.0     5.0      5.3      7.29        0.00             0.04              97.5          2.5
          Real      45.0    49.7      -        -        -          0.04              -               100            -
Object 2  Sensing   64.3    49.3     3.8      6.8      7.79       48.02             5.20             100.3          0.3
          Real      68.1    56.1      -        -        -         42.82              -               100            -
Object 3  Sensing   60.0    65.0     3.1      6.2      6.93        0.00             0.09              98.4          1.6
          Real      56.9    58.8      -        -        -          0.09              -               100            -
Object 4  Sensing   80.0    45.0     0.1      5.6      5.60       90.00             0.03             103.7          3.7
          Real      79.9    39.4      -        -        -         89.97              -               100            -
Object 5  Sensing   50.0    20.0     3.6      2.4      4.33       90.00             0.30              48.2          1.8
          Real      53.6    22.4      -        -        -         89.70              -                50            -
Object 6  Sensing   53.7    38.9     4.9      3.4      5.96       33.12             2.53              54.1          4.1
          Real      48.8    42.3      -        -        -         30.59              -                50            -
Table 2. Measurement results for various objects. Centroid (x, y) and its errors (ex, ey) are in mm; the absolute error is |e| = √(ex² + ey²); real values were estimated using a calibrated vision system.
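The pose figures in Table 2 come from computing image moments of the thresholded range map. A minimal sketch of that computation; the binary mask below is an illustrative example, not a recorded range map:

```python
import math

# Centroid from first-order moments and orientation of the principal
# axis from central second-order moments [22].

def pose_from_moments(mask):
    """Return (cx, cy, theta_deg) of a binary mask (list of 0/1 rows)."""
    m00 = m10 = m01 = 0.0
    for y, r in enumerate(mask):
        for x, v in enumerate(r):
            if v:
                m00 += 1.0
                m10 += x
                m01 += y
    cx, cy = m10 / m00, m01 / m00
    mu20 = mu02 = mu11 = 0.0
    for y, r in enumerate(mask):
        for x, v in enumerate(r):
            if v:
                mu20 += (x - cx) ** 2
                mu02 += (y - cy) ** 2
                mu11 += (x - cx) * (y - cy)
    theta = 0.5 * math.atan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, math.degrees(theta)

# A 2x4 horizontal rectangle: centroid (1.5, 0.5), orientation 0 deg.
mask = [[1, 1, 1, 1],
        [1, 1, 1, 1]]
```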
5. References
[1] Baumann R, Wilmhurst DA (1983) Vision System Sorts Castings at General Motors Canada. In: Pugh A, ed. Robot Vision. Bedford: IFS Ltd. pp. 255-266.
[2] Everett HR (1995) Sensors for Mobile Robots: Theory and Application. Wellesley, Mass: A K Peters.
[3] Lai X, Wang H, Xu Y (2012) A Real-time Range Finding System with Binocular Stereo Vision. Int. J. Adv. Rob. Syst. 9. pp. 1-9. Available: http://www.intechopen.com/journals/international_journal_of_advanced_robotic_systems/arealtimerangefindingsystemwithbinocularstereovision.
[4] Ye C, Borenstein J (2002) Characterization of a 2D Laser Scanner for Mobile Robot Obstacle Negotiation. Proc. IEEE Int. Conf. Robotics and Automation. Washington DC, USA. pp. 2512-2518.
[5] Hussmann S, Liepert T (2007) Robot Vision System based on a 3D-TOF Camera. Proc. IEEE Instrumentation and Measurement Technology Conference (IMTC 2007). Warsaw, Poland. pp. 1-5.
[6] Fuchs S, May S (2008) Calibration and Registration for Precise Surface Reconstruction with Time-Of-Flight Cameras. Int. J. Intell. Sys. Tech. Appl. 5(3). pp. 274-284 (DOI: 10.1504/IJISTA.2008.02129).
[7] Kim S, Park C, Lee H, Lee J (2010) Trajectory Planning of Autonomous Robot Using Advanced Fuzzy Controller. Proc. IEEE Int. Conf. Information and Automation. Harbin, China. pp. 482-485.
[8] Wen YJ, Tsai CH, Yu WS, Lin PC (2011) Infrared Sensor Based Target Following Device for a Mobile Robot. Proc. IEEE/ASME Int. Conf. Advanced Intelligent Mechatronics (AIM 2011). Budapest, Hungary. pp. 49-54.
[9] Kim KH, et al. (2005) Development of Docking System for Mobile Robots Using Cheap Infrared Sensors. Proc. 1st Int. Conf. Sensing Technology. Palmerston North, New Zealand. pp. 287-291.
[10] Chen CH, Song KT (2005) Complete Coverage Motion Control of a Cleaning Robot Using Infrared Sensors. Proc. IEEE Int. Conf. Mechatronics. Taipei, Taiwan. pp. 543-548.
[11] Malik R, Yu H (1992) The Infrared Detector Ring: Obstacle Detection for an Autonomous Mobile Robot. Proc. 35th Midwest Symp. Circuits and Systems. Washington, DC, USA. 1. pp. 76-79.
[12] Gandhi D, Cervera E (2003) Sensor Covering of a Robot Arm for Collision Avoidance. Proc. IEEE Int. Conf. Systems, Man and Cybernetics. 5. pp. 4951-4955.
[13] Tar A, Koller M, Cserey G (2009) 3D Geometry Reconstruction using Large Infrared Proximity Array for Robotic Applications. Proc. IEEE Int. Conf. Mechatronics. Malaga, Spain. pp. 1-6.
[14] Nakashima S, et al. (2010) A Proposal of High Performance Method for Distance Measuring Sensor United with PSD. Appl. Mech. Mater. 36. pp. 365-369 (DOI: 10.4028/www.scientific.net/AMM.36.365).
[15] Ryu D, et al. (2010) T-less: a Novel Touchless Human-Machine Interface based on Infrared Proximity Sensing. Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems. Taipei. pp. 5220-5225.
[16] http://www.junun.org/MarkIII/datasheets/GP2Y0A02YK.pdf. Accessed 2012 Mar 14.
[17] Tang PTP (1991) Table-Lookup Algorithms for Elementary Functions and Their Error Analysis. Proc. IEEE Symp. Computer Arithmetic. Grenoble. pp. 232-236.
[18] http://www.atmel.com/devices/atmega128.aspx. Accessed 2012 May 30.
[19] http://www.intelitek.com/ProductDetails.asp?Product_ID=110&CategoryID=41&Industrial=&Education=yes&Specification=yes&category_str_id=3;147;41. Accessed 2012 May 30.
[20] Houshangi N (1990) Control of a Robotic Manipulator to Grasp a Moving Target Using Vision. Proc. IEEE Int. Conf. Robotics and Automation. 1. pp. 604-609.
[21] Borangiu T, Anton FD, Dogar A (2010) Visual Robot Guidance in Conveyor Tracking with Belt Variables. Proc. IEEE Conf. Automation Quality and Testing Robotics (AQTR). Cluj-Napoca, Romania. 1. pp. 1-6.
[22] Jain R, Kasturi R, Schunck BG (1995) Machine Vision. McGraw-Hill.