
A Comparative Study On LIDAR and Ultrasonic Sensor For Obstacle Avoidance Robot Car



Achinta Brata Roy Tonmoy
Department of Electrical and Electronics Engineering
Jain (Deemed to-be University)
achintotonmoy@gmail.com

MD Sarwar Zinan
Department of Electrical and Electronics Engineering
Jain (Deemed to-be University)
mdsarwarzinan@gmail.com

Selim Sultan
Department of Electrical and Electronics Engineering
Jain (Deemed to-be University)
selimsultan5038@gmail.com

Abir Sarker
Department of Mechanical Engineering
Jain (Deemed to-be University)
abirsarker286@gmail.com

Abstract - In this comparative study, the effectiveness of LIDAR and ultrasonic sensors, two widely used sensors for obstacle avoidance in robot cars, was assessed. To compare the accuracy, range, and reaction times of the two sensor types in detecting obstacles, two obstacle avoidance systems were designed and put into use. According to the findings, ultrasonic sensors are less costly and have a faster reaction time than LIDAR, while LIDAR has higher accuracy and greater range than the ultrasonic sensor. However, environmental conditions such as sunshine and dust can affect the functioning of LIDAR sensors, since they are more sensitive to them. The study concludes that the needs and limitations of the application determine the best sensor for an obstacle avoidance system in a robot car: LIDAR is often a good choice for applications that need high precision and greater range, whereas ultrasonic sensors are better suited for applications that need quick reaction times and are price sensitive.

Keywords: Accuracy, Obstacle Avoidance Robot, Range, LIDAR, Ultrasonic Sensor

1. INTRODUCTION

The development of autonomous vehicles has generated considerable interest. Obstacle avoidance is one of the crucial components of autonomous vehicles to ensure their safety. Autonomous robots implemented with various sensors can be analysed in different environments [6], including industrial fields [7, 8]. Various sensor technologies are available for obstacle avoidance, among which LIDAR (Light Detection and Ranging) and ultrasonic sensors are the most widely used. In this comparative study, we aim to analyse the performance of LIDAR and ultrasonic sensors in obstacle avoidance for robot cars. Autonomous robots have the capacity to identify walls and barriers around them and to forecast collision-free pathways autonomously [9]. However, the detection range, spatial resolution, and processing complexity of these traditional sensors are constrained; for instance, ultrasonic distance sensors are limited by blanking intervals and angular ambiguity [1]. We evaluate the advantages and disadvantages of both technologies and present the results of experiments conducted to compare their accuracy and efficiency in detecting obstacles. The findings of this study can provide researchers and engineers in the realm of autonomous vehicles with valuable information to assist them in selecting sensors for obstacle avoidance applications.

2. LITERATURE REVIEW

A. Detection of Light and Ranging

LiDAR is an optical scanning technology that examines the characteristics of radiated light in order to determine distance and other details about a target. As seen in Figure 1, one technique for measuring an object's distance involves projecting laser light pulses onto the surface of the object. In a vacuum, light moves at a speed of 0.3 metres per nanosecond, or around 300,000 kilometres per second, so an object's distance can be calculated by measuring the interval between the light emitted and the light returned [2-4]. Typical LiDAR data is a set of points arranged in polar coordinates [1].
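The time-of-flight relation above can be sketched numerically; this is an illustrative calculation, not code from the paper:

```python
# Illustrative time-of-flight (ToF) distance calculation for LiDAR.
# Light travels roughly 0.3 m per nanosecond, and the pulse covers the
# sensor-to-object distance twice (out and back), hence the division by 2.

SPEED_OF_LIGHT_M_PER_NS = 0.3  # approximate speed of light in vacuum

def lidar_distance_m(round_trip_time_ns: float) -> float:
    """Distance to the object from the emit-to-return interval."""
    return SPEED_OF_LIGHT_M_PER_NS * round_trip_time_ns / 2.0

# A pulse returning after 70 ns corresponds to an object about 10.5 m
# away, the maximum range reported for the LiDAR in this study.
print(round(lidar_distance_m(70.0), 3))  # 10.5
```
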
Principle of LiDAR

Fig. 1. The principles of LiDAR.

B. Sensing And Measuring Distance

The ultrasonic sensor consists of a transducer that generates the ultrasonic sound waves and a receiver that listens for the reflected sound waves. The transducer emits a high-frequency sound pulse, which travels through the air and bounces back when it encounters an obstacle. The receiver then measures the time it takes for the reflected sound wave to return to the sensor and converts this time into a distance measurement.

Principle of Ultrasonic Sensor

Fig. 2. The principles of the Ultrasonic Sensor.

3. RESEARCH METHOD

1. Choose a robot car platform: Select a robot car platform that can accommodate both LIDAR and ultrasonic sensors and that is suitable for obstacle avoidance applications.
2. Integrate LIDAR and ultrasonic sensors: Mount the LIDAR and ultrasonic sensors on the robot car in a way that provides a good field of view. Ensure that both sensors are properly connected to the robot car's microcontroller or single-board computer.
3. Write control software: Write code that integrates the LIDAR and ultrasonic sensors and implements obstacle avoidance algorithms. The algorithms should be able to detect obstacles and adjust the robot car's path to avoid them.
4. Test the system: Test the system in various scenarios and environments to collect data on the performance of the LIDAR and ultrasonic sensors. The data should include obstacle detection range, accuracy, and response time for both sensors.

Fig. 3. Schematic of the Autonomous Robot System.

Figure 3 represents the main block diagram of the robot system for obstacle avoidance. The coordinates of the object's angle and distance received by the LiDAR are used as references in this system. The YDLIDAR drivers and the Robot Operating System are installed on a Raspberry Pi 3 single-board computer. A Braitenberg vehicle has two sensors and two motors, with various connections used to link them [5]. The Braitenberg vehicle 2b approach is employed in this work, in which the right motor is controlled by the sensor on the mobile robot's left side and vice versa. The motor speed is set using the Pulse Width Modulation (PWM) approach.

Fig. 04. The YDLIDAR X4 with 360° scanning.

Figure 4 shows the LiDAR, a sensor that measures object distance while rotating clockwise through 360°; in this test, only readings between -90° and 90°, taken at half-degree steps, are used.

Fig. 05. Block Diagram.

The microcontroller is the central processing unit of the system. It receives input from the sensors, processes the data, and outputs control signals to the actuators. It may be an Arduino Uno or a similar microcontroller. The sensors provide information about the environment to the microcontroller; in the case of an obstacle avoidance robot car, the sensors typically include an ultrasonic sensor. The actuators receive control signals from the microcontroller and perform actions in the environment; for an obstacle avoidance robot car, the actuators typically include the motors that control the movement of the robot. The motor driver receives control signals from the microcontroller and amplifies the current to drive the motors. It may be a separate motor driver IC or may be integrated with the microcontroller. Communication interfaces, such as USB or serial, may be used to receive commands from an external computer or to send data to an external display. The power supply provides the necessary voltage and current to all the components in the system; it may consist of a battery or a DC-DC converter connected to an external power source. Typically, the operator controls the robot to achieve the intended results, such as travelling to a certain position, measuring temperature and humidity, or completing similar duties [10].
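The Braitenberg vehicle 2b cross-coupling described above can be sketched as follows. This is a minimal illustration of the idea, not the paper's implementation; the duty-cycle scale and the use of free distance as the sensor signal are assumptions:

```python
# Illustrative Braitenberg vehicle 2b control: each motor's PWM duty
# cycle is driven by the free distance measured on the OPPOSITE side,
# so the robot steers toward open space and away from obstacles.
# MAX_DUTY is an assumed full-scale value; MAX_RANGE_M is the LiDAR
# maximum range reported in this study.

MAX_DUTY = 100.0    # PWM duty cycle, percent (assumed scale)
MAX_RANGE_M = 10.5  # LiDAR maximum range from Table 1

def braitenberg_2b(left_dist_m: float, right_dist_m: float):
    """Map (left, right) free distances to (left_pwm, right_pwm),
    cross-coupled: the left sensor drives the right motor and vice versa."""
    left_pwm = MAX_DUTY * min(right_dist_m, MAX_RANGE_M) / MAX_RANGE_M
    right_pwm = MAX_DUTY * min(left_dist_m, MAX_RANGE_M) / MAX_RANGE_M
    return left_pwm, right_pwm

# Obstacle close on the right (0.5 m), clear on the left (5.0 m):
# the right motor runs faster, so the robot veers left, away from it.
left, right = braitenberg_2b(5.0, 0.5)
print(right > left)  # True
```
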

Fig. 06. Ultrasonic Sensor
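The ranging principle behind the sensor in Fig. 06, described in Section B above, reduces to a sound-speed time-of-flight computation. A minimal sketch, with the speed of sound as an assumed constant (not a value given in the paper):

```python
# Illustrative ultrasonic ranging: distance = (speed of sound * ToF) / 2,
# because the echo travels to the obstacle and back.
# 343 m/s (air at roughly 20 degrees C) is an assumed constant.

SPEED_OF_SOUND_M_PER_S = 343.0

def ultrasonic_distance_m(echo_time_s: float) -> float:
    """Distance from the measured time-of-flight of the echo."""
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# An echo returning after about 1.75 ms corresponds to roughly 0.3 m,
# the largest range tested in Table 6.
print(round(ultrasonic_distance_m(0.00175), 3))  # 0.3
```
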


The ultrasonic sensor is connected to the Arduino Uno by wiring its Trig and Echo pins to appropriate digital input/output (I/O) pins on the board. In the Arduino sketch (program), the I/O pins used for Trig and Echo are initialized as an output and an input, respectively. The sensor is triggered by sending a short, high-level pulse to the Trig pin, which causes the sensor to emit a burst of ultrasound waves. After the trigger pulse, the Arduino measures the time it takes for the Echo pin to go high, indicating that the ultrasound waves have been reflected back to the sensor. This time is the time-of-flight (ToF) and is used to calculate the distance to the obstacle using the formula: distance = (speed of sound * ToF) / 2, where the speed of sound is a constant pre-defined in the sketch. The calculated distance can be displayed on the serial monitor or sent to other components, such as a motor controller, to control the robot's behaviour.

4. RESULTS AND DISCUSSION

A. LiDAR computation

Figure 7 shows the robot that was used as a demonstration model in this experiment. The robot was made up of an 11.1 V Lithium Polymer (LiPo) battery pack, a Raspberry Pi 3 acting as the motor controller, a motor driver module, and a step-down buck converter module functioning as a 5 V power supply. The LiDAR is a 360° scanning sensor that measures the distance and angle of objects surrounding the autonomous robot. Table 1 demonstrates that the LiDAR can measure distances from 0.14 metres up to 10.5 metres, with an average error of 0.8%. The accuracy of the LiDAR-collected data was evaluated by conducting measurements with varying ambient light intensity and different object colours. Each experiment used six object colours (orange, purple, red, green, blue, and black) at object distances of 1 m and 2 m. The LiDAR measurements for light intensities of 82 lux and 0.04 lux are shown in Tables 2 and 3, respectively. The results indicate that the object's surface colour and the light intensity have no obvious effect on the precision of the LiDAR readings.

Fig. 07. This experiment utilised a robot platform (1).

Fig. 08. This experiment utilised a robot platform (2).
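The fault percentages reported in Tables 1-3 are consistent with the usual relative-error formula; the following sketch is an assumed reconstruction, inferred from the reported numbers rather than stated in the paper:

```python
# Relative error ("fault %") between the true and measured range,
# as an assumed reconstruction of how Tables 1-3 report accuracy.

def fault_percent(true_m: float, measured_m: float) -> float:
    """Absolute measurement error as a percentage of the true range."""
    return abs(measured_m - true_m) / true_m * 100.0

# E.g. the 7.5 m row of Table 1, where the LiDAR read 7.606 m:
print(round(fault_percent(7.5, 7.606), 2))  # 1.41
```
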
Table 1: LiDAR measurement results.

Range (m) | Read range (m) | Fault (%)
0-0.12    | 0              | -
0.14      | 0.122          | 0.82
0.17      | 0.153          | 0.81
0.22      | 0.211          | 0.84
0.25      | 0.232          | 0.83
0.6       | 0.511          | 1
1.5       | 1.488          | 0.33
2.5       | 2.509          | 0.36
3.5       | 3.517          | 0.48
4.5       | 4.531          | 0.68
5.5       | 5.525          | 0.45
6.5       | 6.561          | 0.93
7.5       | 7.606          | 1.41
8.5       | 8.600          | 1.17
9.5       | 9.679          | 1.88
10.5      | 10.662         | 1.54
11.5      | 0              | -
12.5      | 0              | -
Average   |                | 0.8

Table 2: The measurements made by LiDAR with 82 lux.

Object Colour | Measured range at 1 m (m) | Error (%) | Measured range at 2 m (m) | Error (%)
Orange        | 0.996                     | 0.5       | 2.027                     | 0.9
Purple        | 0.987                     | 1.7       | 2.009                     | 0.06
Red           | 0.981                     | 2.00      | 2.019                     | 0.75
Green         | 0.991                     | 0.7       | 2.008                     | 0.4
Blue          | 0.995                     | 0.8       | 2.002                     | 0.04
Black         | 1.007                     | 0.2       | 2.029                     | 1.78
Average       | 0.982                     | 1.06      | 2.009                     | 0.59

Table 3: The measurements made by LiDAR with 0.04 lux.

Object Colour | Measured range at 1 m (m) | Error (%) | Measured range at 2 m (m) | Error (%)
Orange        | 0.994                     | 0.7       | 2.016                     | 0.75
Purple        | 0.967                     | 0.8       | 1.987                     | 1.2
Red           | 0.979                     | 1.8       | 1.987                     | 0.4
Green         | 0.989                     | 0.8       | 1.998                     | 0.8
Blue          | 0.988                     | 0.4       | 1.988                     | 0.6
Black         | 1.001                     | 0.2       | 2.021                     | 0.8
Average       | 0.989                     | 0.70      | 1.989                     | 0.72


B. Mobile robot obstacle avoidance

In the first trial, the mobile robot avoided a variety of coloured obstacles at distances of 30 cm and 60 cm. In the second experiment, the autonomous robot performed obstacle avoidance at distances of 30 cm and 60 cm; Table 4 demonstrates that the mobile robot was 100% successful at avoiding obstacles of different widths. The third experiment used obstacles of various colours and sizes at distances between 30 cm and 60 cm. As shown in Table 5, the autonomous robot can avoid coloured items of various sizes, including a corrugated carton box, a plastic jerry can, and a transparent glass bottle. However, transparent objects such as acrylic and polycarbonate bottles could not be avoided by the robot; this may be because these items do not reflect the laser pulses that the LiDAR emits back to the sensor. The overall experiment involved navigating the autonomous mobile robot both with and without obstacles inside a room.

Table 04: Different widths of obstacles are avoided by the mobile robot.

Object width (cm) | Clash, objects 30 cm apart (yes/no) | Clash, objects 60 cm apart (yes/no)
6                 | no                                  | no
11                | no                                  | no
16                | no                                  | no
18                | no                                  | no
24                | no                                  | no
35                | no                                  | no
Total (%)         | yes: 0, no: 100                     | yes: 0, no: 100

Table 05: Avoiding numerous objects with an autonomous robot.

Object                   | Clash at 30 cm (yes/no) | Clash at 60 cm (yes/no)
Polycarbonate bottle     | yes                     | yes
Acrylic                  | yes                     | yes
Corrugated carton box    | no                      | no
Plastic jerry cans       | no                      | no
Transparent glass bottle | no                      | no

Tables 6 and 7 show the different parameters and object detection results when the ultrasonic distance sensor is mounted on the automotive obstacle-avoiding robot system.

Table 06: Ultrasonic sensor examination results.

Measured range | Range from sensor
0 cm           | 0 cm
7 cm           | 7 cm
14 cm          | 14 cm
22 cm          | 22 cm
30 cm          | 30 cm

Fig. 09: Representation of the test's methodology.

Table 07: Object Dimension Test.

Object dimension | Able to detect object
2x5 cm           | √
3x5 cm           | √
4x5 cm           | √
5x5 cm           | √
15x16 cm         | √

5. CONCLUSION

In conclusion, obstacle avoidance in robot cars can benefit from both LIDAR and ultrasonic sensors, but each has its own advantages and limitations. While offering 360-degree coverage, long-range detection, and high precision, LIDAR is rather expensive. Ultrasonic sensors are less expensive and smaller in size, but their range is constrained and they are sensitive to outside conditions such as humidity and temperature. Either a LIDAR or an ultrasonic sensor can be selected depending on the needs and limitations of the application. The performance and robustness of the obstacle avoidance system can also be increased by combining the data from the two sensors.

6. REFERENCES

[1] N. A. Rahim, M. A. Markom, A. H. Adorn, S. A. A. Shukor, A. Y. M. Shakaff, E. S. M. M. Tan, "A mapping mobile robot using RP Lidar scanner", 2016, pp. 87-92.
[2] M. D. Adams, "Lidar design, use, and calibration concepts for correct environmental detection", IEEE Trans. Robot. Autom., vol. 16, no. 6, 2000, pp. 753-761.
[3] M. M. Atia, S. Liu, H. Nematallah, T. B. Karamat, A. Noureldin, "Integrated indoor navigation system for ground vehicles with automatic 3-D alignment and position initialization", IEEE Trans. Veh. Technol., vol. 64, no. 4, 2015, pp. 1279-1292.
[4] J. Liu, Q. Sun, Z. Fan, "TOF Lidar Development in Autonomous Vehicle", IEEE 3rd Optoelectron. Glob. Conf., 2018, pp. 185-190.
[5] X. Yang, R. V. Patel, M. Moallem, "A fuzzy-braitenberg navigation strategy for differential drive mobile robots", J. Intell. Robot. Syst. Theory Appl., vol. 47, no. 2, 2006, pp. 101-124.
[6] H. Widyantara, M. Rivai, D. Purwanto, "Wind direction sensor based on thermal anemometer for olfactory mobile robot", Indonesian Journal of Electrical Engineering and Computer Science, vol. 13, no. 2, 2019, pp. 475-484.
[7] M. Rivai, Rendyansyah, D. Purwanto, "Implementation of fuzzy logic control in robot arm for searching location of gas leak", International Seminar on Intelligent Technology and Its Application, 2015, pp. 69-74.
[8] G. A. Rahardi, M. Rivai, D. Purwanto, "Implementation of hot-wire anemometer on olfactory mobile robot to localize gas source", International Conference on Information and Communications Technology, 2018, pp. 412-417.
[9] M. Bengel, K. Pfeiffer, B. Graf, A. Bubeck, A. Veri, "Mobile robots for offshore inspection and manipulation", IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2009, pp. 3317-3322.
[10] J. Wade, T. Bhattacharjee, R. D. Williams, C. C. Kemp, "A force and thermal sensing skin for robots in human environments", Rob. Auton. Syst., 2017.

Acknowledgement
This experimentation was carried out with the assistance of financial grants from Professor Dr. Ezhilarasan Ganesan and Associate Professor & HOD Dr. P Pradeepa, Jain University, Bangalore, Karnataka, India. Achinta Brata Roy Tonmoy, MD Sarwar Zinan, Selim Sultan and Abir Sarker contributed equally to this work.
