
DESIGN AND DEVELOPMENT OF VOICE CONTROLLED ROBOTIC

VEHICLE

DEPARTMENT OF COMPUTER SCIENCE

DADABHOY INSTITUTE OF HIGHER EDUCATION

DESIGN & DEVELOPMENT OF VOICE CONTROLLED ROBOTIC
VEHICLE
(FINAL YEAR PROJECT REPORT)

GROUP 12 BATCH 2017

STUDENT NAME ID NO

FARHAN ALI BS TECH M3-17/ML003

GHAZANFAR AHMAD KHAN BS TECH M3-17/ML001

WAQAR AHMAD MALIK BS TECH M3-17/ML002

INTERNAL ADVISOR ENG WAQAR HYDER

CERTIFICATE

This is to certify that the following students have satisfactorily completed their project "Design and
Development of Voice Controlled Vehicle", and that this report has been checked for plagiarism.

GROUP 12 BATCH 2017

STUDENT NAME ID NO

FARHAN ALI BS TECH M3-17/ML003

GHAZANFAR AHMAD KHAN BS TECH M3-17/ML001

WAQAR AHMAD MALIK BS TECH M3-17/ML002

INTERNAL ADVISOR PROJECT COORDINATOR


ENG WAQAR HYDER ENG WAQAR HYDER
ASSISTANT PROFESSOR ASSISTANT PROFESSOR
DEPT OF COMPUTER SCIENCE DEPT OF COMPUTER SCIENCE
DIHE DIHE

DEPARTMENT OF COMPUTER SCIENCE


DADABHOY INSTITUTE OF HIGHER EDUCATION

DEDICATION

"The scientist discovers a new type of material or energy and the engineer discovers a new use
for it."
-Gordon Lindsley Glegg
For their infinite love and support at every stage of life, we dedicate this project to our
beloved families and friends, in particular to our parents. It is dedicated to all members of
the group who answered the call in the middle of the day or night, who responded from near and
far, and who expected no private gain for responding to a call for assistance. We also dedicate
this project to our internal advisor Eng. Waqar Hyder, without whose continuous assistance and
guidance it would not have been feasible.
Thank You, Sir.

ACKNOWLEDGEMENT

Firstly, we would like to thank our project advisor Eng. Waqar Hyder for his contribution to
our project; it would not have been possible without his guidance, strict project protocol and
motivation. Our team members were greatly influenced by his example, which gave us the
determination required to complete the project.

We would also like to thank the Chairman of the Computer Science Department for giving us the
opportunity to participate in the Computer Science Project and for providing all the lab facilities
for testing, instrumentation and prototyping without any restrictions.

We would also like to thank all the teachers of the Computer Science department for their support
and guidance on problems that were outside our field, and for providing us with the best
possible solutions.

ABSTRACT

Voice-activated home automation is a highly advantageous project for elderly and physically challenged
persons who are unable to perform various duties comfortably at home and want assistance.
Our project eliminates the wiring difficulties of wired automation. Home automation
can save a significant amount of energy and is adaptable and compatible with future technologies,
allowing it to be easily adjusted for specific needs. In today's generation everything, including home
appliances and other electrical gadgets, is being automated. Secure entry to workplaces, malls, and
businesses is also possible with this system.

The home automation sector has exploded in recent years, fueled by the drive to create
supporting systems that make our lives easier. Automation systems are expected to be deployed in the
existing home environment, with no changes to the infrastructure. The automation is controlled by a
microcontroller and is based on the recognition of spoken commands. The overall design of 'Voice
Controlled Home Automation' is presented in this FYP report, which is divided into a series of chapters
addressed in the progression of the report.

CONTENTS
1 INTRODUCTION............................................................................................................................1
1.1 MOTIVATION.................................................................................................................................1
1.2 OBJECTIVES...................................................................................................................................2
1.3 SCOPE..............................................................................................................................................2
1.4 METHODOLOGIES AND STRATEGIES.....................................................................................3
1.5 USE OF VOICE CONTROLLED ROBOTIC SYSTEMS..............................................................3
1.6 DESIGN AND PROTOTYPE MODEL..........................................................................................4
2 LITERATURE REVIEW.................................................................................................................6
2.1 SPEECH RECOGNITION TECHNOLOGY...................................................................................6
2.2 EMBEDDED SYSTEMS.................................................................................................................7
2.3 AUTOMATION...............................................................................................................................9
2.4 A BRIEF HISTORY:.....................................................................................................................11
3 COMPONENTS USED IN FYP....................................................................................................12
3.1 HARDWARE COMPONENTS USED.........................................................................................12
3.1.1 MICROCONTROLLER (ARDUINO UNO)............................................................................12
3.1.2 MOTORS (HIGH TORQUE 12V, DC)....................................................................................13
3.1.3 ROBOT WHEELS (2 BY 30 MM)...........................................................................................15
3.1.4 4WD ROBOT CHASSIS KIT...................................................................................................16
3.1.5 L293D........................................................................................................................................18
3.1.6 L298N........................................................................................................................................20
3.1.7 JUMPER WIRES......................................................................................................................20
3.1.8 BLUETOOTH MODULE HC-05.............................................................................................21
3.2 SOFTWARE COMPONENTS......................................................................................23
3.2.1 PROTEUS SIMULATION SOFTWARE.................................................................................23
3.2.2 ARDUINO SOFTWARE IDE...............................................................................................24
3.2.3 VISUAL STUDIO CODE.........................................................................................................25
4 RESULTS AND CHALLENGES OF FYP...................................................................................26
4.1 RESULTS:......................................................................................................................................26
4.2 CHALLENGES INVOLVED........................................................................................................27
5 CONCLUSION AND FUTURE SCOPE.......................................................................................27
5.1 CONCLUSION..............................................................................................................................28
5.1.1 RECOMMENDATIONS AND FUTURE SCOPE..................................................................28
6 BIBLIOGRAPHY..........................................................................................................................29

TABLE OF FIGURES

Figure 1.1-1 Voice Controlled Robot Vehicle...........................................................................................................1


Figure 1.6-1 Design of Prototype Model...................................................................................................................4
Figure 2.1-1 Speech Recognition Phenomenon........................................................................................6
Figure 2.2-1 Embedded Systems which consists of multiple subcomponents............................................................7
Figure 2.3-1 Automation and its main components...................................................................................................9
Figure 3.1-1 Arduino Uno Microcontroller..............................................................................................................13
Figure 3.1-2 12 V DC Motor...................................................................................................................................14
Figure 3.1-3 Robot Wheels......................................................................................................................................16
Figure 3.1-4 Robot Chassis Kit................................................................................................................................18
Figure 3.1-5 Block Diagram of L293D....................................................................................................................19
Figure 3.1-6 Block Diagram L298N........................................................................................................................20
Figure 3.1-7 HC 05 Bluetooth Module....................................................................................................................21
Figure 3.1-8 Circuit connection for HC-05 and Arduino.........................................................................................22
Figure 3.1-9 Flow of Data........................................................................................................................................22
Figure 3.2-1 Proteus Interface..................................................................................................................................24
Figure 3.2-2 Arduino IDE........................................................................................................................................25
Figure 3.2-3 Visual Studio Code Interface...............................................................................................................25

1 INTRODUCTION

Figure 1.1-1 Voice Controlled Robot Vehicle

1.1 MOTIVATION
The primary goal is to limit the cost involved in the project and its power consumption while
meeting the needs of the day. Our objective extends to using our proposal to provide efficient
access to, and control of, everyday objects. Nowadays smartphones are becoming ever more
powerful, with faster processors, larger storage capacities, richer entertainment functions and
more methods of communication.
Bluetooth is primarily used for data exchange and adds new features to smartphones. Bluetooth
technology, created by the telecom vendor Ericsson in 1994, shows its benefits when integrated
with smartphones. People use digital technology at home or in the office, and it has turned
traditional wired digital devices into wireless ones.
A host Bluetooth device is capable of communicating with up to seven Bluetooth modules at the
same time over a single link. Thanks to Bluetooth technology and other similar techniques, and
with the dramatic increase in smartphone users, mobile phones have gradually turned into
general-purpose portable devices for people's everyday use.
In recent years the open-source Android platform has been widely used in smartphones. Android
is a complete software package comprising an operating system, a middleware layer and core
applications. Unlike some other existing platforms such as iOS, it comes with a software
development kit (SDK) which provides the fundamental tools and applications. Using a
smartphone as the brain of a robot is already an active research field. During our FYP we came
to understand the importance of voice recognition.

1.2 OBJECTIVES
The objective of the project was to study the applications of voice recognition for moving our
robot according to our requirements, and to make sure it can follow specific commands. The
receiving side converts the recognized speech into text that the Arduino code interprets to
drive the motors.

1.3 SCOPE
The following points highlight the scope of the project:
 Because a fraction of the population has visual and other disabilities, voice recognition
vehicles can play a significant role in the market in the future.

 Recent developments in the field of artificial intelligence and self-correcting
algorithms have enabled us to produce software bots which can achieve highly
accurate results through self-correction and machine learning.

 Security is enhanced, as only individuals with specific voices are able to use
these devices, which prevents misuse of valuable products and devices.

 Gives the user a greater degree of freedom and control over the device, which promotes
user-friendliness, simple usage of products and an enhanced user experience.

1.4 METHODOLOGIES AND STRATEGIES
The following methodology is used to develop the voice recognition robot:
A robot is built which can be controlled using specific voice commands.
Speech-to-text functionality is used to convert voice commands to text, which is then sent to
the Arduino through Bluetooth communication.
The voice commands are perceived using an Android application which converts speech to
text.
This text is in the form of a string.
It is then sent to the Arduino via the Bluetooth module.
We accept the string character by character from the serial buffer sent by the app and combine
the characters to form a string.
The code then compares the string to each known command; if it matches, the command is carried
out. For example, if the string received is "forward", the robot moves forward.
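The character-accumulation and matching steps above can be sketched in plain C++. This is an illustrative sketch, not the actual firmware: the '#' end-of-command marker and the command set are assumptions, and on the real Arduino the characters would arrive from the Bluetooth serial port rather than a C++ stream.

```cpp
#include <istream>
#include <sstream>
#include <string>

// Motions the robot can perform; unknown commands map to Stop.
enum class Motion { Stop, Forward, Backward, Left, Right };

// Match a completed command string against the known commands.
Motion interpret(const std::string& command) {
    if (command == "forward")  return Motion::Forward;
    if (command == "backward") return Motion::Backward;
    if (command == "left")     return Motion::Left;
    if (command == "right")    return Motion::Right;
    return Motion::Stop;  // unrecognized input: stop for safety
}

// Accumulate characters one at a time, as they would arrive from the
// serial buffer, until the (assumed) '#' terminator, then interpret.
Motion readAndInterpret(std::istream& serial) {
    std::string buffer;
    char c;
    while (serial.get(c) && c != '#')
        buffer += c;
    return interpret(buffer);
}
```

On the robot itself the same logic would loop over the serial input inside the Arduino sketch and then drive the motor driver pins according to the matched command.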

1.5 USE OF VOICE CONTROLLED ROBOTIC SYSTEMS


Some of the important uses of voice controlled robotic systems are mentioned below:

1) Indoor assistive robots that use speech instructions to navigate around, pick up objects from
one location and place them in another.
2) Surveillance applications that broadcast live camera feeds, which is useful for tracking down
items easily and with high accuracy.
3) Industrial robots that can be controlled easily without complex control mechanisms, which
promotes easy industrial operations.
4) Automobiles with onboard digital assistants that can help improve the user experience and
support car navigation.

1.6 DESIGN AND PROTOTYPE MODEL
The robot we are designing is simple in nature and gives a general idea of how automated
voice control systems work. The list of components used in the project is covered in
detail in chapter 3.
The prototype design of our voice recognition robot, which uses an AAA battery pack, is shown as
follows:

Figure 1.6-1 Design of Prototype Model

Figure 1.6-2 Side view of the robot vehicle prototype

Figure 1.6-3 Top view of the prototype

Figure 1.6-4 Circuit composition of the prototype

2 LITERATURE REVIEW

2.1 SPEECH RECOGNITION TECHNOLOGY


Speech recognition technology is used to capture spoken words with a microphone and convert them
into a digitally stored set of words. The quality of a speech recognition system is analyzed with two
factors: accuracy and speed.

Figure 2.1-1 Speech Recognition Phenomenon

A speech recognition system has many applications. The software is commonly used in hands-free
computing, automatic translation, robotics, automated customer service, etc., though it has its own
weaknesses and nagging problems. In our system, human voice commands are given to the robotic
assistant remotely using a smart mobile phone, and the recognition can be carried out on an online
cloud server.

The speech commands, converted to text form, are communicated to the robot over a Bluetooth
network. The effectiveness of voice control communicated over a distance is measured through
several experiments; the performance evaluation of the initial experiments yielded encouraging
results.

Earlier robots were developed using the ZigBee protocol, which was costly [1]. Another approach,
building the robot around a sound card and a microphone, is not user friendly. Here, a technique is
presented for giving voice commands from an Android-based smartphone over Bluetooth to a robot
built on a microcontroller.

The robot can accept instructions from users verbally and interact with the user by speaking various
sentences, so communication takes place in a user-friendly manner [2]. A gap found in earlier
approaches is that every person has their own accent, which makes the voice difficult for the robot to
understand; maintenance of such a system is very difficult; and the range of Bluetooth technology is
small, up to about 10 meters.

So, there is a need to develop a user-friendly robot that uses less power and covers a greater range.

2.2 EMBEDDED SYSTEMS
The role of computing devices embedded into everyday objects has grown tremendously over the last
two decades. To give an example, a typical car produced at the beginning of the 1990s was largely a
mechanical unit. Today, a large part of the development costs in a typical leading-edge car
manufacturing company is related to software development.
The unprecedented complexity of existing software systems is paralleled by an analogous development
within the hardware technology. Hardware is being developed faster, while it is cheaper and more
powerful than ever before. Of course, at the same time hardware devices are becoming ever more
complex and heterogeneous. The rapid growth and success of communication technology is the third
constituent of what many consider the next technological leap facing human society, namely the
emergence of interconnected intelligent things or embedded devices, capable of communicating both
with each other and with humans, sensing, taking decisions and acting on these decisions.

Figure 2.2-1 Embedded Systems which consists of multiple subcomponents

In fact, numerous applications of such interconnected things are already starting to reach the market, for
example home care surveillance devices, disaster warning systems, smart energy grids, intelligent
buildings, autonomic vehicle convoys, traffic prediction systems, smart automation, etc. However, the
rapid growth in software, hardware and communication technologies is not only an enabler but also a
grand challenge for the future interconnected systems.

The foreseeable complexity of such systems, together with their inevitable criticality for human
well-being in many applications, poses a large number of challenging questions that cut across
several research disciplines. In our work, we aim to address some of these questions, with a focus
on federations of embedded systems.

A federated embedded system (FES) is defined as a constellation of devices that are part of and control
different products, and that exchange data with each other and with external servers to the benefit of all,
in such a way that no individual device is in control over the others. Note that in many cases this
implicitly means that the constituent devices are produced by different manufacturers using different
platforms, standards, etc. Further, FES need not have a static structure, but are established, reestablished
and extended over time, and a particular device can be part of several FES at different times or
simultaneously.
Naturally, it must be profitable and secure for an embedded device to participate in a federation. In other
words, the efficiency and/or the possibilities of a device should be enhanced, while certain quality
attributes of FES, such as performance, safety, privacy and robustness should be guaranteed. To meet
these concerns, development is needed within such fields as programming technology, software and
hardware architecture, software development methods and tools, business structures, communication
protocols, data management and human-machine interaction, to mention a few.

The enormous potential for various aspects of human life that the vision of interconnected smart objects
offer, together with its significant technical challenges, has attracted the interest of researchers within
different disciplines.

2.3 AUTOMATION
Automation describes a wide range of technologies that reduce human intervention in processes. Human
intervention is reduced by predetermining decision criteria, subprocess relationships, and related actions — and
embodying those predeterminations in machines.[1]

Automation,[2] includes the use of various equipment and control systems such as machinery, processes in
factories, boilers,[3] and heat-treating ovens, switching on telephone networks, steering, and stabilization of ships,
aircraft, and other applications and vehicles with reduced human intervention.[4]

Automation covers applications ranging from a household thermostat controlling a boiler, to a large industrial
control system with tens of thousands of input measurements and output control signals. Automation has also
found space in the banking sector. In control complexity, it can range from simple on-off control to multi-variable
high-level algorithms.

Figure 2.3-1 Automation and its main components

In the simplest type of an automatic control loop, a controller compares a measured value of a process with a
desired set value and processes the resulting error signal to change some input to the process, in such a way that
the process stays at its set point despite disturbances. This closed-loop control is an application of negative
feedback to a system. The mathematical basis of control theory was begun in the 18th century and advanced
rapidly in the 20th.
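The closed-loop idea described above can be illustrated with a minimal bang-bang (on-off) controller of the kind found in a household thermostat. This is a sketch under stated assumptions: the setpoint, deadband and switching rule are illustrative, not taken from any particular product.

```cpp
// A bang-bang controller: compare the measured value of the process
// with the desired set value and switch the actuator based on the
// sign of the resulting error signal.
struct Thermostat {
    double setpoint;        // desired temperature (deg C)
    double deadband = 0.5;  // hysteresis to avoid rapid switching
    bool heaterOn = false;

    void update(double measured) {
        double error = setpoint - measured;  // the error signal
        if (error > deadband)       heaterOn = true;   // too cold
        else if (error < -deadband) heaterOn = false;  // too warm
        // within the deadband: keep the previous state
    }
};
```

Negative feedback shows up in the sign convention: a temperature above the setpoint produces a negative error, which switches the heater off and pushes the process back toward its set point despite disturbances.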

Automation has been achieved by various means including mechanical, hydraulic, pneumatic, electrical, electronic
devices, and computers, usually in combination. Complicated systems, such as modern factories, airplanes, and
ships typically use all these combined techniques. The benefit of automation includes labor savings, reducing
waste, savings in electricity costs, savings in material costs, and improvements to quality, accuracy, and precision.

The World Bank's World Development Report 2019 shows evidence that the new industries and jobs in the
technology sector outweigh the economic effects of workers being displaced by automation.[5]

Job losses and downward mobility blamed on automation have been cited as one of many factors in the
resurgence of nationalist, protectionist and populist politics in the US, UK and France, among other
countries, since the 2010s.[6][7][8][9][10]

The term automation, inspired by the earlier word automatic (coming from automaton), was not widely used
before 1947, when Ford established an automation department.[2] It was during this time that industry was
rapidly adopting feedback controllers, which were introduced in the 1930s.[11]

2.4 A BRIEF HISTORY:

Voice recognition technology has become a part of our everyday lives, and with the increasing popularity
of home assistant devices, it is more familiar than ever. Predictions indicate that voice technology will
soon be commonplace in our workplaces, too.

Of course, voice recognition technology is nothing new. There have been key developments throughout
the decades as voice recognition evolved into today's recognizable iteration.
In 1939, the Voder was demonstrated at the World's Fair in New York City. It was developed to
synthesise human speech by imitating the effect of the human vocal cords, operated by selecting one of
two basic sounds via a pedal bar.

1952 saw Bell Labs design Audrey, capable of understanding a small selection of spoken digits:
"Audrey could recognise the sound of a spoken digit – zero to nine – with more than 90% accuracy."

IBM demonstrated the Shoebox at the 1962 Seattle World's Fair. The Shoebox could understand up to 16
spoken words in English. It was operated by speaking into a microphone, which converted sounds into
electrical impulses.
In 1976, after five years of DARPA-funded research, Harpy was developed at Carnegie Mellon; this
technology was able to understand 1,011 words.

By the early 1980s, voice recognition began making great leaps towards greater viability. A technique
called the Hidden Markov Model was used, allowing voice recognition machines to identify speech more
accurately. Around this time, IBM began work on Tangora, a technology able to identify 20,000 spoken
words.

In the mid 1980s, voice recognition made its way to children's toys, with Teddy Ruxpin (1985), Pamela
the Living Doll (1986), Talking Mickey Mouse (1986), and many more bringing speech recognition into
our homes.

By 1990, speech recognition had reached the workplace with Dragon Dictate for PCs. The 90s trend
for speech recognition at work continued: Apple launched Speakable Items in 1993, built-in speech
control software for their computers. 1993 also saw the introduction of Sphinx-II, the first
large-vocabulary continuous speech recognition system.

A few years later, in 1996, IBM launched MedSpeak – the first commercial product capable of
recognising continuous speech.

“In 10 years I believe that we’ll not only be using the keyboard and the mouse to interact, but during that
time we will have perfected speech recognition and speech output well enough that those will become a
standard part of the interface.” Bill Gates speaking in 1997

In 2002, Microsoft integrated speech recognition technology into their Office products, while 2006 saw
the NSA using speech recognition to isolate keywords when analysing recorded conversations. Google
launched GOOG-411 in 2007, a telephone directory service that paved the way for their other voice
recognition products.

In the last few years, voice recognition software has been enriched by the power of machine learning,
developing an 'intelligence' that was previously unprecedented. In 2008, Google launched the Voice
Search app for the iPhone, while Siri was introduced to the world in 2011, giving consumers their very
own digital personal assistant. This marked a change for mobile tech companies, as voice recognition
enabled users to control their devices more efficiently than ever before.
Between 2014 and 2017, the race to the top for voice recognition products intensified. Microsoft
introduced Cortana and Amazon gave us the Echo, a voice-controlled speaker powered by Alexa. In
early 2018, Condeco demonstrated Alexa integration with Condeco room booking technology at the
Workplace Innovation Forum, bringing the power of voice integration technology to the modern-day
office.

Although it may seem that voice recognition and control is a new technology, it has been in the works
since the middle of the 20th century. Only in the last five to eight years has voice recognition technology
gained mass appeal. However, it goes without saying that voice recognition has traveled a long road
before it reached where it is today.

3 COMPONENTS USED IN FYP

The main hardware and software components used in the project are described in detail below:
3.1 HARDWARE COMPONENTS USED
 Arduino UNO
 Motors (High Torque 12 V DC)
 Wheels
 4WD Robot Chassis Kit
 L293D/L298N
 Jumper wires
 Bluetooth Module HC-05
3.1.1 MICROCONTROLLER (ARDUINO UNO)
The Arduino Uno board is one of the basic boards used for programming embedded systems.
The Arduino Uno is a board based on the ATmega328P microcontroller (see its datasheet). It has 14
digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz quartz
crystal, a USB connection, a power jack, an ICSP header, and a reset button. It includes
everything needed to support the microcontroller; simply connect it to a computer with a USB cable,
or power it with an AC-to-DC adapter or a battery, to get started. [17]

Figure 3.1-1 Arduino Uno Microcontroller

3.1.2 MOTORS (HIGH TORQUE 12V, DC)


A direct current (DC) motor is a type of electric machine that converts electrical energy into mechanical energy.
DC motors take electrical power through direct current, and convert this energy into mechanical rotation.

DC motors use magnetic fields produced by the electrical currents, which power the movement of a
rotor fixed to the output shaft. The output torque and speed depend upon both the electrical input and
the design of the motor.
These motors are also equipped with a spur gear drivetrain, providing an optimal amount of torque to the
wheels. The gears are made with extreme care in order to reduce backlash, which is usually produced
during torque transfer.

The use of spur gears also ensures quick, instantaneous transfer of power and torque while bearing
extreme fatigue during the process.

Usually, a 5:1 to 10:1 gear ratio is used in this torque transmission, as it can handle that amount of
torque and provides additional safety during handling.
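As a worked illustration of the reduction described above (the motor torque of 0.05 N·m and the 90% gear efficiency are illustrative assumptions, not measured values): a gearbox with ratio r multiplies torque and divides speed,

```latex
T_{\text{out}} = \eta \, r \, T_{\text{in}},
\qquad
\omega_{\text{out}} = \frac{\omega_{\text{in}}}{r}
```

so a motor producing 0.05 N·m through a 10:1 reduction at 90% efficiency delivers about 0.9 × 10 × 0.05 = 0.45 N·m at the wheels, at one tenth of the motor speed.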

Figure 3.1-2 12 V DC Motor

The term ‘DC motor’ is used to refer to any rotary electrical machine that converts direct current electrical energy
into mechanical energy. DC motors can vary in size and power from small motors in toys and appliances to large
mechanisms that power vehicles, pull elevators and hoists, and drive steel rolling mills. But how do DC motors
work?

DC motors include two key components: a stator and an armature. The stator is the stationary part of
the motor, while the armature rotates. In a DC motor, the stator provides a magnetic field that drives
the armature to rotate.

A simple DC motor uses a stationary set of magnets in the stator, and a coil of wire with a current running through
it to generate an electromagnetic field aligned with the centre of the coil. One or more windings of insulated wire
are wrapped around the core of the motor to concentrate the magnetic field.
The windings of insulated wire are connected to a commutator (a rotary electrical switch) that applies
an electrical current to the windings. The commutator allows each armature coil to be energised in turn,
creating a steady rotating force (known as torque).

When the coils are turned on and off in sequence, a rotating magnetic field is created that interacts
with the fields of the stationary magnets in the stator to create torque, which causes the armature to
rotate. These key operating principles allow DC motors to convert electrical energy from direct current
into mechanical energy through rotating movement, which can then be used to propel objects.
The characteristics of a DC motor are the relations (or graphs) between different parameters such as
armature torque, armature current and speed of the motor.
There are three characteristics of DC motors:
 Torque-speed characteristics
 Torque-current characteristics
 Current-speed characteristics
These are explained step by step for each type of DC motor. The graphs are determined by keeping two
equations in mind: first the back-emf equation and second the torque equation. For a DC motor, the
magnitude of the back emf is given by the same expression as the emf equation of a DC generator.
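For reference, the two equations mentioned above take their standard forms (with P poles, Z armature conductors, A parallel paths, flux per pole φ in webers, speed N in rpm, and armature current I_a):

```latex
E_b = \frac{P \, \phi \, Z \, N}{60 \, A},
\qquad
T_a = \frac{P \, \phi \, Z \, I_a}{2 \pi A}
```

Eliminating the machine constants between them gives the proportionalities N ∝ E_b / φ and T_a ∝ φ I_a, from which the three characteristic curves are plotted.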

3.1.3 ROBOT WHEELS (2 BY 30 MM)


These are standard wheels made of durable rubber that provide easy mobility for the robot.
They are fitted with high-aspect-ratio rubber tyres, which give enough traction to propel the vehicle properly across the ground.
Since this prototype is not equipped with any kind of suspension, the tyres also provide enough damping to cushion the vehicle against shocks and vibration, including when the vehicle drops from a small height.
The wheels are usually manufactured by a process called injection moulding, in which molten material is injected into a mould cavity of the desired shape to obtain the desired part. The part must have the required appearance as well as the characteristic properties expected of it. Great care is taken during this manufacturing process, as it requires the operator's focus and attention to produce parts of this kind.
The first stage of injection moulding is to create the mould itself. Most moulds are made from metal,
usually aluminium or steel, and precision machined to match the features of the product they are to
produce.

Once the mould has been created by the mould-maker, the material for the part is fed into a heated barrel and mixed using a helical screw. Heater bands melt the material in the barrel, and the molten plastic (or metal) is then fed into the mould cavity, where it cools and hardens into the shape of the mould. The cooling time can be reduced with cooling lines that circulate water or oil from an external temperature controller. Mould tools are mounted on plates (or 'platens'), which open once the material has solidified so that ejector pins can eject the part from the mould.

Separate materials can be combined in one part in a type of injection moulding called a two-shot mould.
This technique can be used to add a soft touch to plastic products, add colours to a part or produce items
with different performance characteristics.

Moulds can have a single cavity or multiple cavities. Multiple-cavity moulds can produce identical parts in each cavity, or the cavities can be unique to create parts of different geometries. Aluminium moulds are not well suited to high-volume production or parts with narrow dimensional tolerances, since they have inferior mechanical properties and can be prone to wear, deformation and damage from the injection and clamping forces. While steel moulds are more durable, they are also more expensive than aluminium moulds.

Figure 3.1- 11 Robot Wheels

3.1.4 4WD ROBOT CHASSIS KIT


The 4WD robot chassis kit is an easy-to-assemble and easy-to-use robot chassis platform. It provides ample room for expansion to add various sensors and controllers; an Arduino or Raspberry Pi plus a motor driver can jump-start your programming project easily. The chassis has pre-drilled holes for easy installation of sensors and electronics as per your requirements.

The chassis kit provides more weight support and mobility than the 2WD wheel kit and supports four DC motors for additional power.
The powertrain layout used in this prototype is called 4WD, a common layout for vehicles used in off-road conditions as well as on snow, dirt, mud and wet paved roads.
The powertrain is everything that makes a vehicle move, including the engine and the drivetrain, while the drivetrain is everything that makes the wheels move except the engine.

There are three common types of drivetrain arrangements: rear-wheel drivetrains, front-wheel drivetrains, and
four-wheel/all-wheel drivetrains.

Although 4WD and AWD are different, they both transfer power to your front and back wheels, which can be
beneficial during muddy, snowy, rocky, and other difficult driving conditions.

You typically see 4WD systems on larger vehicles that are designed with all-terrain abilities, such as trucks, SUVs, and off-road vehicles.
When 4WD is engaged, the engine sends power to the transmission, which is then split between the front and rear axles. The torque is transferred to the wheels, but the wheels must have traction on the road for the vehicle to move anywhere; otherwise the tires will merely spin, as you have probably experienced when stuck in mud or sand.

Let’s say that you get your rear wheels stuck in mud. If you have two-wheel drive (2WD), then your wheels will
probably spin and spin. In this case, it might be extremely useful to have four-wheel drive so that your front
wheels could get some traction on the road. If power was transferred to the front wheels, where the traction is,
you’ be able to successfully get your car out of a sticky situation.

This is essentially what four-wheel drive does. It gives you traction where and when you need it. Although 4WD is
a bit more complicated than that, it’s essentially a way to increase traction and power on the road.

Most of the time, all you need is 2WD. 2WD is used for regular road driving. When you need extra power and
traction (deep mud, soft sand, ruts, steep inclines and declines, rocky surfaces, etc), you can engage 4WD by
pressing a button. The process for engaging 4WD, however, depends on your vehicle.
The main benefits of 4WD are traction and power. Have you ever seen those commercials where the Jeep is
climbing over boulders and rocks? That’s 4WD in action.

If you are climbing a steep hill or are off-roading, you will want increased power in order to get over obstacles and
climb steep hills. While 2WD will get you over even the steepest hills of San Francisco, if you are off-roading you
will probably want the extra power that comes with 4WD.

4WD improves traction in dangerous driving conditions, such as snow, ice, rocks, and other scenarios that can make control difficult. By engaging both sets of wheels, traction and control improve.
The additional weight also contributes to better grip on the road.
4WD is great for those who like off-roading.
If you frequently drive in conditions where there is low traction, or if you enjoy off-roading, you will greatly
benefit from four-wheel drive.
In most cases, 4WD is not necessary. It uses more fuel and can also lead to overconfidence, leading to more
situations where you can get stuck. Save money and fuel by only using 4WD when you need it.

The main disadvantage of 4WD is added cost for purchase, maintenance, and fuel. The extra equipment
(differentials, transfer case, etc.) adds complexity and weight to the vehicle, increasing initial market value, tire
wear, and the cost of repairs and maintenance.
The added power and weight of 4WD and AWD systems require more fuel, making them less efficient than their
2WD counterparts.
Added weight improves traction and control, but it also increases the braking distance required to come to a complete stop. Lighter vehicles can avoid collisions more easily than heavier vehicles.
4WD and AWD can cause overconfidence in drivers, ironically leading to more situations where you can become
stuck.
Although 4WD improves traction, slow down and use extreme caution on icy, snowy, and slick roads.
Overconfidence can lead to dangerous accidents.

As discussed above, the advantages of the 4WD layout outweigh its disadvantages, so it is useful for our project and can also be carried over to future prototypes.

Figure 3.1- 12 Robot Chassis Kit

3.1.5 L293D
The L293D is a monolithic integrated high-voltage, high-current four-channel driver designed to accept standard DTL or TTL logic levels and drive inductive loads (such as relays, solenoids, DC and stepping motors) and switching power transistors.
To simplify use as two bridges, each pair of channels is equipped with an enable input. A separate supply input is provided for the logic, allowing operation at a lower voltage, and internal clamp diodes are included.
The device is suitable for use in switching applications at frequencies up to 5 kHz.
The L293D is assembled in a 16-lead plastic package which has 4 centre pins connected together and used for heatsinking. The L293DD is assembled in a 20-lead surface-mount package which has 8 centre pins connected together and used for heatsinking.

Figure 3.1- 13 Block Diagram of L293D

3.1.6 L298N
The L298 is a monolithic integrated circuit in 15-lead Multiwatt and PowerSO20 packages. It is a high-voltage, high-current dual full-bridge driver designed to accept standard TTL logic levels and drive inductive loads such as relays, solenoids, DC and stepping motors. Two enable inputs are provided to enable or disable the device independently of the input signals. The emitters of the lower transistors of each bridge are connected together, and the corresponding external terminal can be used to connect an external sensing resistor. An additional supply input is provided so that the logic can run at a lower voltage.

Figure 3.1- 14 Block Diagram L298N

3.1.7 JUMPER WIRES


A jumper wire is an electric wire used to connect separate points of an electric circuit, for example on printed circuit boards. By attaching a jumper wire to the circuit, a connection can be bridged (jumped) between two points of the circuit.

Placing a jumper wire on the circuit makes it possible to control the flow of electricity, stop the operation of part of the circuit, or operate a circuit that would not work with the ordinary wiring. Also, when a specification or design change is needed on a printed circuit board, attaching or detaching jumper wires allows reinforcement of a defective section, partial disabling of an unnecessary function, or a change to the circuit configuration of an unneeded output.

A SHOWA jumper wire (NSL: New Showa Lead) is a lead-free tin-plated annealed copper wire; the plating is 99.2% tin and 0.8% copper.

Hot-dip plating is generally considered harder to control for plating thickness than electroplating, but the manufacturer controls the thickness with its own processing method. The wire also complies with environmental regulations such as the RoHS Directive and REACH.

3.1.8 BLUETOOTH MODULE HC-05


The HC-05 Bluetooth module is an easy-to-use Bluetooth SPP (Serial Port Protocol) module designed for transparent wireless serial connections. It communicates over a serial link, which makes it easy to interface with a controller or PC. The HC-05 can be switched between master and slave mode, which means it can be used either to receive or to transmit data.

3.1.8.1 SPECIFICATIONS
 Model: HC-05
 Input Voltage: DC 5V
 Communication Method: Serial Communication
 Master and slave mode can be switched

Figure 3.1- 15 HC 05 Bluetooth Module

3.1.8.2 DATA TRANSFER METHOD


The diagram below shows the hardware connection between the HC-05 Bluetooth module and an Arduino UNO. Besides the Arduino, it can interface with any microcontroller, such as a PIC.

Figure 3.1- 16 Circuit connection for HC-05 and Arduino

After completing the hardware and source code installation on the Arduino UNO, the next step is setting up the PC side. In order to communicate with the Arduino UNO, a Bluetooth device is needed on the PC side as well; we recommend a USB plug-in Bluetooth dongle. See the diagram below for the flow of data between the Arduino UNO and the PC via the Bluetooth devices.

Figure 3.1- 17 Flow of Data

3.2 SOFTWARE COMPONENTS
A number of modern software tools were used in this thesis to develop our prototype models and schematic designs and then analyse the results. The software tools we used are as follows:
 PROTEUS.
 Arduino
 MS Visual Studio Code

3.2.1 PROTEUS SIMULATION SOFTWARE


The Proteus Design Suite is a proprietary tool suite used primarily for electronic design automation. The software is mainly used by electronic design engineers and technicians to create schematics and electronic prints for printed circuit board manufacturing.
Proteus provides several modules, some of which are listed below:
 PCB design
 Microcontroller simulation
 3D verification
 Schematic capture
The Proteus Design Suite is a Windows application for schematic capture, simulation and PCB (printed circuit board) layout design. Depending on the size of the designs being produced and the requirements for microcontroller simulation, it can be purchased in many configurations. All PCB design products include an autorouter and basic mixed-mode SPICE simulation capabilities.

3.2.1.1 PROTEUS WINDOWS INTERFACE
The Proteus window interface looks as follows:

Figure 3.2-18 Proteus Interface

3.2.2 ARDUINO SOFTWARE IDE


The Arduino Integrated Development Environment - or Arduino Software (IDE) - contains a text editor for writing
code, a message area, a text console, a toolbar with buttons for common functions and a series of menus. It
connects to the Arduino hardware to upload programs and communicate with them.

The IDE application runs on different operating systems, such as Windows, Mac OS X, and Linux, and supports the C and C++ programming languages. Here, IDE stands for Integrated Development Environment.

A program written in the Arduino IDE is called a sketch. We need to connect a Genuino or Arduino board to the IDE to upload a sketch written in the Arduino IDE software. Sketches are saved with the extension '.ino'.

Figure 3.2- 19 Arduino IDE

3.2.3 VISUAL STUDIO CODE


Visual Studio Code, also commonly referred to as VS Code, is a source-code editor made by Microsoft for Windows, Linux and macOS. Features include support for debugging, syntax highlighting, intelligent code completion, snippets, code refactoring, and embedded Git. Users can change the theme, keyboard shortcuts and preferences, and install extensions that add further functionality.

In the Stack Overflow 2021 Developer Survey, Visual Studio Code was ranked the most popular developer environment tool, with 70% of 82,000 respondents reporting that they use it.

Figure 3.2- 20 Visual Studio Code Interface

4 RESULTS AND CHALLENGES OF FYP

4.1 RESULTS:

4.2 CHALLENGES INVOLVED
While designing and developing this project, we faced a number of open challenges. The main problems can be summarized as follows:
 Algorithm programming challenges
 Hardware failure
 Availability of components
 Safety problems
 Variable behaviour of individual Li-ion cells
Let us discuss the above-mentioned problems in detail.

In order to control the robot accurately, the algorithm needs to assess itself and correct its behaviour using the feedback given by the control system. Constructing such an algorithm requires complex methods that are difficult to comprehend and implement. Hence one of the major hurdles we encountered in our project was related to programming, so for simplicity we used straightforward Arduino programming.

The second problem we faced was hardware failure during the prototyping stage. Electronic components are fragile and prone to failure, so they have to be treated with the utmost caution. Another difficulty was that, during circuit construction and testing, we were sometimes unable to identify the reasons for a circuit failure, which was quite a hassle. But as we gained experience in handling electronic circuitry, we were able to overcome these problems.

The third main problem we faced was the lack of availability of components. Some components, like the LM35 temperature sensor, were scarcely available, and we had to survey many markets just to find this sensor. Some of the other parts were not available in those markets either.

The fourth, and one of the major, challenges we had to face was safety. Since Li-ion cells are prone to fire and explosion, they had to be handled with caution. We faced similar safety problems during the EV prototyping phase.

The last problem we faced was variable cell behaviour. During the EV battery charging phase we observed that some cell modules had different voltages compared to others, and this led to a variety of problems.

5 CONCLUSION AND FUTURE SCOPE


5.1 CONCLUSION
This project completely reforms the robotic vehicle and gives it a new dimension. The vehicle easily recognizes voice commands and runs smoothly. With further enhancement, by increasing the range and installing cameras, the project can be used for home security and military purposes, where commands can be given to the robot without risk.

5.1.1 RECOMMENDATIONS AND FUTURE SCOPE

1. This research work has been limited to a short-range Bluetooth module. Using long-range modules and other connectivity devices would allow the robot to be controlled over long distances.

2. Power optimization, such as sleep and wake-up schedules, can be incorporated.


3. Image processing can be implemented in the robot to detect colours and objects.
4. A thermal camera can be installed to sense the heat emitted by bodies, which is useful for military purposes to detect enemies along the front lines.
5. An automatic targeting system can be implemented in the robot for tracking targets.

6 BIBLIOGRAPHY

[1] P. Torrone, "Arduino Arts UNO," 21 June 2018. Available from: http://arduinoarts.com/what-is-ardiuno/

APPENDIX

ARDUINO CODE:

#include <SoftwareSerial.h>

SoftwareSerial BT(10, 11); // RX, TX (Arduino side)
String readvoice;

// L293D input pins
const int RMF = 3; // IN1 - right motor forward
const int RMB = 4; // IN2 - right motor backward
const int LMF = 5; // IN3 - left motor forward
const int LMB = 6; // IN4 - left motor backward

// Drive both motors with a single call (right fwd, right back, left fwd, left back)
void setMotors(int rf, int rb, int lf, int lb) {
  digitalWrite(RMF, rf);
  digitalWrite(RMB, rb);
  digitalWrite(LMF, lf);
  digitalWrite(LMB, lb);
}

void setup() {
  BT.begin(9600);
  Serial.begin(9600);
  pinMode(RMF, OUTPUT);
  pinMode(RMB, OUTPUT);
  pinMode(LMF, OUTPUT);
  pinMode(LMB, OUTPUT);
}

void loop() {
  // Accumulate the incoming Bluetooth characters into one command string
  while (BT.available()) {
    delay(10);
    char c = BT.read();
    readvoice += c;
  }

  if (readvoice.length() > 0) {
    Serial.println(readvoice);

    if (readvoice == "forward") {
      setMotors(HIGH, LOW, HIGH, LOW);
      delay(100);
    } else if (readvoice == "back") {
      setMotors(LOW, HIGH, LOW, HIGH);
      delay(100);
    } else if (readvoice == "left") {
      setMotors(HIGH, LOW, LOW, LOW);
      delay(100);
    } else if (readvoice == "right") {
      setMotors(LOW, LOW, HIGH, LOW);
      delay(100);
    } else if (readvoice == "stop" || readvoice == "off") {
      setMotors(LOW, LOW, LOW, LOW);
      delay(100);
    } else if (readvoice == "happy dance") {
      // The same four-step routine repeated eight times
      for (int i = 0; i < 8; i++) {
        setMotors(LOW, HIGH, LOW, LOW);  delay(400); // right wheels reverse
        setMotors(HIGH, LOW, HIGH, LOW); delay(600); // both forward
        setMotors(LOW, HIGH, HIGH, LOW); delay(500); // pivot one way
        setMotors(HIGH, LOW, LOW, HIGH); delay(500); // pivot the other way
      }
    }
    readvoice = "";
  }
}
