
2019 International Conference on Electrical, Computer and Communication Engineering (ECCE), 7-9 February, 2019

Designing and Implementation of a Wireless Gesture Controlled Robot for Disabled and Elderly People

Mahbuba Alam
Institute of Information Technology
Jahangirnagar University
Dhaka, Bangladesh
mahbuba.alam.mist@gmail.com

Mohammad Abu Yousuf
Institute of Information Technology
Jahangirnagar University
Dhaka, Bangladesh
yousuf@juniv.edu

Abstract— A gesture controlled robot performs its job with an embedded system, a combination of hardware and software designed to perform dedicated tasks for specific users, which can be controlled by hand gestures. In this paper, we propose a gesture controlled robot for physically challenged and elderly people. The robot uses motion sensors to recognize five hand gestures. These five hand gestures are employed to control five directions: stop or steady, forward, backward, left and right. The gestures are recognized by motion sensors without any image processing, keeping the system simple and efficient. The prototype device can be modified into a wheelchair, trolley bed or any physical platform consisting of more than two wheels. An Arduino Nano v3 embedded system with an accelerometer-gyroscope is held by the user, and the signals are transferred wirelessly to the motors of the platform, moving it according to the user's hand gestures so that users can carry themselves around their surrounding environment. Machine learning is used to classify the hand gestures accurately. The results show that the classification accuracy is near 94% (93.8%).

Keywords—Gesture, image processing, wheelchair, Arduino, sensor, wirelessly, embedded system

I. INTRODUCTION

Around the world, almost every day, many people lose the ability to move freely in their surroundings. The reasons can be road accidents, construction injuries, athletic injuries, major nervous system issues and cognitive inability. For such people, the only option is often to sit in a wheelchair and rely on the help of others, or to steer the platform themselves with great difficulty, in order to interact with their surroundings for their daily needs. From very early times, the medical industry has relied on the general wheelchair and crutches to help these people. First world countries have developed automated wheelchairs with built-in joysticks, which are a very expensive solution and can have a steep learning curve for people of various ages and backgrounds, as well as for people who are not strong enough to push the controls.

This research intends to solve these problems and ease the lives of these individuals by introducing a gesture-controlled robot modified into a moving platform to sit on, letting them resolve their mobility issues without anyone's help. Robots have become part of our exponential technological growth. They are used in many major industrial sectors, mainly in assembly lines and raw physical work. In recent days, however, robots are becoming part of our daily lives in the household sector as well, not only in large industries for cumbersome work. Personalized robotics has become a large market around the globe in developed countries, and companies are pushing the boundaries of imagination to implement applications in this sector.

As this research suggests, this gesture controlled robotic solution can be implemented in suitable platforms for the maneuvering of disabled persons. Though the research is focused on disabled people, the applications for general consumers are also numerous, in markets such as tourism, short-distance travel and, in some cases, military applications. However, this paper focuses particularly on use cases for disabled people. Such problems are also arising very rapidly in developing countries due to the lack of safety in the social environment and the lack of well-built infrastructure.

II. LITERATURE REVIEW

Previous studies show that gesture recognition can be done with many systems, such as image processing and motion and angle detection. Gestures can be captured as images by depth sensors, but coordinating and locating the hand with the right gesture and recognizing it is very challenging. Jesus Suarez and Robin R. Murphy discussed mainly five types of depth-sensing image sensors, especially the Microsoft Kinect [1], and gave an overview of the challenges and execution methods in their paper. The system proposed in this paper, in contrast, is implemented as a low-cost microcontroller-based gesture recognition system without image processing. It can move a robot in 360 degrees and is cheaper than image processing. In this project, the MPU 6050 accelerometer-gyroscope sensor measures the gesture of a human hand. The collected data is then sent via Radio Frequency communication, and the robot

978-1-5386-9111-3/19/$31.00 ©2019 IEEE


will move forward, backward, left and right by following the hand gesture. The research advances to develop a Human Robot Interaction (HRI) device. Human computer interaction (HCI), also named Man-Machine Interaction (MMI), refers to the relation between the human and the computer, or more precisely the machine, since the machine is insignificant without suitable use by the human. This kind of control of home appliances and platforms is suitable for disabled people [2]. Moniruzzman Bhuiyan and Rich Picking at the Centre for Applied Internet Research (CAIR), Glyndwr University, Wrexham, UK presented a history of the Gesture Controlled User Interface (GCUI) and, from their investigation of the collated information of the last 30 years, concluded that gesture control is an appropriate approach for controlling existing and future ubiquitous devices [3]. Two main characteristics should be considered when designing an HCI system: functionality and usability. System functionality refers to the set of functions or services that the system provides to its users, while system usability refers to the level and scope at which the system can operate and perform specific user purposes efficiently. A system that attains a suitable balance between these concepts is considered an influential and powerful system. Gestures are used for communicating between humans and machines, as well as between people using sign language. Movement of the hand in a specific direction transmits a command to the robot, which then moves in that direction. The transmitting device can include a comparator IC for assigning proper levels to the input voltages from the accelerometer, and an encoder IC to encode the data before it is transmitted by an RF transmitter module [4].

A gesture is a form of non-verbal communication of visible body actions, where a particular gesture conveys a particular message. It can comprise sound, light variation or any type of body movement. Based on the type of gesture, gestures have been captured via acoustic (sound), tactile (touch), optical (light), bionic and motion technologies, through still cameras, data gloves, Bluetooth, infrared beams, etc. [5].

In this paper, a low-cost microcontroller-based gesture recognition system without image processing has been proposed. It can move a robot in 360 degrees and is cheaper than image processing. We have used the MPU 6050 accelerometer-gyroscope sensor to measure the gesture of a human hand. The collected data is then sent to the robot via Radio Frequency communication, and the robot moves forward, backward, left or right, or stays steady, by following the hand gestures. This system can be helpful for disabled and elderly persons.

III. PROPOSED SYSTEM MODEL

A robot is a mechanical object that carries out a complex series of commands coming from a programmable machine such as a computer. The term 'robot' implies a device that gains data about its environment, makes a decision and then performs the task accordingly [6]. Gestures are basic building blocks of communication and are readable and applicable in any part of the world irrespective of demographics. They require minimum effort to input the intended action to any receiving end or terminal, and it is easy to teach someone gestures in the first place. Thus a hand gesture controlled robot is a robot controlled by hand gestures rather than by buttons. The robot is equipped with two sections: a transmitting section and a receiving section. This design makes the input and output modality very easy [7]. Input can be given by hand alone rather than by controlling something with the foot, such as pedals; in a modified short model, the input can even be given by a single finger, for people who are only able to move that much.

Varieties of intentions, such as calling the platform towards the user when it is in a remote location, can be expressed by gestures. Filtering a digital image to attenuate noise while preserving the image detail is an essential part of image processing [8]. Throughout the technology era of robotics, several types of methods have been observed in the industry to facilitate gesture control mechanisms. Some of them are tethered (direct wire control), wired computer control, Ethernet and wireless.

In this paper, the implemented method is based on Radio Frequency (RF) signals sent by a transmitter in the hand-held device and received by the receiver on the robotic platform.

Fig 1: Robotic Chassis with controlling unit



A. Block Diagram with Description of Work Flow
The system methodology is described in Fig 2.

Fig. 2: Block diagram of the proposed gesture controlled robot.

Fig. 3: Circuit diagram of the Transmitter.

• The MPU 6050 module collects the user's gesture data via its gyroscope-accelerometer sensor and sends it to the Arduino module [9].
• The Arduino Nano v3 module has pre-installed instructions coded in C++ to process the gestures given as input.
• The instructions processed and sent by the Arduino module are encoded by the HT12E encoder IC situated on the breadboard of the hand-held device.
• The RF transmitter then transmits the instructions from the hand-held device to the robot's RF receiver through the spring antenna.
• The receiver sends the signals to the HT12D decoder chip; after decoding, the messages are transferred directly to the wheel motors of the robot chassis, which rotate according to the instruction, producing the motion intended by the gesture [10].
• The robotic platform is powered by a separate power source.
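As a rough illustration of the pipeline in the bullets above, the direction command can be modeled as the 4-bit data nibble that an HT12E-style encoder carries per frame. This is a sketch, not the authors' firmware: the enum names and bit values are hypothetical, since the paper does not specify the encoding.

```cpp
#include <cstdint>

// Hypothetical direction codes; the paper does not specify the bit values.
enum class Direction : uint8_t { Stop = 0, Forward = 1, Backward = 2, Left = 3, Right = 4 };

// The HT12E transmits 8 address bits plus 4 data bits per frame; here we
// model only the 4-bit data nibble that carries the direction command.
uint8_t toDataNibble(Direction d) {
    return static_cast<uint8_t>(d) & 0x0F;
}

// Receiving side (HT12D): recover the direction from the 4-bit nibble.
Direction fromDataNibble(uint8_t nibble) {
    return static_cast<Direction>(nibble & 0x0F);
}
```

A command encoded on the hand-held device should round-trip unchanged through the receiver's decoder, which is what the fixed address bits and matching data nibble of the HT12E/HT12D pair provide in hardware.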

Fig. 3 and Fig. 4 show the circuit diagrams of the transmitter and receiver respectively.

Fig. 4: Circuit diagram of the Receiver.
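The receiver circuit ultimately drives the two wheel motors through the motor driver. The paper does not give the drive mapping explicitly, so the following is an illustrative differential-drive sketch; the function name and sign convention are assumptions, with the spin-in-place choice for Left/Right matching the anti-clockwise/clockwise motion later described in Table 1.

```cpp
#include <string>
#include <utility>

// Illustrative mapping from a direction command to (left motor, right motor)
// drive signs: +1 forward, -1 reverse, 0 stop. Hypothetical, not from the paper.
std::pair<int, int> motorSigns(const std::string& direction) {
    if (direction == "Forward")  return {+1, +1};  // both wheels forward
    if (direction == "Backward") return {-1, -1};  // both wheels reverse
    if (direction == "Left")     return {-1, +1};  // anti-clockwise spin
    if (direction == "Right")    return {+1, -1};  // clockwise spin
    return {0, 0};                                 // Stop/Steady: hold position
}
```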



IV. IMPLEMENTATION

A prototype of the hand gesture controlled maneuvering robot has been developed, which follows the direction given by commonly known patterns. It follows five different hand gestures to perform its directional movement.

A. Tasks performed by the robot
Fig. 5 shows the gestures made by the physically challenged or elderly person.

1. Stop/Steady -- when it gets a hand gesture that is parallel with the horizontal
2. Front/Forward -- when it gets a hand gesture tilting up to -90 degrees in the Y axis
3. Rear -- when it gets a hand gesture tilting up to +90 degrees in the Y axis
4. Left -- when it gets a hand gesture tilting up to -90 degrees in the X axis
5. Right -- when it gets a hand gesture tilting up to +90 degrees in the X axis

Fig. 5: (a) Gesture to STOP the System Model, (b) Gesture for going FORWARD with the System Model, (c) Gesture for going REAR/BACKWARD with the System Model, (d) Gesture for going LEFT side with the System Model, (e) Gesture for going RIGHT side with the System Model (left to right)

V. MECHANISM

The full system follows the following algorithm:

1. The gyroscope sensor senses the hand gesture movement and passes it to the microcontroller.
2. The MCU receives the data and makes instructions for the robot.
3. The MCU sends the instruction to the encoder IC.
4. The encoded data is transmitted through the transmitter.
5. At the receiving end, the receiver receives the encoded data.
6. The receiver sends the encoded data to the decoder.
7. The decoder decodes the data and sends it to the motor driver.
8. The motor driver drives the motors in all movements, following the instructions and gestures.
9. Finally, the robot moves with the gestures.

VI. DETECTION AND SPECIFICATION OF THE MESSAGES THROUGH GESTURE

• The MPU 6050 module collects the user's gesture data. It measures the angle and directional movement via its gyroscope-accelerometer sensor and sends them to the Arduino module.

• The Arduino Nano v3 module has pre-installed instructions coded in C++ to process the gestures given as input. It sets up the MPU 6050 registers, reads the values for the X, Y and Z axes, and then assigns the values into variables for the three axes, subtracting the calibration offsets:

    setup_mpu_6050_registers();  // configure the sensor once at startup
    read_mpu_6050_data();        // read raw X, Y, Z values each cycle
    gyro_x -= gyro_x_cal;        // remove calibration offset per axis
    gyro_y -= gyro_y_cal;
    gyro_z -= gyro_z_cal;

• For moving in a direction, it uses the pitch and roll values of the X and Y axes. The angle considered for a movement is about 30 degrees: the system starts to move when it gets at least 30° of tilt and continues up to approximately 90° in each axis.

    In Y axis, if angle <= -30°, Decision = "Forward";
    In X axis, if angle <= -30°, Decision = "Left";
    In X axis, if angle >= +30°, Decision = "Right";
    In Y axis, if angle >= +30°, Decision = "Backward";
    Otherwise, Decision = "Stop";

VII. EXPERIMENT

Table 1 shows the gesture to be performed, the decision to be made after receiving the gesture, and the movement according to the taken decision.
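The ±30° decision rules of Section VI can be sketched as a plain C++ function. This is a minimal illustration, not the authors' firmware: the function name and signature are assumptions, angles are in degrees, and the check order follows the rules as listed.

```cpp
#include <string>

// Classify a hand gesture from tilt angles (degrees), following the
// +/-30 degree thresholds: Forward/Backward on the Y axis, Left/Right
// on the X axis, Stop when the hand is near level.
std::string classifyGesture(double xAngle, double yAngle) {
    if (yAngle <= -30.0) return "Forward";   // hand tilted forward
    if (xAngle <= -30.0) return "Left";      // hand tilted to the left
    if (xAngle >= 30.0)  return "Right";     // hand tilted to the right
    if (yAngle >= 30.0)  return "Backward";  // hand tilted back
    return "Stop";                           // within +/-30 degrees: steady
}
```

With this ordering, a gesture that crosses the threshold on both axes at once resolves to the first matching rule, so the robot never receives two commands for one gesture.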



TABLE 1: EXPERIMENTAL ANALYSIS

Received Gesture                                          | Decision                    | Movement according to Taken Decision
Hand gesture parallel with the horizontal                 | Signal 0                    | Unchanged (0)
Hand gesture creating -30 to -90 degrees in the Y axis    | Forward movement to be done | Goes straight towards the front side
Hand gesture creating +30 to +90 degrees in the Y axis    | Rear movement to be done    | Goes straight in the rear direction
Hand gesture creating -30 to -90 degrees in the X axis    | Movement to the Left        | Anti-clockwise, goes to the left side
Hand gesture creating +30 to +90 degrees in the X axis    | Movement to the Right       | Clockwise, goes to the right side

VIII. EVALUATION

The prototype can be scaled to any size: the motored wheels can be upgraded to real wheelchair wheels, and the winding size may be increased considering the real-life load assumed for carrying people. The hand-held device is currently in open breadboard format, but by using Vero board its size can be significantly reduced to a more manageable form factor that fits in a palm or even on a finger. With the current model, the main instructions go to the Arduino, and the decisions are made and sent to the motor driver by the Arduino. The motor driver moves the two front wheels, and with their movement the small rear wheel also moves, keeping the prototype balanced. The drive could instead use 2 or 4 driven wheels separately, or 2 caster wheels with 2 main steering wheels. The prototype has three wheels.

The model can be evaluated when:

1. A disabled or elderly person sits on the wheelchair, which is fitted with the Arduino, HT12 decoder, RF receiver and motor driver.
2. The disabled or elderly person attaches the hand-held device, fitted with the MPU 6050, RF transmitter and HT12 encoder, to his or her body.
3. An elderly person can even bring the real wheelchair to himself or herself, or send it away whenever necessary, and can move it left or right or keep it on standby by moving the hand-held device.

TABLE 2: CONFUSION MATRIX FOR GESTURE CONTROL ROBOT

           Steady  Left  Right  Forward  Backward
Steady       99     0     0       1        0
Left          2    96     1       1        0
Right         1     1    96       1        1
Forward       3     1     2      94        0
Backward      6     2     3       5       84

Table 2 shows a confusion matrix, which is a summary of movement prediction results including both correct and incorrect estimations. It has been found that the robot can recognize the gestures almost perfectly. Among 100 demonstrations per gesture, some incorrect predictions occurred, and these were counted to build the confusion matrix. The steady, left, right, forward and backward movements achieved success values of 99, 96, 96, 94 and 84 respectively. To improve the results, further trial and error will be applied.

IX. RESULTS

This section includes results related to a gesture controlled robot which can be used to assist physically challenged people.

Fig. 6 shows all the movements according to the gestures made by the hand-held device attached to the body.

[Fig. 6 rows: Steady, Forward, Backward, Left and Right — gesture photo, visual and resulting movement for each]

Fig. 6: Robot movements according to the gestures
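As a worked check of the reported accuracy: the diagonal of the confusion matrix in Table 2 sums to 99 + 96 + 96 + 94 + 84 = 469 correct predictions out of 500 trials, i.e. 93.8%, matching the figure quoted in the abstract. A small sketch of this computation (illustrative only; the function name is an assumption):

```cpp
// Overall accuracy from a 5x5 confusion matrix: rows are true gestures,
// columns are predicted gestures, diagonal entries are correct predictions.
double overallAccuracy(const int m[5][5]) {
    int correct = 0, total = 0;
    for (int i = 0; i < 5; ++i) {
        for (int j = 0; j < 5; ++j) {
            total += m[i][j];
            if (i == j) correct += m[i][j];  // count the diagonal
        }
    }
    return 100.0 * correct / total;  // percentage of correct predictions
}
```

Applied to Table 2 this yields 93.8%, with Backward the weakest class at 84/100, consistent with the paper's note that further trials are needed.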



X. CONCLUSION
The technology exists to mitigate our day-to-day problems at all scales and to ensure a more prosperous life. The research described in this paper can open a new door for people with particular needs to live comfortably. The research has the potential to grow much more if backed by proper industrial support. Various IoT features can also be applied to it. The prototype system can be fitted to a real wheelchair and implemented in the near future, and more sensors can be added to check the user's health condition.

REFERENCES
[1] J. Suarez and R. R. Murphy, "Hand Gesture Recognition with Depth Images: A Review," 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, September 9-13, 2012.
[2] S. Wan and H. T. Nguyen, "Human Computer Interaction Using Hand Gesture," 30th Annual International IEEE EMBS Conference, British Columbia, Canada, August 20-24, 2008.
[3] K. K. Vyas, A. Pareek and S. Vyas, "Gesture Recognition and Control, Part 1 - Basics, Literature Review & Different Techniques," International Journal on Recent and Innovation Trends in Computing and Communication, ISSN 2321-8169, Vol. 1, Issue 7, pp. 575-581.
[4] Nitin and Naresh, "Gesture Controlled Robot PPT." [Online]. Available: http://seminarprojects.com/s/hand-gesturecontrolled robotppt, May 3, 2017.
[5] U. Yadav, A. Tripathi, S. Dubey and S. K. Dubey, "Gesture Control Robot," International Journal of Advanced Technology in Engineering and Science, www.ijates.com, Vol. 2, Issue 5, ISSN (online): 2348-7550, May 2014.
[6] Wikipedia, "Robot." [Online]. Available: https://en.wikipedia.org/wiki/Robot, Nov 11, 2018.
[7] R. Mardiyanto, M. F. R. Utomo, D. Purwanto and H. Suryoatmojo, "Development of hand gesture recognition sensor based on accelerometer and gyroscope for controlling arm of underwater remotely operated robot," 2017 ISITIA, Surabaya, pp. 329-333, doi: 10.1109/ISITIA.2017.8124104, 2017.
[8] J. Wang, Y. Miao, A. Khamis, F. Karray and J. Liang, "Adaptation Approaches in Unsupervised Learning: A Survey of the State-of-the-Art and Future Directions," in A. Campilho and F. Karray (eds), Image Analysis and Recognition, ICIAR 2016, Lecture Notes in Computer Science, vol. 9730, Springer, Cham, 2016.
[9] "Accelerometer Based Hand Gesture Controlled Robot using Arduino." [Online]. Available: https://goo.gl/Yyab7y, November 11, 2018.
[10] R. Mardiyanto, M. F. R. Utomo, D. Purwanto and H. Suryoatmojo, "Development of hand gesture recognition sensor based on accelerometer and gyroscope for controlling arm of underwater remotely operated robot," https://ieeexplore.ieee.org/document/8124104, 2017.
