
Vidyavardhaka College of Engineering,

Mysuru – 570002
Department of Electronics and Communication Engineering

CERTIFICATE
This is to certify that the work entitled “HAND GESTURE CONTROLLED
ROBOT” is a bonafide work carried out by Ruthvik B J (4VV20EC126), Shreesha
Hanagud (4VV20EC138), Thejas Gowda R (4VV20EC165) and Manoj N
(4VV20EC088) in partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in Electronics and Communication Engineering of Visvesvaraya
Technological University, Belagavi, during the year 2022-2023.
It is certified that all corrections / suggestions indicated during internal assessment have
been incorporated in the report. The mini project report has been approved as it satisfies the
academic requirements in respect of the mini project work prescribed for the Bachelor of
Engineering degree.

_________________ _______________

Dr. Suchitra M Dr. C. M. Patil


Professor Prof. & Head
Dept. of ECE Dept. of ECE

DECLARATION
We, the members of the mini project team, studying in the V semester of Electronics and
Communication Engineering, Vidyavardhaka College of Engineering, hereby declare that the
entire mini project titled “HAND GESTURE CONTROLLED ROBOT” has been carried
out by us independently under the guidance of Dr. SUCHITRA M, Professor, Department of
Electronics and Communication Engineering, Vidyavardhaka College of Engineering. This
mini project work is submitted to the Visvesvaraya Technological University, Belagavi, in
partial fulfilment of the requirement for the award of the degree of Bachelor of Engineering in
Electronics and Communication Engineering during the academic year 2022-2023.

This mini project report has not been submitted previously for the award of any other degree
or diploma to any other Institution or University.

Place: Mysuru

Date: 21-03-23

Name of the Students USN Signature

1. RUTHVIK B J 4VV20EC126

2. SHREESHA HANAGUD 4VV20EC138

3. THEJAS GOWDA R 4VV20EC165

4. MANOJ N 4VV20EC088


ACKNOWLEDGEMENT

The satisfaction that accompanies the successful completion of any task would be incomplete
without the mention of people who made it possible and whose constant guidance and
encouragement crowned our efforts with success. We consider it our privilege to express our
gratitude and respect to all those who guided and inspired us in the completion of this
mini project.
We wish to express our gratitude to Dr. B. Sadashive Gowda, Principal, VVCE, for
providing a congenial working environment.
We are thankful to Dr. C. M. Patil, Professor and Head, Dept. of ECE, VVCE, for
motivating us and allowing us to use the logistics of the department to complete this mini
project successfully.
We express our sincere thanks to our guide Dr. Suchitra M, Department of ECE,
VVCE, for her constant co-operation, support, and invaluable suggestions.
We extend our heartfelt gratitude to our mini project coordinators Dr. Suchitra M,
Professor, Dept of ECE, VVCE and Kavyashree B, Assistant Professor, Dept of ECE, VVCE
for their timely support and motivation.
We would like to thank our parents for their constant moral support throughout the
completion of this mini project.
Finally, we would like to extend our deep sense of gratitude to our friends, who always
inspired and encouraged us throughout the completion of this mini project.

Name of the Students

1. RUTHVIK B J

2. SHREESHA HANAGUD

3. THEJAS GOWDA R

4. MANOJ N


ABSTRACT
This report presents an IoT-based hand gesture-controlled robot. The IoT concept can make the
use of everyday things more comfortable and convenient, in daily life as well as in industry. The
proposed gadget uses an accelerometer-based sensor to detect hand gestures, which are transmitted
wirelessly to the robot via an RF transmitter module. The robot carries an RF receiver module that
receives the data, and an Arduino Uno microcontroller interprets the gesture data and takes the
corresponding action through an L293D motor driver, such as moving forward, moving backward, or
turning left or right. Hand gestures provide a more natural way for users to interact with robots
than traditional methods such as buttons or joysticks. The device is a prototype that can make
tasks simpler, especially in growing industries where humans are being replaced by robots. The
results presented here show that the proposed approach is effective and accurate in controlling a
robot using hand gestures, paving the way for the development of more capable gesture-controlled
robot systems.


TABLE OF CONTENTS

Declaration
Acknowledgements
Abstract
Index

INDEX

CHAPTER 1 INTRODUCTION
1.1 Background
1.2 Introduction
1.3 Problem Statement
1.4 Objective
1.5 Motivation
1.6 Existing System

CHAPTER 2 LITERATURE SURVEY
2.1 Literature Survey

CHAPTER 3 METHODOLOGY

CHAPTER 4 RESULT AND DISCUSSION

CHAPTER 5 CONCLUSION AND FUTURE SCOPE

PUBLICATIONS

REFERENCES


CHAPTER 1

INTRODUCTION

1.1 Background
The concept of controlling robots using hand gestures has been explored for many years. It is
a natural way for humans to interact with machines, and can make robotics more accessible to
people who have difficulty using traditional input devices. In recent years, the development of
low-cost microcontrollers such as Arduino Uno has made it easier for hobbyists, students, and
researchers to experiment with this technology and develop their own hand gesture-controlled
robots.
The use of Arduino Uno microcontroller boards in robotics projects has become increasingly
popular due to their ease of use, low cost, and availability of open-source libraries and tutorials.
Arduino Uno is a versatile microcontroller board that can be programmed to interface with a
wide range of sensors, motors, and other electronic components.
There are various sensors that can be used to detect hand gestures, such as accelerometers,
gyroscopes, flex sensors, and camera-based computer vision systems. These sensors are used
to collect data on the user's hand movements, which is then processed and translated into
commands for the robot.
The software for hand gesture-controlled robots using Arduino Uno is typically written in C++
using the Arduino IDE. The software reads sensor data, interprets the user's hand gestures, and
sends commands to the robot to perform specific tasks. This can include controlling the robot's
movement, activating sensors or other devices, and responding to user input.
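
To make this concrete, the following is a minimal, hypothetical sketch of such a read-interpret-act loop, assuming the accelerometer's X and Y outputs are wired to analog pins A0 and A1; the pin choices and threshold values are illustrative and are not taken from this project.

    // Hypothetical gesture-interpretation loop (Arduino C++). Pins and
    // thresholds are illustrative assumptions, not the project's actual values.
    const int X_PIN = A0;
    const int Y_PIN = A1;

    // Map raw 10-bit readings (0-1023) to a single command character.
    char classifyGesture(int x, int y) {
      if (y > 400) return 'F';   // hand tilted forward
      if (y < 330) return 'B';   // hand tilted backward
      if (x > 400) return 'R';   // hand tilted right
      if (x < 330) return 'L';   // hand tilted left
      return 'S';                // roughly level -> stop
    }

    void setup() {
      // Print the command over serial here; a real transmitter sketch
      // would hand it to the RF module instead.
      Serial.begin(9600);
    }

    void loop() {
      char cmd = classifyGesture(analogRead(X_PIN), analogRead(Y_PIN));
      Serial.println(cmd);
      delay(100);
    }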
Overall, hand gesture-controlled robots using Arduino Uno offer an exciting and accessible
way for hobbyists, students, and researchers to experiment with robotics and explore the
possibilities of human-robot interaction. With continued advancements in sensor technology
and machine learning algorithms, this technology has the potential to transform the way we
interact with machines and other electronic devices.


1.2 Introduction
With the evolution of technology, our interaction with machines has increased day by day, from
switching on a light with a smartphone to controlling a vehicle. Technology has become part and
parcel of everyday life. There are various basic ways to interact with a machine, such as a remote
control or a joystick. Nowadays, a lot of research is being carried out on controlling systems or
devices with human gestures, i.e., eye movement, facial expressions, or hand gestures, which removes
barriers such as the language barrier between the user and the robot. An Arduino UNO is used to
carry out commands by interpreting the data received from its interfaced devices. The device uses an
accelerometer to detect hand motion and transmits the data via transceivers to the receiving end,
where an Arduino microcontroller performs the corresponding task.
Recently, strong efforts have been made to develop intelligent and natural interfaces between users
and computer-based systems based on human gestures. Gestures provide an intuitive interface to both
humans and computers. Such gesture-based interfaces can therefore not only substitute for common
interface devices but can also be exploited to extend their functionality. Robots are playing an
important role in automation across sectors such as construction, military, medical, and
manufacturing. After building some basic robots such as a line follower robot and a
computer-controlled robot, we have developed this accelerometer-based gesture-controlled robot using
an Arduino Uno. In this project hand motion is used to drive the robot; for this purpose we have
used an accelerometer, which responds to acceleration. A gesture-controlled robot is controlled with
the hand instead of any other method such as buttons or a joystick: one only needs to move the hand
to control the robot. A transmitting device worn on the hand contains an RF transmitter and an
accelerometer. It transmits commands to the robot so that it can perform the required task, such as
moving forward, reversing, turning left, turning right, and stopping. All these tasks are performed
through hand gestures.

Here the most important component is the accelerometer. The accelerometer is a 3-axis acceleration
measurement device with a ±3 g range. It is built from a polysilicon surface sensor and a signal
conditioning circuit that measures acceleration. The output of the device is analog in nature and
proportional to the acceleration. The device also measures the static acceleration of gravity when
it is tilted, and reports motion or vibration accordingly. According to the ADXL335 datasheet, the
sensor is a polysilicon surface-micromachined structure placed on top of a silicon wafer. Polysilicon
springs suspend the structure over the surface of the wafer and provide resistance against
acceleration forces. Deflection of the structure is measured using a differential capacitor that
incorporates independent fixed plates and plates attached to the moving mass. The fixed plates are
driven by 180° out-of-phase square waves. Acceleration deflects the moving mass and unbalances the
differential capacitor, resulting in a sensor output whose amplitude is proportional to the
acceleration. Phase-sensitive demodulation techniques are then used to determine the magnitude and
direction of the acceleration.

1.3 Problem Statement


To develop a robotic system that can be controlled through natural hand gestures rather than
traditional input devices such as a keyboard or joystick. The project aims to enable users to
interact with the robot in a more natural way, making it easier to use.

1.4 Objective
• To provide a means of controlling a robot's movements or actions through hand gestures,
rather than through traditional input devices such as a remote controller or a keyboard.
• To provide a more accurate and natural way of controlling the robot's movement.

1.5 Motivation
Our motivation for this project came from seeing a disabled person driving his wheelchair by hand
with considerable difficulty. We wanted to build a device that would help such people drive their
chairs without even needing to touch the wheels.

1.6 Existing System


Today, with technology and science growing day by day, the field of robotics also needs to be
upgraded. A few issues were found in the existing system. First, the existing system can be
controlled only with a dedicated controller, so the user cannot control the robot directly. Second,
more physical effort is needed to control the robot. Third, power consumption was higher, since the
controller requires considerably more power than an accelerometer. Finally, the expense was greater
because more hardware was required. These were a few drawbacks of the existing system.


CHAPTER 2

LITERATURE SURVEY
A literature survey plays an important role in the life cycle of any project. The purpose of the
survey is to arrive at a solution by understanding the shortcomings and obstacles in existing
systems. It also includes comparing existing and proposed methods. This chapter summarizes related
work on different methods of gesture-controlled robots and the objectives of the present work.

2.1 Literature Review


Researchers have worked on different types of technologies and components to develop robots that
are controlled by human outputs such as hand gestures, voice, and so on.
Authors [1] have described hand gesture control as one of the oldest ways of communication. In the
developed project, recognition of a hand gesture is done in three steps. The first is identifying
the hand region in the image. The second involves feature extraction, which comprises finding the
centroid and major axis of the magenta region, followed by finding five centroids of the cyan and
yellow regions indicating the fingers. The third step builds a classifier using learning vector
quantization. The robot is driven in a stepwise manner: a webcam captures the image of the hand
signal and processes it to find the instruction, and the corresponding instruction is then sent as
a file over an HTTP port to an ESP8266 connected to the remote system, which consists of a chassis
with wheels and motors that make the robot move.

A few authors [2] present the hand gesture technology as a real-time monitoring system through
which humans interact with robots using gestures. The authors state that hand gesture technology is
better than a voice control system as far as accuracy is concerned. The four major blocks in the
architecture of the researched robot are the input from the accelerometer, Arduino processing,
calibration and differentiation, and the output to the LDR and motor, with an RF module at the
receiving end. The recognition procedure involves calibration, end-point detection, smoothing and
scaling, recognition, and navigation. The outcome of the project is very user friendly, in that the
user can navigate the wireless robot around the environment using various gesture commands.

The authors [3] define a robot as a device that can be used as a tool for making work easier and
effortless. They describe the main purpose of the developed project, which is to provide a simpler
hardware architecture for the robot together with powerful computational platforms, so that robot
designers can focus on their research and tests instead of the Bluetooth connection infrastructure.
After observing the successful working of the prototype, the authors [3] list the advantages of
hand gesture technology: replacement of manpower through digital control, components and an
experimental setup that cost much less compared to others, handling and operation that are very
easy and accurate, and no need for a skilled person to operate the control.

Some authors [4] describe how the introduction of IoT, and the combination of IoT with physical
devices, makes life easier. They summarize the objectives of the research as connecting and
communicating with physical devices, faster and smarter innovation, and smart sensing capabilities.
The prototype is prepared using an accelerometer at the transmitter end and a motor driver at the
receiver end, with communication handled by an RF pair module and the actions on both sides
controlled by an Arduino UNO microcontroller. However, the major limitations of the project are
that a lot of power is consumed, and that it cannot detect any object: if an object comes in front
of the robot, it does not stop or change direction. The addition of ultrasonic sensors could
overcome this limitation.

The authors [5] introduce a gesture-driven robotic vehicle. The prototype is developed so that the
movements and manipulation of the robot, i.e., its handling and control, depend on the user's
gestures. In this system the gesture is captured by an accelerometer and processed by the
microcontroller software, and the parameters are sent to the microcontroller and encoder circuit.
The paper also discusses how this project can be an effective way to ease the problems faced by
physically challenged persons. The project's main objective is wireless locomotion of the robot,
so that the product becomes easier to use and more helpful for physically challenged people.


Authors [6] have compared the developed prototype with similar products already available and
explained the advantages of using the hand gesture method to control the robot over other methods
such as a joystick or a physical controller with buttons. The methodology of this prototype has
three main elements: the accelerometer sensor provides data to the Arduino; after the current
parameters are checked, the data is passed to the RF transmitter; and the data transmitted by the
Arduino is received by the RF receiver, decoded, and the necessary signals are delivered to the
motor controller IC, which in turn engages the robot's wheel motors. The accelerometer detects
horizontal and vertical tilts and generates continuous analogue signal values. The objective of
this project, as stated in the paper, is to construct a robotic automobile that is directed by hand
gestures through an Arduino interface.

In some of the papers [7], the prototype is developed without any wireless technology; wires are
used to send the signals from the transmitter to the receiver. This wired control device is built
using an Arduino Uno and an accelerometer. The Arduino microcontroller receives the analog input
values (X axis, Y axis) from the accelerometer and converts the analog values to digital values.
As future scope, the authors [7] suggest that a wireless mode of data communication could be
implemented to make the robot work more effectively and easily.

A few authors [8] have explained the inside of their prototype, whose working is based on the three
axes of an accelerometer; the robot moves in four directions: forward, backward, left, and right.
For sensing human motion, an infrared sensor is used that responds to radiation of about 790 nm
wavelength from the human body. This type of robot is widely used in military applications,
industrial robotics, and the construction field. They used pick-and-place robots for this process
for the following reasons: flexibility is one of the main advantages of a robotic system,
pick-and-place robots are easily programmable using computer software, and pick and place is a
physically demanding application well suited to a robotic system.

Exploring this domain further, authors [9] explain the advantages of expanding the use of robots
where conditions are uncertain, for example in security tasks: robots can be built to follow the
guidance of a human operator and execute the assignment. The authors [9] divide the entire project
into two primary parts, the transmitter part and the receiver part. The transmitter transmits the
signal according to the position of the accelerometer attached to the hand, and the receiver picks
up the signal and makes the robot move in the configured direction. The project in this paper has
an advantage over existing projects in that it additionally includes more hand motions (for
example, the bend and slice) in the interface, to control the vehicle in a more natural and
effective way.

Similarly, a few authors [10] describe a motion-controlled robot that can be controlled through
normal hand signals, with the project again divided into two primary parts, the transmitter part
and the receiver part. The transmitter transmits the signal according to the position of the
accelerometer connected to the hand, and the receiver receives the signal and makes the robot move
in the configured direction. The main objective of the paper is to present control of the robot
using the accelerometer, with the assistance of human hand gestures or tilting.

In some papers [11], the concept of a hand-gesture-based control interface for navigating a
car robot is explained. A 3-axis accelerometer is used to record the user's hand gestures, and the
data is transmitted wirelessly via an RF module to a microcontroller. The gesture-controlled robot
is controlled with the hand instead of any other method such as buttons or a joystick. The paper
explains the importance of having an easy way to operate the robot in day-to-day applications by
controlling it with hand gestures, instead of conventional methods such as a joystick or voice
control.

Authors [12] also describe how conventional hand gestures can control a robot and perform desired
tasks. The transmitter transmits a signal in line with the position of the accelerometer and the
hand gesture, and the receiver receives the signal and makes the robot move in the corresponding
direction. Robots are playing a crucial role in automation across sectors such as construction,
military, medical, and manufacturing. This project can be very helpful for physically challenged
people, as they can move certain objects with less physical movement. Moreover, human errors are
reduced on a great scale and results are achieved with great accuracy.


CHAPTER 3
METHODOLOGY

This chapter includes project flow, system design, working, hardware and software
requirements.

3.1 SYSTEM DESIGN


This project mainly has two modules, a transmitter end and a receiver end. Figure 3.1 shows the
block diagram of the transmitter end and Figure 3.2 shows the block diagram of the receiver end.
The Arduino Uno is supplied with a regulated power supply so that it operates in coordination with
the other components. Using an Arduino Uno makes the project more affordable in comparison with
other microcontrollers, and the Arduino Integrated Development Environment (Arduino IDE) is
convenient for writing and uploading code for each component.

3.2 WORKING

Figure 3.1: Block diagram of transmitter end

Figure 3.2: Block diagram of receiver end


A hand gesture-controlled robot works by using an ADXL335 accelerometer at the transmitter end to
detect the orientation of the hand and interpret it as a specific hand gesture. This information is
sent to an Arduino Uno microcontroller, which takes the decision and passes the information to the
receiver end through the RF transmitter module. The RF receiver module at the receiver end receives
the data and passes it to the receiver-side Arduino, which processes the received data and controls
the motors of the robot through an L293D motor driver.

Here is a detailed working of a hand gesture-controlled robot:

Transmitter end:

The user wears an ADXL335 accelerometer sensor on the hand, which is used to detect the orientation
and movement of the hand as shown in Figure 3.3. The accelerometer sensor sends its data to the
Arduino Uno microcontroller, which processes the data using the algorithm programmed by the user to
detect the specific hand gesture. The resulting command is then transferred to the receiver end
through the RF transmitter module.
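
The report does not name the library used to drive the RF link; the sketch below is a hypothetical transmitter-end example that assumes the RadioHead RH_ASK driver and an ASK transmitter module on RadioHead's default transmit pin.

    // Hypothetical transmitter-end sketch (assumes the RadioHead library;
    // data pin and bit rate are RH_ASK defaults: TX on pin 12 at 2000 bps).
    #include <RH_ASK.h>
    #include <SPI.h>               // not used directly, but RadioHead needs it

    RH_ASK rf;

    void setup() {
      rf.init();                   // start the ASK driver
    }

    // Send one command byte per detected gesture.
    void sendCommand(char cmd) {
      rf.send((uint8_t *)&cmd, 1);
      rf.waitPacketSent();
    }

    void loop() {
      // In the full sketch the command would come from the accelerometer
      // classification; 'F' (forward) is only a placeholder here.
      sendCommand('F');
      delay(100);
    }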

Receiver end:

The RF receiver module receives the data sent from the transmitter end. Based on the hand gesture
detected in the received data, the Arduino Uno microcontroller processes it and sends a signal to
the L293D motor driver to control the motors of the robot. The L293D motor driver converts the
signal into a form that can drive the motors, and the motors move accordingly, allowing the robot
to move forward, backward, left, or right, depending on the specific hand gesture detected.
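
A matching, hypothetical receiver-end sketch is shown below (again assuming the RadioHead RH_ASK driver; the L293D input pin numbers are illustrative, and the driver's enable pins are assumed to be tied HIGH).

    // Hypothetical receiver-end sketch: read one command byte over RF and
    // set the L293D inputs. Pin numbers are assumptions; EN1/EN2 tied HIGH.
    #include <RH_ASK.h>
    #include <SPI.h>

    RH_ASK rf;                                       // RX data on pin 11 by default
    const int IN1 = 4, IN2 = 5, IN3 = 6, IN4 = 7;    // L293D direction inputs

    void drive(int a1, int a2, int b1, int b2) {
      digitalWrite(IN1, a1);  digitalWrite(IN2, a2); // left motor
      digitalWrite(IN3, b1);  digitalWrite(IN4, b2); // right motor
    }

    void setup() {
      rf.init();
      pinMode(IN1, OUTPUT);  pinMode(IN2, OUTPUT);
      pinMode(IN3, OUTPUT);  pinMode(IN4, OUTPUT);
      drive(LOW, LOW, LOW, LOW);                     // start stopped
    }

    void loop() {
      uint8_t buf[1];
      uint8_t len = sizeof(buf);
      if (rf.recv(buf, &len)) {                      // non-blocking receive
        switch ((char)buf[0]) {
          case 'F': drive(HIGH, LOW,  HIGH, LOW);  break;  // forward
          case 'B': drive(LOW,  HIGH, LOW,  HIGH); break;  // backward
          case 'L': drive(LOW,  HIGH, HIGH, LOW);  break;  // turn left
          case 'R': drive(HIGH, LOW,  LOW,  HIGH); break;  // turn right
          default:  drive(LOW,  LOW,  LOW,  LOW);  break;  // stop
        }
      }
    }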

Overall, the hand gesture-controlled robot works by interpreting the orientation and
movement of the user's hand, and using this information to control the motors of the robot. It
provides a hands-free and intuitive way of controlling the robot's movement, which can be
useful in various applications such as home automation, industrial automation, medical fields,
and military fields.

3.3 HARDWARE REQUIREMENTS


The proposed system consists of hardware and software requirements; the hardware components
used are listed below.


3.3.1. Arduino Uno


The Arduino Uno shown in Figure 3.3 is an open-source microcontroller board based on the Microchip
ATmega328P microcontroller, developed by Arduino.cc and initially released in 2010. The board is
equipped with sets of digital and analog input/output (I/O) pins that may be interfaced to various
expansion boards (shields) and other circuits. It has 14 digital I/O pins (six capable of PWM
output) and 6 analog input pins, and is programmable with the Arduino IDE (Integrated Development
Environment) via a Type-B USB cable. It can be powered by the USB cable or by an external 9-volt
battery, and accepts voltages between 7 and 20 volts. It is similar to the Arduino Nano and
Leonardo. The hardware reference design is distributed under a Creative Commons Attribution
Share-Alike 2.5 license and is available on the Arduino website; layout and production files for
some versions of the hardware are also available.

Figure 3.3: Arduino Uno microcontroller

3.3.2. ADXL335 Sensor

The ADXL335 sensor shown in Figure 3.4 is a small, low-power, three-axis accelerometer that
is used to measure acceleration in a variety of applications. It is a MEMS
(microelectromechanical system) device that uses tiny, flexible, silicon structures to measure
acceleration. The ADXL335 contains three separate accelerometer sensors, one for each axis:
X, Y, and Z. Each sensor is a small, thin, square chip that contains a tiny mass suspended by
small springs. When the accelerometer is moved, the mass moves with it, and the springs bend
slightly, creating a small electrical signal that is proportional to the acceleration.

The ADXL335 also includes built-in signal conditioning circuitry that converts the raw sensor
signals into an analog voltage output for each axis that can be read by a microcontroller or other
device with analog inputs. The output voltage varies proportionally to the acceleration in each
axis, with zero acceleration corresponding to a voltage of approximately 1.5 V (about half the
supply voltage at a 3 V supply).


Figure 3.4: ADXL335 sensor

To use the ADXL335, you will typically connect it to a microcontroller or other digital device
using its analog voltage output. The microcontroller can then read the voltage and convert it
into acceleration values using a simple formula based on the device's sensitivity and resolution.
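
As a hedged illustration of that formula (the 1.5 V zero-g level and 300 mV/g sensitivity are typical datasheet values for a 3 V supply, the 5 V ADC reference assumes an Arduino Uno, and the exact figures should come from calibration):

    // Hypothetical conversion of one ADXL335 axis reading to acceleration in g.
    const float ADC_REF_V     = 5.0;    // Uno analog reference (assumption)
    const float ZERO_G_V      = 1.5;    // output at 0 g, typical for a 3 V supply
    const float SENS_V_PER_G  = 0.30;   // ~300 mV per g, typical value

    float readAxisG(int pin) {
      float volts = analogRead(pin) * ADC_REF_V / 1023.0;  // ADC counts -> volts
      return (volts - ZERO_G_V) / SENS_V_PER_G;            // volts -> g
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      Serial.println(readAxisG(A0), 2);   // X-axis acceleration, 2 decimals
      delay(200);
    }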

The ADXL335 can be used in a variety of applications, including robotics, gaming controllers,
and motion sensing devices. Its low power consumption, small size, and ease of use make it a
popular choice for many projects.

3.3.3. RF Module

An RF (Radio Frequency) module shown in Figure 3.5 is a small electronic device that uses
radio waves to transmit and receive data wirelessly. RF modules consist of a transmitter and a
receiver, which communicate with each other by sending and receiving radio signals.

Figure 3.5: RF Module

Here is a basic overview of how an RF module works:


Transmitter: The transmitter module consists of an RF oscillator circuit, a modulator circuit,


and an antenna. The modulator circuit modulates the signal that needs to be transmitted onto
the RF carrier frequency, and the RF oscillator generates the carrier signal. The carrier signal
and the modulated signal are then combined and sent to the antenna for transmission.

Receiver: The receiver module consists of an antenna, an RF amplifier, a demodulator circuit,


and a decoder circuit. The antenna receives the radio signals that are transmitted by the
transmitter module, and the RF amplifier amplifies the signal. The demodulator circuit extracts
the modulated signal from the carrier frequency, and the decoder circuit decodes the signal to
its original form.

Overall, the transmitter module converts the signal that needs to be transmitted into a
modulated RF signal and sends it to the antenna, while the receiver module receives the
modulated RF signal and decodes it back into its original form.

RF modules are used in a variety of applications, including remote control systems, wireless
communication systems, and security systems. The range of an RF module depends on several
factors, including the frequency used, the output power, and the presence of obstacles between
the transmitter and receiver.

3.3.4. L293D Motor driver module

Figure 3.6: L293D Motor driver module


The L293D motor driver is one of the most widely used drivers for bidirectional motor-driving
applications. The L293D IC allows a DC motor to be driven in either direction. It is a 16-pin IC
that can control a set of two DC motors simultaneously in any direction; in other words, two DC
motors can be controlled with a single L293D IC, because it contains two H-bridge circuits. The
L293D can drive small motors and fairly large motors as well.

Here is a brief overview of how the L293D motor driver works:

The L293D has two separate H-bridge circuits, one for each motor channel. Each H-bridge consists of
four transistor switches arranged so that the direction of current through the motor, and hence its
direction of rotation, can be controlled. To control a DC motor with the L293D, the motor is
connected to the motor outputs and a control signal is applied to the input pins; the input pins
switch the transistors in the H-bridge circuit to set the direction and speed of the motor.

The following are the basic steps involved in driving a motor with the L293D:

• Connect the motor to the motor outputs on the L293D. The motor outputs are labeled
as Out 1, Out 2 for Channel 1 and Out 3, Out 4 for Channel 2.
• Connect the power supply to the L293D. The L293D has a separate power supply pin
(VCC1) for the logic circuitry and a separate power supply pin (VCC2) for the motors.
• Apply a logic signal to the input pins to control the motor. The input pins are labeled
as In1, In2 for Channel 1 and In3, In4 for Channel 2. By applying different
combinations of logic signals to these pins, you can control the direction and speed of
the motor.
When the appropriate control signals are applied to the input pins, the L293D switches the
transistors in the H-bridge circuit so that current flows through the motor in the desired
direction. By controlling the timing and sequence of these switches, the L293D can control the
speed and direction of the motor.
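
A hedged single-channel sketch of these steps is shown below; the pin assignments are illustrative assumptions, and the enable pin is driven with PWM so that the same example also shows speed control.

    // Hypothetical example for one L293D channel: IN1/IN2 choose the
    // direction, EN1 (PWM) sets the speed. Pin numbers are assumptions.
    const int EN1 = 9;   // enable 1, on a PWM-capable pin
    const int IN1 = 4;
    const int IN2 = 5;

    void setup() {
      pinMode(EN1, OUTPUT);
      pinMode(IN1, OUTPUT);
      pinMode(IN2, OUTPUT);
    }

    void loop() {
      analogWrite(EN1, 200);        // ~78% duty cycle -> motor speed
      digitalWrite(IN1, HIGH);      // IN1=HIGH, IN2=LOW  -> run one way
      digitalWrite(IN2, LOW);
      delay(2000);
      digitalWrite(IN1, LOW);       // IN1=LOW, IN2=HIGH  -> run the other way
      digitalWrite(IN2, HIGH);
      delay(2000);
      digitalWrite(IN1, LOW);       // IN1=IN2 -> stop
      digitalWrite(IN2, LOW);
      delay(1000);
    }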

Overall, the L293D motor driver provides a convenient and efficient way to control DC motors
in a variety of applications, including robotics, automation, and motor control systems.

3.5 SOFTWARE REQUIREMENTS


The proposed system consists of hardware and software requirements; the software components
used are listed below.


3.5.1 ARDUINO IDE


The Arduino Integrated Development Environment – or Arduino Software (IDE) – contains a
text editor for writing code, a message area, a text console, a toolbar with buttons for common
functions and a series of menus. It connects to the Arduino hardware to upload programs and
communicate with them. Programs written using Arduino Software (IDE) are called sketches.
These sketches are written in the text editor and are saved with the file extension .ino. The
editor has features for cutting/pasting and for searching/replacing text. The message area gives
feedback while saving and exporting and also displays errors. The console displays text output
by the Arduino Software (IDE), including complete error messages and other information. The
bottom right-hand corner of the window displays the configured board and serial port. The
toolbar buttons allow you to verify and upload programs, create, open, and save sketches, and
open the serial monitor.

Figure 3.10: Arduino IDE


CHAPTER 4

RESULT AND DISCUSSION

This chapter elaborates on the results and output of the developed prototype.


4.1 PROTOTYPE

Figure 4.1 shows the top view of the proposed prototype. It is a hand gesture-controlled robot
developed using Arduino technology as described in the previous chapters. As shown in Figure 4.1,
the main components of the developed model are the Arduino UNO microcontroller, the ADXL335
accelerometer, the L293D motor driver, and the RF pair module.

Figure 4.1: Top view of the prototype, showing the transmitter end and the receiver end

The two parts of the prototype, namely the transmitter end and the receiver end, are shown in
Figure 4.1; the resulting working of the prototype is explained below.


Forward Movement

Figure 4.2: Forward tilt of the glove and the resulting forward movement of the robot

In Figure 4.2, the forward movement of the hand gesture-controlled robot is pictured. When the
glove carrying the transmitter part of the circuit is tilted forward, the accelerometer senses the
tilt, and when the tilt angle reaches the configured degree a forward command is sent from the
transmitter part to the receiver part. This command communication is handled by the RF module. At
the receiver, once the command is received it is passed to the Arduino UNO microcontroller, and the
forward movement of the robot is initiated with the help of the L293D motor driver and the
battery-operated motors.
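
For illustration, a hypothetical check of that configured degree on the transmitter side might look like the following; the 30-degree threshold, the pin mapping, and the calibration constants are placeholders, since the report does not state the values actually used.

    // Hypothetical forward-tilt check on the transmitter side. Threshold,
    // pins, and calibration constants are placeholders, not project values.
    const float FORWARD_DEG   = 30.0;   // "configured degree" (assumed)
    const float ADC_REF_V     = 5.0;
    const float ZERO_G_V      = 1.5;
    const float SENS_V_PER_G  = 0.30;

    float axisG(int pin) {
      return (analogRead(pin) * ADC_REF_V / 1023.0 - ZERO_G_V) / SENS_V_PER_G;
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      float yG = axisG(A1);                        // Y axis: forward/backward
      float zG = axisG(A2);                        // Z axis: gravity when level
      float pitchDeg = atan2(yG, zG) * 180.0 / PI; // tilt angle of the glove
      if (pitchDeg > FORWARD_DEG) {
        Serial.println('F');                       // forward command would be sent
      }
      delay(100);
    }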

Backward Movement

Figure 4.3: Backward tilt of the glove and the resulting backward movement of the robot

In Figure 4.3, the backward movement of the hand gesture-controlled robot is pictured. When the
glove carrying the transmitter part of the circuit is tilted backward, the accelerometer senses the
tilt, and when the tilt angle reaches the configured degree a backward command is sent from the
transmitter part to the receiver part. This command communication is handled by the RF module. At
the receiver, once the command is received it is passed to the Arduino UNO microcontroller, and the
backward movement of the robot is initiated with the help of the L293D motor driver and the
battery-operated motors.


Right Movement

Figure 4.4: Right tilt of the glove and the resulting right movement of the robot

In Figure 4.4, the right movement of the hand gesture-controlled robot is pictured. When the glove
carrying the transmitter part of the circuit is tilted towards the right, the accelerometer senses
the tilt, and when the tilt angle reaches the configured degree a right command is sent from the
transmitter part to the receiver part. This command communication is handled by the RF module. At
the receiver, once the command is received it is passed to the Arduino UNO microcontroller, and the
right movement of the robot is initiated with the help of the L293D motor driver and the
battery-operated motors.

Left Movement

Figure 4.5: Left tilt of the glove and the resulting left movement of the robot
In Figure 4.5, the left movement of the hand gesture-controlled robot is pictured. When the glove
carrying the transmitter part of the circuit is tilted towards the left, the accelerometer senses
the tilt, and when the tilt angle reaches the configured degree a left command is sent from the
transmitter part to the receiver part. This command communication is handled by the RF module. At
the receiver, once the command is received it is passed to the Arduino UNO microcontroller, and the
left movement of the robot is initiated with the help of the L293D motor driver and the
battery-operated motors.


CHAPTER 5

CONCLUSION AND FUTURE SCOPE

This chapter includes the conclusion and future scope of the project.

5.1 Conclusion

The hand gesture-controlled robot has many potential applications in the future. It can be used
in various industries, such as manufacturing, healthcare, education, defense and entertainment.
This robot can be used to automate processes, reduce the need for manual labor, and improve
safety. In addition, it can be used to help with tasks such as picking up objects or aiding people
with disabilities. With advances in technology, the potential for hand gesture-controlled robots
is only increasing.

5.2 Future scope

The future scope of hand gesture-controlled robots is vast. With advancements in technology,
these robots can be used in a variety of fields such as healthcare, education, manufacturing,
and even in the home. For example, they can be used to help elderly and disabled people with
daily tasks, they can be used to assist in surgeries, they can be used to teach students, and they
can be used to automate manufacturing processes. In the home, they can be used to control
lights, appliances, and other electronic devices. The possibilities are endless.


Publication

1. Ruthvik B J, Shreesha Hanagud, Manoj N, Thejas Gowda R, Dr Suchithra M, “Hand Gesture


Controlled Robot”, International Conference on advances in Engineering and Technology for
Intelligent systems (ICAETIS 2023), Dayanand Sagar College of Engineering, Bangalore
(submitted).


References

[1] Benjula Anbu Malar M B, Praveen R, Kavi Priya K P, "Hand Gesture Control Robot", IJITE,
Volume 9, Issue 2, December 2019.
[2] Viren Sugandh, "Hand Based Gesture Controlled Robot", IRJET, Volume 5, Issue 9, September 2018.
[3] Bharath Kumar, Aravind Reddy, Anne Gowda, "Hand Gesture Control Car", Journal of Xi'an
University of Architecture & Technology, Volume XII, Issue V, 2020.
[4] Rutwik Shah, Vinay Deshmukh, Shatakshi Mulay, Viraj Kulkarni, Madhuri Pote, "Hand Gesture
Control Car", IJERT, Volume 8, Issue 5, 2020.
[5] Kathiravan Sekar, Ramarajan Thileeban, Vishnuram Rajkumar, Sri Sudharshan Bedhu Sembian,
"Hand Based Gesture Controlled Robot", IJERT, Volume 9, Issue 11, November 2020.
[6] Amudhan Rajarajan, Sakthivel Murugesan, "Gesture Controlled Robot Using Arduino", Bulletin of
Pure and Applied Sciences, Volume 37F, 2018.
[7] Dr. M. C. Hanumantharaju, Navya Sri N, Lepakshi V, Mansi K K, Prajwal Pawar, "Hand Gesture
Controlled Robot Based on Military Purpose", IRJMET, Volume 4, Issue 6, June 2022.
[8] Deepesh Tewari, Deepesh Darmwal, Mudit Gupta, A P Cheema, "Hand Gesture Controlled Robot",
AITS, Volume 3, Issue 8, March 2018.
[9] Suryarajsinh T. Vala, Student, Dept. of Electronics and Communication, Chandubhai S. Patel
Institute of Technology, IJRESM, Volume 1, Issue 11, February 2018.
[10] Ms. Asmita Jadhav, Ms. Deepika Pawar, Ms. Kashmira Pathare, Ms. Prachi Sale, IJRASET,
Volume 6, Issue 3, March 2018.
[11] Prof. Chetan Bulla, Ranjana Shebani, Bhoomika Balegar, Deepa Nandihalli, Kalagouda Patil,
"Hand Gesture Controlled Robot", IJRECE, Volume 7, Issue 2, June 2019.
[12] V. Sathananthavathi, C. Arthika, "Hand Gesture Controlled Robot", IJIRT, Volume 4, Issue 8,
January 2018.

