VIDYAVARDHAKA COLLEGE OF ENGINEERING
Mysuru – 570002
Department of Electronics and Communication Engineering
CERTIFICATE
This is to certify that the work entitled “HAND GESTURE CONTROLLED
ROBOT” is a bonafide work carried out by Ruthvik B J (4VV20EC126), Shreesha
Hanagud (4VV20EC138), Thejas Gowda R (4VV20EC165) and Manoj N
(4VV20EC088) in partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in Electronics and Communication Engineering of Visvesvaraya
Technological University, Belagavi, during the year 2022-2023.
It is certified that all corrections / suggestions indicated during internal assessment have
been incorporated in the report. The mini project report has been approved as it satisfies the
academic requirements in respect of the mini project work prescribed for the Bachelor of
Engineering degree.
_________________ _______________
DECLARATION
We the members of the mini project team, studying in the V semester of Electronics and
Communication Engineering, Vidyavardhaka College of Engineering, hereby declare that the
entire mini project titled “HAND GESTURE CONTROLLED ROBOT” has been carried
out by us independently under the guidance of Dr. Suchitra M, Professor, Department of
Electronics and Communication Engineering, Vidyavardhaka College of Engineering. This
mini project work is submitted to the Visvesvaraya Technological University, Belagavi, in
partial fulfilment of the requirements for the award of the degree of Bachelor of Engineering in
Electronics and Communication Engineering during the academic year 2022-2023.
This mini project report has not been submitted previously for the award of any other degree
or diploma to any other Institution or University.
Place: Mysuru
Date: 21-03-23
1. RUTHVIK B J 4VV20EC126
2. SHREESHA HANAGUD 4VV20EC138
3. THEJAS GOWDA R 4VV20EC165
4. MANOJ N 4VV20EC088
ACKNOWLEDGEMENT
The satisfaction that accompanies the successful completion of any task would be incomplete
without mentioning the people who made it possible and whose constant guidance and
encouragement crowned our efforts with success. We consider it our privilege to express our
gratitude and respect to all those who guided and inspired us in the completion of this
mini project.
We wish to express our gratitude to Dr. B. Sadashive Gowda, Principal, VVCE, for
providing a congenial working environment.
We are thankful to Dr. C. M. Patil, Professor and Head, Dept. of ECE, VVCE, for
motivating us and allowing us to use the logistics of the department to complete this mini
project successfully.
We express our sincere thanks to our guide Dr. Suchitra M, Department of ECE,
VVCE, for her constant co-operation, support, and invaluable suggestions.
We extend our heartfelt gratitude to our mini project coordinators Dr. Suchitra M,
Professor, Dept of ECE, VVCE and Kavyashree B, Assistant Professor, Dept of ECE, VVCE
for their timely support and motivation.
We would like to thank our parents for their constant moral support throughout the
completion of this mini project.
Finally, we would like to extend our deep sense of gratitude to our friends, who always
inspired and encouraged us throughout the completion of this mini project.
1. RUTHVIK B J
2. SHREESHA HANAGUD
3. THEJAS GOWDA R
4. MANOJ N
ABSTRACT
This report presents an IoT-based hand gesture-controlled robot. The IoT concept can deliver
comfortable, convenient use of devices in daily life as well as in industry. The proposed gadget
uses an accelerometer-based sensor to detect hand gestures, which are then transmitted
wirelessly to the robot via an RF transmitter module. The robot carries an RF receiver
module that receives the data, and the Arduino Uno microcontroller interprets the gesture
data and takes the corresponding action through the L293D motor driver, such as moving
forward, moving backward, or turning left or right. Hand gestures provide a more natural way
for users to interact with robots than traditional methods such as buttons or joysticks.
This device is a prototype that can simplify tasks, especially in growing industries where
humans are being replaced by robots. The results show that the proposed approach is effective
and accurate in controlling a robot using hand gestures, paving the way for better
gesture-controlled robot systems.
TABLE OF CONTENTS
Declaration i
Acknowledgements ii
Abstract iii
Chapter 1: Introduction
Chapter 2: Literature Survey
Chapter 3: Methodology
Chapter 4: Results
Chapter 5: Conclusion and Future Scope
Publication
References
CHAPTER 1
INTRODUCTION
1.1 Background
The concept of controlling robots using hand gestures has been explored for many years. It is
a natural way for humans to interact with machines, and can make robotics more accessible to
people who have difficulty using traditional input devices. In recent years, the development of
low-cost microcontrollers such as Arduino Uno has made it easier for hobbyists, students, and
researchers to experiment with this technology and develop their own hand gesture-controlled
robots.
The use of Arduino Uno microcontroller boards in robotics projects has become increasingly
popular due to their ease of use, low cost, and availability of open-source libraries and tutorials.
Arduino Uno is a versatile microcontroller board that can be programmed to interface with a
wide range of sensors, motors, and other electronic components.
There are various sensors that can be used to detect hand gestures, such as accelerometers,
gyroscopes, flex sensors, and camera-based computer vision systems. These sensors are used
to collect data on the user's hand movements, which is then processed and translated into
commands for the robot.
The software for hand gesture-controlled robots using Arduino Uno is typically written in C++
using the Arduino IDE. The software reads sensor data, interprets the user's hand gestures, and
sends commands to the robot to perform specific tasks. This can include controlling the robot's
movement, activating sensors or other devices, and responding to user input.
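To make this read–interpret–command structure concrete, the short sketch below is a minimal illustrative example (not the exact firmware of this project) that reads two ADXL335 axes, classifies the tilt against simple thresholds, and prints the resulting command character over Serial for bench testing. The analog pins, the neutral ADC reading, and the threshold margin are assumptions that would need calibration on real hardware.

```cpp
// Minimal gesture-interpretation sketch for bench testing over Serial.
// Pin assignments, MID and MARGIN are illustrative assumptions.
const int X_PIN  = A0;   // ADXL335 X-axis analog output (assumed wiring)
const int Y_PIN  = A1;   // ADXL335 Y-axis analog output (assumed wiring)
const int MID    = 338;  // approximate ADC reading at 0 g (calibrate in practice)
const int MARGIN = 30;   // tilt threshold in raw ADC counts (assumption)

char interpretGesture(int x, int y) {
  if (y > MID + MARGIN) return 'F';  // forward tilt
  if (y < MID - MARGIN) return 'B';  // backward tilt
  if (x > MID + MARGIN) return 'R';  // right tilt
  if (x < MID - MARGIN) return 'L';  // left tilt
  return 'S';                        // hand level: stop
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int x = analogRead(X_PIN);
  int y = analogRead(Y_PIN);
  Serial.println(interpretGesture(x, y));  // command that would be sent to the robot
  delay(100);
}
```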
Overall, hand gesture-controlled robots using Arduino Uno offer an exciting and accessible
way for hobbyists, students, and researchers to experiment with robotics and explore the
possibilities of human-robot interaction. With continued advancements in sensor technology
and machine learning algorithms, this technology has the potential to transform the way we
interact with machines and other electronic devices.
1.2 Introduction
With the Evolution of technology, the interaction with machine has increased day by day. From
Switching on Light from Smartphone to Controlling of vehicle. Technology has become a part
and parcel in one’s life. There are various ways to interact with machine like remote control,
joystick, etc. which is basic method. Nowadays, there be a lot of researches made on controlling
of system or device with human Gestures, i.e., eye movement, facial expression or hand
gestures which eliminates some barriers like language barrier for robot. Arduino UNO is used
to carry commands by interpreting the collection of data received from its interfaced devices.
The Device uses the accelerometer to detect the any motion in front of it and transmits the data
via Transceivers to receiving end and performs certain task using Arduino micro controller.
Recently, strong efforts have been made to develop intelligent and natural interfaces between
users and computer-based systems based on human gestures. Gestures provide an intuitive
interface to both human and computer. Thus, gesture-based interfaces can not only substitute
for common interface devices, but can also be exploited to extend their functionality. Robots
are playing an important role in automation across sectors such as construction, military,
medical, and manufacturing. After building some basic robots such as a line-follower robot
and a computer-controlled robot, we have developed this accelerometer-based gesture-controlled
robot using the Arduino Uno. In this project, hand motion is used to drive the robot; for this
purpose, we have used an accelerometer, which works on acceleration. A gesture-controlled
robot is controlled by the hand instead of by other methods such as buttons or a joystick;
one only needs to move the hand to control the robot. A transmitting device worn on the hand
contains an RF transmitter and an accelerometer. It transmits commands to the robot so that
it can perform the required task, such as moving forward, reversing, turning left, turning
right, or stopping, all through hand gestures.

The most important component here is the accelerometer. The accelerometer is a 3-axis
acceleration measurement device with a ±3 g range, built using a polysilicon surface sensor
and a signal-conditioning circuit to measure acceleration. Its output is analog in nature and
proportional to the acceleration. The device measures the static acceleration of gravity when
it is tilted and gives a result in the form of motion or vibration. According to the ADXL335
datasheet, the sensor is a polysilicon surface-micromachined structure placed on top of a
silicon wafer. Polysilicon springs suspend the structure over the surface of the wafer and
provide resistance against acceleration forces. Deflection of the structure is measured using
a differential capacitor that incorporates independent fixed plates and plates attached to
the moving mass. The fixed plates are driven
by 180° out-of-phase square waves. Acceleration deflects the moving mass and unbalances the
differential capacitor resulting in a sensor output whose amplitude is proportional to
acceleration. Phase-sensitive demodulation techniques are then used to determine the
magnitude and direction of the acceleration.
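As a worked example of how this proportional analog output can be turned into numbers, the sketch below (an illustrative sketch, not this project's code) converts each axis reading to acceleration in g and then to static tilt angles. The 1.65 V zero-g level and 330 mV/g sensitivity are typical datasheet figures that depend on the actual supply voltage, and the analog pins are assumed wiring.

```cpp
// Convert the ADXL335's ratiometric analog outputs to g and to static tilt angles.
// VREF, ZERO_G_VOLT and SENS_V_PER_G are assumptions; calibrate against the real board.
#include <math.h>

const float VREF         = 5.0;    // Arduino Uno ADC reference voltage (assumption)
const float ZERO_G_VOLT  = 1.65;   // output at 0 g, roughly half the sensor supply
const float SENS_V_PER_G = 0.330;  // typical sensitivity in volts per g

float readAxisG(int pin) {
  float volts = analogRead(pin) * VREF / 1023.0;
  return (volts - ZERO_G_VOLT) / SENS_V_PER_G;  // acceleration in g
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float gx = readAxisG(A0);
  float gy = readAxisG(A1);
  float gz = readAxisG(A2);
  // Static tilt angles from gravity alone (valid when the hand is not accelerating).
  float pitch = atan2(gy, sqrt(gx * gx + gz * gz)) * 180.0 / PI;
  float roll  = atan2(gx, sqrt(gy * gy + gz * gz)) * 180.0 / PI;
  Serial.print(pitch);
  Serial.print(" ");
  Serial.println(roll);
  delay(200);
}
```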
1.4 Objective
• To provide a means of controlling a robot's movements or actions through hand
gestures, rather than using traditional input devices like a remote controller or a
keyboard
• To provide a more accurate and natural way of interacting with the robot
1.5 Motivation
Our motivation to work on this project came from seeing a disabled person who was driving his
wheelchair by hand with quite a lot of difficulty. We therefore wanted to make a device that
would help such people drive their chairs without even having to touch the wheels.
CHAPTER 2
LITERATURE SURVEY
A literature survey plays an important role in the life cycle of any project. The purpose of
the survey is to arrive at a solution by understanding the shortcomings and obstacles in
existing systems; it also includes comparing existing and proposed methods. This chapter
summarizes related work on different methods of hand gesture-based robot control and states
the objectives of the present work.
The authors of [2] present hand gesture technology as a real-time system through which humans
interact with robots using gestures. They argue that hand gesture control is better than voice
control when accuracy is taken into consideration. The four major blocks in the architecture
of their robot are the input from the accelerometer, Arduino processing, calibration and
differentiation, and the output to the LDR and motor, with an RF module at the receiving end.
The recognition procedure involves calibration, end-point detection, smoothing and scaling,
recognition, and navigation. The outcome of the project is very user friendly, in that the
user can navigate the wireless robot in the environment using various gesture commands.
The authors of [3] define a robot as a device that can be used as a tool for making work
easier and effortless. They describe the main purpose of their project, which is to provide
a simpler robot hardware architecture built on powerful computational platforms, so that
robot designers can focus on their research and tests instead of on the Bluetooth connection
infrastructure. After observing the successful working of the prototype, the authors of [3]
list the advantages of hand gesture technology: replacement of manpower by digital control,
components and an experimental setup that cost much less than alternatives, handling and
operation that are easy and accurate, and no need for a skilled person to operate the controls.
The authors of [4] describe how the introduction of IoT, and the combination of IoT with
physical devices, makes life easier. They summarize the objectives of their work as connecting
and communicating with physical devices, faster and smarter innovation, and smart sensing
capabilities. Their prototype uses an accelerometer at the transmitter end and a motor driver
at the receiver end, communicating through an RF pair module, with the actions on both sides
controlled by an Arduino UNO microcontroller. The major limitations of the project are its
high power consumption and its inability to detect objects: if any object comes in front of
the robot, it does not stop or change direction. Adding ultrasonic sensors could overcome this
limitation.
The authors of [5] introduce a gesture-driven robotic vehicle. The prototype is developed in
such a way that the movements and manipulation of the robot, i.e., its handling and control,
depend on the user's gestures. In this system, the gesture is captured by an accelerometer
and processed by microcontroller software, and the parameters are sent to the microcontroller
and an encoder circuit. The paper also emphasizes how this approach can be an effective way
to address the problems faced by physically challenged persons. The project's main objective
is wireless locomotion of the robot, so that the product becomes easier to use and more
helpful for physically challenged people.
The authors of [6] compare their prototype with similar products already available and explain
the advantages of controlling the robot with hand gestures over methods such as a joystick or
a physical controller with buttons. The methodology of this prototype has three main elements:
the accelerometer sensor provides data to the Arduino; after the current parameters are
checked, the data is passed to the RF transmitter, which sends it to the receiver; and at the
receiver the data is decoded and the necessary signals are delivered to the motor controller
IC, which engages the robot's wheel motors. The accelerometer detects horizontal and vertical
tilting and generates continuous analog signal values. The objective of the project, as stated
in the paper, is to construct a robotic automobile directed by hand gestures through an
Arduino interface.
In some of the papers [7], the prototype is developed without any wireless technology: wires
are used to send the signals from the transmitter to the receiver. This wired control device
is built using an Arduino Uno and an accelerometer; the Arduino receives the analog input
values (x-axis, y-axis) from the accelerometer and converts them to digital values. As future
scope, the authors of [7] state that a wireless mode of data communication could be
implemented to make the robot work more effectively and easily.
The authors of [8] explain the internals of their prototype: its working is based on the three
axes of the accelerometer, and the robot moves in four directions, forward, backward, left,
and right. For sensing human motion, an infrared sensor is used, which responds to radiation
of around 790 nm wavelength from the human body. This type of robot is widely used in military
applications, industrial robotics, and the construction field. They used pick-and-place robots
for this process for the following reasons: flexibility is one of the main advantages of
robotic systems, pick-and-place robots are easily programmable using computer software, and
pick-and-place is a physically demanding application well suited to a robotic system.
Exploring this domain further, the authors of [9] explain the advantage of extending the use
of robots where conditions are uncertain, for example in security tasks, where a robot can be
made to follow the guidance of a human operator and execute the assignment. They divide the
project into two primary parts, the transmitter part and the receiver part. The transmitter
sends the signal according to the position of the accelerometer attached to the hand, and the
receiver picks up the signal and makes the robot move in the configured direction. The project
in this paper has an advantage over existing projects in that it additionally includes more
hand motions (for example, the bend and slice) in the interface to control the vehicle in a
more natural and effective way.
Similarly, the authors of [10] describe a motion-controlled robot that can be driven by normal
hand signals, with the project again divided into two primary parts, the transmitter part and
the receiver part. The transmitter sends the signal according to the position of the
accelerometer connected to the hand, and the receiver picks up the signal and makes the robot
move in the configured direction. The main objective of that paper is to present control of
the robot using the accelerometer with the assistance of human hand gestures or tilting.
In some papers [11], the authors explain a hand gesture-based control interface for navigating
a car robot. A 3-axis accelerometer records the user's hand gestures, and the data is
transmitted wirelessly via an RF module to a microcontroller. The gesture-controlled robot is
driven by the hand instead of by any other method such as buttons or a joystick. The paper
explains the importance of an easy, day-to-day method of controlling the robot using hand
gestures instead of conventional methods such as a joystick or voice control.
The authors of [12] also describe how conventional hand gestures can control a robot and
perform desired tasks. The transmitter sends the signal in line with the position of the
accelerometer and the hand gesture, and the receiver picks up the signal and makes the robot
move in the corresponding direction. Robots are playing a crucial role in automation across
sectors such as construction, military, medical, and manufacturing. Such a project can be very
helpful for physically challenged people, as they can move certain objects with less physical
effort. Moreover, human errors are reduced on a great scale and results are achieved with
great accuracy.
CHAPTER 3
METHODOLOGY
This chapter covers the project flow, system design, working, and the hardware and software
requirements.
3.2 WORKING
Transmitter end:
The user wears an ADXL335 accelerometer sensor on the hand, which is used to detect the
orientation and movement of the hand as shown in Figure 3.3. The accelerometer sends its data
to the Arduino Uno microcontroller, which processes it with the gesture-detection algorithm
programmed by the user to identify the specific hand gesture. The resulting command is then
transferred to the receiver end through the RF transmitter module.
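The exact transmitter firmware is not reproduced in this report, so the following is a hedged sketch of what it could look like. It assumes a generic 433 MHz ASK transmitter driven by the RadioHead library's RH_ASK class (transmit data on pin 12 of the Uno by default); the ADXL335 wiring, neutral ADC value, and tilt threshold are illustrative and would need calibration.

```cpp
// Transmitter-end sketch (illustrative): classify the tilt and send a one-byte command.
// Assumes the RadioHead RH_ASK driver for a 433 MHz ASK transmitter module.
#include <RH_ASK.h>
#include <SPI.h>   // required for RadioHead to compile on some cores; SPI itself is unused

RH_ASK rf;         // default pins: TX data on 12

const int X_PIN = A0, Y_PIN = A1;  // ADXL335 axes (assumed wiring)
const int MID = 338, MARGIN = 30;  // neutral ADC value and tilt threshold (assumptions)

char classify(int x, int y) {
  if (y > MID + MARGIN) return 'F';
  if (y < MID - MARGIN) return 'B';
  if (x > MID + MARGIN) return 'R';
  if (x < MID - MARGIN) return 'L';
  return 'S';
}

void setup() {
  rf.init();       // start the ASK driver
}

void loop() {
  char cmd = classify(analogRead(X_PIN), analogRead(Y_PIN));
  rf.send((uint8_t *)&cmd, 1);     // one-byte command: F/B/L/R/S
  rf.waitPacketSent();
  delay(100);
}
```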
Receiver end:
The RF receiver module receives the data sent from the transmitter end. Based on the hand
gesture detected in the received data, the Arduino Uno microcontroller sends a signal to the
L293D motor driver to control the motors of the robot. The L293D motor driver converts this
signal into a form that can drive the motors, and the motors move accordingly, allowing the
robot to move forward, backward, left, or right, or turn in different directions, depending
on the specific hand gesture detected.
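A matching receiver-end sketch, under the same assumptions (RadioHead RH_ASK, receive data on pin 11 by default), could look like the following. The L293D input and enable pin numbers and the PWM duty cycle are illustrative and simply need to match the actual wiring.

```cpp
// Receiver-end sketch (illustrative): receive a command byte and drive the L293D.
#include <RH_ASK.h>
#include <SPI.h>

RH_ASK rf;   // default pins: RX data on 11

// L293D inputs: IN1/IN2 drive the left motor, IN3/IN4 the right motor (assumed wiring).
const int IN1 = 4, IN2 = 5, IN3 = 6, IN4 = 7;
const int EN1 = 9, EN2 = 10;   // enable pins, PWM-capable on the Uno

void setMotors(int l1, int l2, int r1, int r2) {
  digitalWrite(IN1, l1); digitalWrite(IN2, l2);
  digitalWrite(IN3, r1); digitalWrite(IN4, r2);
}

void setup() {
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  pinMode(EN1, OUTPUT); pinMode(EN2, OUTPUT);
  analogWrite(EN1, 200); analogWrite(EN2, 200);  // fixed speed (assumption)
  rf.init();
}

void loop() {
  uint8_t buf[1];
  uint8_t len = sizeof(buf);
  if (!rf.recv(buf, &len)) return;               // nothing received yet
  switch ((char)buf[0]) {
    case 'F': setMotors(HIGH, LOW,  HIGH, LOW);  break;  // both motors forward
    case 'B': setMotors(LOW,  HIGH, LOW,  HIGH); break;  // both motors reverse
    case 'L': setMotors(LOW,  HIGH, HIGH, LOW);  break;  // spin left
    case 'R': setMotors(HIGH, LOW,  LOW,  HIGH); break;  // spin right
    default:  setMotors(LOW,  LOW,  LOW,  LOW);  break;  // stop
  }
}
```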
Overall, the hand gesture-controlled robot works by interpreting the orientation and
movement of the user's hand, and using this information to control the motors of the robot. It
provides a hands-free and intuitive way of controlling the robot's movement, which can be
useful in various applications such as home automation, industrial automation, medical fields,
and military fields.
The ADXL335 sensor shown in Figure 3.4 is a small, low-power, three-axis accelerometer used
to measure acceleration in a variety of applications. It is a MEMS (microelectromechanical
system) device that uses tiny, flexible silicon structures to measure acceleration. The
ADXL335 senses acceleration along three axes: X, Y, and Z. The sensing element is a tiny mass
suspended by small springs; when the accelerometer is moved, the mass moves with it and the
springs bend slightly, creating a small electrical signal that is proportional to the
acceleration.

The ADXL335 also includes a built-in signal conditioning circuit that converts the capacitance
changes from the sensing element into analog voltage outputs that can be read by a
microcontroller or other device. It provides a voltage output on each axis that varies
proportionally to the acceleration, with zero acceleration corresponding to a voltage of
approximately half the supply voltage (about 1.5 V).
To use the ADXL335, you will typically connect it to a microcontroller or other digital device
using its analog voltage output. The microcontroller can then read the voltage and convert it
into acceleration values using a simple formula based on the device's sensitivity and resolution.
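One practical detail of that conversion is each axis's zero-g reading, which varies from board to board and with supply voltage. The hedged example below (not taken from this project) averages the resting readings at power-up, corrects the Z axis for the 1 g it sees while lying flat, and then reports acceleration in g; the pins and the counts-per-g figure are assumptions.

```cpp
// Estimate per-axis zero-g offsets at power-up, then report acceleration in g.
// Assumes the sensor is held flat and still during setup().
const int AXIS_PINS[3] = {A0, A1, A2};   // X, Y, Z analog outputs (assumed wiring)
float zeroOffset[3];                     // raw ADC counts at 0 g, per axis
const float COUNTS_PER_G = 67.0;         // ~330 mV/g with a 5 V, 10-bit ADC (assumption)

void setup() {
  Serial.begin(9600);
  for (int a = 0; a < 3; a++) {
    long sum = 0;
    for (int i = 0; i < 100; i++) {      // average 100 resting samples
      sum += analogRead(AXIS_PINS[a]);
      delay(2);
    }
    zeroOffset[a] = sum / 100.0;
  }
  zeroOffset[2] -= COUNTS_PER_G;         // Z axis rests at +1 g, not 0 g
}

void loop() {
  for (int a = 0; a < 3; a++) {
    float g = (analogRead(AXIS_PINS[a]) - zeroOffset[a]) / COUNTS_PER_G;
    Serial.print(g);
    Serial.print(' ');
  }
  Serial.println();
  delay(200);
}
```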
The ADXL335 can be used in a variety of applications, including robotics, gaming controllers,
and motion sensing devices. Its low power consumption, small size, and ease of use make it a
popular choice for many projects.
3.4.3. RF Module
An RF (Radio Frequency) module shown in Figure 3.5 is a small electronic device that uses
radio waves to transmit and receive data wirelessly. RF modules consist of a transmitter and a
receiver, which communicate with each other by sending and receiving radio signals.
Overall, the transmitter module converts the signal that needs to be transmitted into a
modulated RF signal and sends it to the antenna, while the receiver module receives the
modulated RF signal and decodes it back into its original form.
RF modules are used in a variety of applications, including remote control systems, wireless
communication systems, and security systems. The range of an RF module depends on several
factors, including the frequency used, the output power, and the presence of obstacles between
the transmitter and receiver.
3.4.4 L293D Motor Driver
Two DC motors can be driven with a single L293D IC because it contains two H-bridge circuits.
The L293D can drive small as well as fairly big motors.

The L293D has two separate H-bridge circuits, one for each motor channel. Each H-bridge
consists of four transistors arranged in a configuration that controls the direction of the
motor. To control a DC motor using the L293D, you typically connect the motor to the motor
outputs of the L293D and provide control signals to the input pins. The input pins switch the
four transistors in the H-bridge circuit to control the direction of the motor, while the
enable pins set its speed.
The following are the basic steps involved in driving a motor with the L293D:
• Connect the motor to the motor outputs on the L293D. The motor outputs are labeled
as Out 1, Out 2 for Channel 1 and Out 3, Out 4 for Channel 2.
• Connect the power supply to the L293D. The L293D has a separate power supply pin
(VCC1) for the logic circuitry and a separate power supply pin (VCC2) for the motors.
• Apply a logic signal to the input pins to control the motor. The input pins are labeled
as In1, In2 for Channel 1 and In3, In4 for Channel 2. By applying different
combinations of logic signals to these pins, you can control the direction and speed of
the motor.
When the appropriate control signals are applied to the input pins, the L293D switches the
transistors in the H-bridge circuit to allow current to flow through the motor in the
desired direction. By controlling the timing and sequence of these switches, the L293D can
control the speed and direction of the motor.
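To make the switching behaviour concrete, here is a small illustrative helper for one L293D channel (pin numbers are assumptions, not this project's wiring): IN1/IN2 select the direction, and the enable pin, driven with PWM, sets the speed.

```cpp
// Drive one L293D channel: IN1/IN2 choose direction, the enable pin (PWM) sets speed.
const int EN_1 = 9;   // Enable 1,2 pin of the L293D (PWM-capable pin, assumed wiring)
const int IN_1 = 4;   // Input 1
const int IN_2 = 5;   // Input 2

void driveChannel1(int speedPwm, bool forward) {
  digitalWrite(IN_1, forward ? HIGH : LOW);
  digitalWrite(IN_2, forward ? LOW  : HIGH);
  analogWrite(EN_1, speedPwm);   // 0 = stopped, 255 = full speed
}

void stopChannel1() {
  analogWrite(EN_1, 0);          // disabling the channel lets the motor coast
}

void setup() {
  pinMode(IN_1, OUTPUT);
  pinMode(IN_2, OUTPUT);
  pinMode(EN_1, OUTPUT);
}

void loop() {
  driveChannel1(200, true);      // forward at roughly 78% duty
  delay(2000);
  driveChannel1(200, false);     // reverse
  delay(2000);
  stopChannel1();
  delay(1000);
}
```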
Overall, the L293D motor driver provides a convenient and efficient way to control DC motors
in a variety of applications, including robotics, automation, and motor control systems.
CHAPTER 4
RESULTS
This chapter elaborates on the results and output of the developed prototype.
Figure 4.1 shows the top view of the proposed prototype. It is a hand gesture-controlled robot
developed using Arduino technology, as described in the previous chapters. As shown in Figure
4.1, the main components of the developed model are the Arduino UNO microcontroller, the
ADXL335 accelerometer, the L293D motor driver, and the RF pair module.
Figure 4.1: Top view of the prototype, showing the transmitter end and the receiver end
The two parts of the prototype, namely the transmitter end and the receiver end, are shown in
Figure 4.1. The working of the prototype is explained below.
Forward Movement
Figure 4.2: Forward tilt of the glove and the resulting forward movement of the robot
Figure 4.2 pictures the forward movement of the hand gesture-controlled robot. Once the glove
carrying the transmitter part of the circuit is tilted forward, the accelerometer senses the
tilt, and when the tilt angle reaches the configured degree, a forward command is sent from
the transmitter part to the receiver part. This command is communicated by the RF module. At
the receiver, once the command is received, it is passed to the Arduino UNO microcontroller,
and the forward movement of the robot is initiated with the help of the L293D motor driver
and the battery-operated motors.
Backward Movement
Backward tilt
Backward movement
Figure 4.3
Figure 4.3 pictures the backward movement of the hand gesture-controlled robot. Once the glove
carrying the transmitter part of the circuit is tilted backward, the accelerometer senses the
tilt, and when the tilt angle reaches the configured degree, a backward command is sent from
the transmitter part to the receiver part. This command is communicated by the RF module. At
the receiver, once the command is received, it is passed to the Arduino UNO microcontroller,
and the backward movement of the robot is initiated with the help of the L293D motor driver
and the battery-operated motors.
Right Movement
Figure 4.4: Right tilt of the glove and the resulting right movement of the robot
Figure 4.4 pictures the right movement of the hand gesture-controlled robot. Once the glove
carrying the transmitter part of the circuit is tilted towards the right, the accelerometer
senses the tilt, and when the tilt angle reaches the configured degree, a right command is
sent from the transmitter part to the receiver part. This command is communicated by the RF
module. At the receiver, once the command is received, it is passed to the Arduino UNO
microcontroller, and the right movement of the robot is initiated with the help of the L293D
motor driver and the battery-operated motors.
Left Movement
Figure 4.5: Left tilt of the glove and the resulting left movement of the robot
Figure 4.5 pictures the left movement of the hand gesture-controlled robot. Once the glove
carrying the transmitter part of the circuit is tilted towards the left, the accelerometer
senses the tilt, and when the tilt angle reaches the configured degree, a left command is
sent from the transmitter part to the receiver part. This command is communicated by the RF
module. At the receiver, once the command is received, it is passed to the Arduino UNO
microcontroller, and the left movement of the robot is initiated with the help of the L293D
motor driver and the battery-operated motors.
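The "configured degree" check described for each movement can be summarised in one short sketch. The example below is illustrative only: the 20° threshold, the sensor constants, and the axis-to-direction mapping are assumptions, but it shows how pitch and roll derived from the accelerometer map onto the five commands.

```cpp
// Derive pitch/roll from the ADXL335 and compare them against a configured tilt angle.
#include <math.h>

const float TILT_DEG = 20.0;   // configured tilt threshold in degrees (assumption)
const float VREF = 5.0, ZERO_G = 1.65, SENS = 0.330;   // ADXL335 constants (assumptions)

float axisG(int pin) {
  return (analogRead(pin) * VREF / 1023.0 - ZERO_G) / SENS;
}

char commandFromTilt() {
  float gx = axisG(A0), gy = axisG(A1), gz = axisG(A2);
  float pitch = atan2(gy, sqrt(gx * gx + gz * gz)) * 180.0 / PI;
  float roll  = atan2(gx, sqrt(gy * gy + gz * gz)) * 180.0 / PI;
  if (pitch >  TILT_DEG) return 'F';   // forward tilt
  if (pitch < -TILT_DEG) return 'B';   // backward tilt
  if (roll  >  TILT_DEG) return 'R';   // right tilt
  if (roll  < -TILT_DEG) return 'L';   // left tilt
  return 'S';                          // within threshold: stop
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(commandFromTilt());   // command the transmitter would send
  delay(100);
}
```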
CHAPTER 5
CONCLUSION AND FUTURE SCOPE
5.1 Conclusion
The hand gesture-controlled robot has many potential applications in the future. It can be used
in various industries, such as manufacturing, healthcare, education, defense and entertainment.
This robot can be used to automate processes, reduce the need for manual labor, and improve
safety. In addition, it can be used to help with tasks such as picking up objects or aiding people
with disabilities. With advances in technology, the potential for hand gesture-controlled robots
is only increasing.
5.2 Future Scope
The future scope of hand gesture-controlled robots is vast. With advancements in technology,
these robots can be used in a variety of fields such as healthcare, education, manufacturing,
and even in the home. For example, they can be used to help elderly and disabled people with
daily tasks, they can be used to assist in surgeries, they can be used to teach students, and they
can be used to automate manufacturing processes. In the home, they can be used to control
lights, appliances, and other electronic devices. The possibilities are endless.
Publication
References
[1] Benjula Anbu Malar M B, Praveen R, Kavi Priya K P, "Hand Gesture Control Robot," IJITE, Volume 9, Issue 2, December 2019.
[2] Viren Sugandh, "Hand Based Gesture Controlled Robot," IRJET, Volume 5, Issue 9, September 2018.
[3] Bharath Kumar, Aravind Reddy, Anne Gowda, "Hand Gesture Control Car," Journal of Xi'an University of Architecture & Technology, Volume XII, Issue V, 2020.
[4] Rutwik Shah, Vinay Deshmukh, Shatakshi Mulay, Viraj Kulkarni, Madhuri Pote, "Hand Gesture Control Car," IJERT, Volume 8, Issue 5, 2020.
[5] Kathiravan Sekar, Ramarajan Thileeban, Vishnuram Rajkumar, Sri Sudharshan Bedhu Sembian, "Hand Based Gesture Controlled Robot," IJERT, Volume 9, Issue 11, November 2020.
[6] Amudhan Rajarajan, Sakthivel Murugesan, "Gesture Controlled Robot Using Arduino," Bulletin of Pure and Applied Sciences, Volume 37F, 2018.
[7] M. C. Hanumantharaju, Navya Sri N, Lepakshi V, Mansi K K, Prajwal Pawar, "Hand Gesture Controlled Robot Based on Military Purpose," IRJMET, Volume 4, Issue 6, June 2022.
[8] Deepesh Tewari, Deepesh Darmwal, Mudit Gupta, A. P. Cheema, "Hand Gesture Controlled Robot," AITS, Volume 3, Issue 8, March 2018.
[9] Suryarajsinh T. Vala, Dept. of Electronics and Communication, Chandubhai S. Patel Institute of Technology, IJRESM, Volume 1, Issue 11, February 2018.
[11] Chetan Bulla, Ranjana Shebani, Bhoomika Balegar, Deepa Nandihalli, Kalagouda Patil, "Hand Gesture Controlled Robot," IJRECE, Volume 7, Issue 2, June 2019.