Chapter · January 2021
DOI: 10.1007/978-981-15-8530-2_45
10/20/2020 e.Proofing | Springer
Hand Gesture Recognition Control for
Computers Using Arduino
J. S. Vimali¹ ✉ ([email protected])
Senduru Srinivasulu¹ ([email protected])
J. Jabez¹
S. Gowri¹
¹ Department of Information Technology, Sathyabama Institute of Science and Technology, Chennai, India
Abstract
As time passes, new technologies come into existence. One of them is hand gesture control, used by robots: hand gestures serve as input commands to these robots, and such gesture controls are also used as input devices for computer systems. The existing systems, however, are costly and unaffordable for general users. The idea proposed in this article is to build a cheaper hand gesture controller using the Arduino board. Input is given through a pair of ultrasonic sensors, which measure the changes in the distance of the user's hand while a gesture is performed. The Arduino computes these distance changes, recognizes the action performed by the user, and requests the system processor to carry out the requested action. With this proposed model, the time taken to perform basic operations such as open, minimize, maximize, play, and save is reduced.
Keywords
Hand gestures
Arduino UNO
Ultrasonic sensors
Python
PyAutoGUI
1. Introduction
Hands-free operation of electronic devices is relatively new. We are surrounded by many types of sensors, which vary in shape and size. They may be general-purpose or built for a specific task, and each produces its own kind of data. Examples include light sensors, heart-rate sensors, flow and level sensors, tilt sensors, color sensors, IR sensors, accelerometers, proximity sensors, smoke, gas, and alcohol sensors, and ultrasonic sensors. These sensors collect different types of data and pass them to their processors for their own uses.
Sensors are used with different types of devices for the specific tasks assigned to them. One of the best examples is the mobile phone: it is fitted with different types of sensors that control features such as screen brightness and photo depth, making the environment more interactive.
The world mainly uses 13 types of sensors. These can be used in different ways or with different configurations to get different kinds of work done, and each can differ in shape, size, and cost. They can be fitted to real-time systems such as vehicles, agriculture, medical equipment, or weather forecasting. A sensor can also be regarded as an input device that issues commands to a processor, and different types of processors can control these sensors. This project aims to design a system in which the user's hand gestures serve as input, are processed with the help of an Arduino, and trigger actions in operating system applications.
This project deals with a similar real-time use of the ultrasonic sensor. The sensor captures the change in hand position, from which the change in distance is calculated. This data is then used to trigger actions in the operating system. Many applications accept such inputs to perform tasks such as scroll up, scroll down, next, pause, and rewind.
2. Related Work
Godwin Ponraj et al. worked on sensor fusion of a Leap Motion controller and flex sensors using a Kalman filter for human finger tracking. Their system uses a wearable glove worn over the hand to provide gestures; it needs to be charged at regular intervals and requires high maintenance.
Patsakorn Rittitum et al. developed a digital scrum board using Leap Motion. It is a very costly system, consisting of a Leap Motion sensor and a wearable device.
Leigh Ellen Potter and others also worked on the Leap Motion sensor, in work titled “The Leap Motion controller: A view on sign language.” It is mainly designed to understand Australian signs, is very costly, and offers limited accuracy.
Lucy Dodakian designed free-hand interaction with a Leap Motion controller for stroke rehabilitation. The system is very expensive and consists of a Leap Motion sensor and a wearable device.
Meenakshi Panwar designed hand gesture recognition based on shape parameters. It uses infrared sensors to detect gestures, but recognition was slow and not very accurate.
The previously existing systems are hand gesture control systems [1, 2] using Leap Motion devices. These systems are very costly, use various other sensors, and are not affordable for common people; one was designed specifically for Australian signs. Hand gesture recognition based on shape parameters detects only sharp gestures. It follows all the basic principles of gesture detection, but it relies on an infrared sensor to detect movement and its detection is not accurate.
A gesture can be an expression of physical behavior or emotion, and it includes body motion and hand signals. Gestures fall into two categories: static gestures [3, 4] and dynamic gestures [5, 6]. For the former, the pose of the body or the hand signifies a sign; for the latter, the movement of the body or the hand conveys a message. Gestures can be used as a means of communication between a computer and a human [7, 8]. This differs significantly from conventional equipment-based methods and accomplishes human–computer interaction through gesture recognition, which determines the user's intention by recognizing the posture or movement of the body or body parts. Over the past decades, many researchers have strived to advance hand gesture recognition technology. Hand gesture recognition has great value in many applications, such as sign language recognition [9, 10], augmented reality (virtual reality) [11, 12], sign language interpreters for the disabled, and robot control.
3. Proposed Work
This article proposes a simple Arduino-based hand gesture control system that lets the user control various functions of a computer using hand signals. Instead of a keyboard, mouse, or joystick, hand motion can be used to direct a few computer tasks such as playing/pausing a clip, moving left/right in a picture slide show, and scrolling up/down on a Web page. It is cheaper than the existing systems because it requires only two ultrasonic sensors and an Arduino board, which are available at very low cost.
As shown in Fig. 1, the user's different hand gestures are given as input to the sensors fixed on the computer. The ultrasonic sensors receive the inputs and convert them into the different operations to be performed on the computer.
Fig. 1
Block diagram
4. Background
4.1. Arduino Uno
The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital input/output pins, six analog input pins, a USB connection, a power jack, and a reset button (Fig. 2).
Fig. 2
Arduino UNO board
4.2. Ultrasonic Sensor
Ultrasonic sensors measure distance using ultrasonic waves: the transmitter head emits an ultrasonic wave, and the receiver absorbs the wave reflected back from the target. The distance is computed from the time elapsed between emission and reception (Figs. 3 and 4).
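The time-of-flight relation above can be sketched in a few lines of Python. This is an illustrative calculation, not code from the chapter: it assumes an HC-SR04-style sensor that reports the round-trip echo time in microseconds, and a speed of sound of roughly 343 m/s at room temperature.

```python
# Speed of sound at ~20 °C, expressed in cm per microsecond (343 m/s).
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_distance_cm(echo_duration_us: float) -> float:
    """Convert a round-trip echo time (microseconds) to distance in cm.

    The wave travels to the target and back, so the one-way distance
    is half of the total path covered during the echo duration.
    """
    return (echo_duration_us * SPEED_OF_SOUND_CM_PER_US) / 2

# Example: an echo of about 1166 us corresponds to roughly 20 cm.
```

On the Arduino side, the equivalent computation would typically use the duration returned by `pulseIn()` on the sensor's echo pin.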
Fig. 3
Ultrasonic sensor
Fig. 4
Signal transmission in ultrasonic sensor
4.3. PyAutoGUI
PyAutoGUI is used to control hardware such as the mouse and keyboard from a program. It is a cross-platform Python module for GUI automation, designed to be very simple and shipped with sensible defaults. It can simulate various functions of a keyboard and a mouse, including pressing hotkey combinations.
4.4. PySerial
pySerial encapsulates access to the serial ports of microcontrollers and works as a backend for Python. Serial port settings can be accessed through Python properties. All the files are written purely in Python.
5. Methodology
The principle behind “hand gesture recognition using Arduino” is an Arduino-based hand gesture control system that is easy to implement, understand, and process. Two ultrasonic sensors and the Arduino Uno are the main components used to implement the project and obtain the desired output.
This paper introduces a technique based on distances determined by the ultrasonic sensors: specific distances trigger particular functions. Several gestures are proposed in this technique; the sensors recognize the actions, and functions are then performed according to the instructions given by the user. A few mainstream methods based on hand action recognition by ultrasonic sensors are used here.
The circuit design is very simple, but the setup of the sensors is important: the two ultrasonic sensors are positioned at either end of the computer screen. The distance information is collected by the Arduino UNO and passed to a Python program, where the PyAutoGUI library converts the data into click actions. The hand gesture recognition system can be used for many different functions, such as video control, music player control, Microsoft application control, and Chrome handling. All these interactions are real-time gesture recognition: the distance of the hand measured by the ultrasonic sensors acts as the input, and depending on the respective distances, particular functions are performed on the system (Fig. 5).
Fig. 5
Work flow process
5.1. Input
This first and most important step is the process of providing hand gestures to the receiver. The receiver here refers to the two ultrasonic sensors attached side by side at the top of the screen. Various gestures can be provided for the automation, for example:
• Gesture 1: By keeping both the hands at a distance away and in front of the
sensor, the video can be played/paused.
• Gesture 2: Video can be forwarded one step by keeping the hand at a particular
far distance from the right sensor.
• Gesture 3: Video can be rewound one step by keeping the hand at a particular
far distance from the left sensor.
• Gesture 4: Video can be fast-forwarded by moving the hand from a particular
near distance towards the right sensor and vice versa can cause rewind of the
video.
• Gesture 5: Volume can be raised by moving the hand from a particular near
distance toward the left sensor and vice versa can cause a reduction in the
volume of the video.
• Gesture 6: Hand swipe in front of the right sensor can cause moving to the next
tab.
• Gesture 7: Hand swipe in front of the left sensor can cause moving to the
previous tab.
• Gesture 8: Hand swipe from the left sensor to right can change between tasks.
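The distance-based gestures above can be sketched as a small classifier. The function, its command names, and the `near`/`far` thresholds below are all illustrative assumptions (the chapter does not give concrete distance values); only the mapping logic for Gestures 1–3 is shown, and the thresholds would be tuned for the actual sensor placement.

```python
def classify_gesture(left_cm: float, right_cm: float,
                     near: float = 15.0, far: float = 35.0) -> str:
    """Map a pair of ultrasonic distance readings to a command token.

    A hand is considered "present" at a sensor when its distance falls
    inside the [near, far] band; readings outside the band are ignored.
    Thresholds are hypothetical and must be calibrated per setup.
    """
    at_left = near <= left_cm <= far
    at_right = near <= right_cm <= far
    if at_left and at_right:
        return "play_pause"   # Gesture 1: both hands in front of the sensors
    if at_right:
        return "forward"      # Gesture 2: hand at the right sensor
    if at_left:
        return "rewind"       # Gesture 3: hand at the left sensor
    return "idle"             # no hand in the active band
```

Dynamic gestures such as swipes (Gestures 6–8) would additionally compare consecutive readings over time rather than a single pair of distances.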
5.2. Reception of Input
Input is received by the two ultrasonic sensors in the form of distance values, measured in centimeters. The sensors emit ultrasonic waves, which are reflected by the hands held in front of the sensors during a gesture. The reflected waves are received by the sensors, and the resulting distance values are passed to the Arduino.
5.3. Functions of Arduino
The distance values collected by the sensors are compared against the ranges assigned to the respective operations. The matching operation is then sent over the serial port and collected as input by the Python program through pySerial.
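The Arduino-to-Python hand-off described here can be sketched as a line protocol. The protocol (one ASCII command name per line) and the command vocabulary are assumptions for illustration; the chapter does not specify the wire format. With pySerial, the raw line would come from something like `serial.Serial("COM3", 9600, timeout=1).readline()`, so only the hardware-free parsing step is shown.

```python
# Hypothetical command vocabulary the Arduino might emit, one per line.
VALID_COMMANDS = {
    "play_pause", "forward", "rewind", "volume_up",
    "volume_down", "next_tab", "prev_tab", "switch_task",
}

def parse_command(raw: bytes):
    """Decode one serial line into a command token, or None if invalid.

    Tolerates trailing CR/LF and mixed case; silently drops bytes that
    are not ASCII, which guards against serial noise.
    """
    token = raw.decode("ascii", errors="ignore").strip().lower()
    return token if token in VALID_COMMANDS else None
```

Validating against a fixed vocabulary means a corrupted serial line is ignored rather than triggering an unintended keystroke.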
5.4. PyAutoGUI Function and Output
The input to the Python program is collected through pySerial, and the specific operation is performed. These operations, such as controlling the keyboard or the mouse cursor, are carried out with the help of the PyAutoGUI module. According to the operation sent by the Arduino through pySerial, a particular hotkey combination of the keyboard can also be automated. The output is obtained in real time on the computer screen.
These outputs can be as following:
• Tab switch in a Web browser
• Web page scrolling
• Task switching
• Changing the slides of the presentation
• Play/pause the video
• Change in volume
• Video forward
• Video rewind
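The outputs listed above reduce to a dispatch table from command tokens to keyboard shortcuts. The key choices below (VLC- and browser-style defaults) are assumptions, not taken from the chapter; with PyAutoGUI, the actual dispatch would be `pyautogui.hotkey(*keys_for(cmd))`, and only the hardware-free mapping is shown.

```python
# Hypothetical command -> key-combination table (media player / browser
# defaults assumed; adjust per target application).
HOTKEYS = {
    "play_pause":  ("space",),
    "forward":     ("right",),
    "rewind":      ("left",),
    "volume_up":   ("up",),
    "volume_down": ("down",),
    "next_tab":    ("ctrl", "tab"),
    "prev_tab":    ("ctrl", "shift", "tab"),
    "switch_task": ("alt", "tab"),
}

def keys_for(command: str):
    """Return the key combination for a recognized command, else None."""
    return HOTKEYS.get(command)
```

Keeping the mapping in data rather than code makes it easy to re-bind gestures to the user's preferred shortcuts, which matches the chapter's claim that gestures can be changed for the user's comfort.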
6. Results
6.1. Controlling the Video
There are various possible operations that can be performed through hand gestures using the Arduino. Some of them are shown in the following results.
6.2. Controlling Web Browser
There are various operations that can be controlled by gestures. Some of them are performed, and the results are displayed below.
7. Conclusion
The whole idea of this technology is to make life easier. These types of gestures can be used to perform basic operations on computer applications. It is an open-source project in which the gestures can be changed according to the user's comfort. Hand gesture control can also be used in virtual reality applications; augmented reality is booming nowadays, and hand gestures can play an important role in it. Reading sign language is another possible application of hand gestures.
References
1. Rautaray SS, Agrawal A (2015) Vision based hand gesture recognition for human computer interaction: a survey. Artif Intell Rev 43(1):1–54
2. Jinda-Apiraksa A, Pongstiensak W, Kondo T (2010) A simple shape-based approach to hand gesture recognition. In: ECTI-CON 2010: the 2010 ECTI International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology. IEEE
3. Jinda-Apiraksa A, Pongstiensak W, Kondo T (2010) Shape-based finger pattern
recognition using compactness and radial distance. In: The 3rd International
Conference on Embedded Systems and Intelligent Technology (ICESIT 2010),
Chiang Mai, Thailand, February
4. Rokade R, Doye D, Kokare M (2009) Hand gesture recognition by thinning
method. In: 2009 International Conference On Digital Image Processing. IEEE,
pp 284–287, March
5. Gowri S, Divya G (2015) Automation of garden tools monitored using mobile application. In: International conference on innovation information in computing technologies. IEEE, pp 1–6, February
6. Li G, Tang H, Sun Y, Kong J, Guozhang Jiang Du, Jiang BT, Shuang Xu, Liu H
(2019) Hand gesture recognition based on convolution neural network. Cluster
Comput 22(2):2719–2729
7. Li C, Xie C, Zhang B, Chen C, Han J (2018) Deep Fisher discriminant learning
for mobile hand gesture recognition. Pattern Recogn 77:276–288
8. Vimali JS, Srinivasulu S, Gowri S (2019) IoT based bank security system. Int J
Recent Technol Eng 2324–2327
9. Vinoth Krishnan V, Sai Anand M, Vimali JS (2016) Response mechanism for
crisis times from social networks. Int J Pharm Technol (IJPT) 8(2):11958–11966
10. Balamurali R, Vimali JS (2015) Certificate and message authentication acceleration in VANET. In: Proceedings 2015—IEEE International Conference on Innovation, Information in Computing Technologies, ICIICT 2015
11. Duraipandian M, Vinothkanna MR (2019) Cloud based Internet of Things for
smart connected objects. J ISMAC 1(02):111–119
12. Raj JS, Ananthi JV (2019) Automation using IoT in greenhouse environment. J Inf Technol 1(01):38–47