
Volume 7, Issue 5, May – 2022 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165

Volume Control using Gestures


Martendra Pratap Singh, Arzoo Poswal, Eshu Yadav

Abstract:- In this paper we develop a volume controller that uses hand gestures as the input to control the system. The OpenCV module is used in this implementation to process the gestures. The system uses a web camera to record or capture images/videos, and on the basis of this input the application controls the system volume. The main function is to increase and decrease the volume of the system. The project is implemented using Python and OpenCV.

We can use our hand gestures to control basic operations of a computer, such as increasing and decreasing the volume. People therefore do not have to learn machine-like skills, which are a burden most of the time. This type of hand gesture system provides a natural and innovative modern way of non-verbal communication. These systems have a wide area of application in human-computer interaction.

The purpose of this project is to discuss a volume control system based on detection of hand gestures. The system consists of a high-resolution camera to recognise the gesture given as input by the user. The main goal of hand gesture recognition is to create a system that can identify human hand gestures and use them as information for controlling a device: with real-time gesture recognition, a specific user can control a computer by making hand gestures in front of a video camera linked to the computer. In this project we develop a hand gesture volume controller system with the help of OpenCV and Python, so that the system can be controlled by hand gestures without using the keyboard and mouse.

Keywords:- Hand gesture, OpenCV-Python, volume controller, mediapipe package, numpy package, Human Computer Interface.

I. INTRODUCTION

Hand gestures are a powerful communication medium for Human Computer Interaction (HCI). Several input devices are available for interaction with a computer, such as keyboard, mouse, joystick and touch screen, but these devices do not provide an easy way to communicate. The proposed system consists of a desktop or laptop interface; to record hand gestures, users can wear data gloves or use a web camera or separate cameras.

The first and most important step toward any hand gesture recognition system is to implement hand tracking. Sensor devices are generally used in Data-Glove based methods for digitizing hand and finger motions into multi-parametric data; additional sensors collect hand configuration and hand movements. The vision-based method requires only a web camera, so that one can realize natural interaction between humans and computers without using any other devices.

The challenging part in these systems is the background of the images or videos captured while taking the input, i.e. the user's hand gesture; lighting also sometimes affects the quality of the input, which creates problems in recognizing the gestures. The process of finding a connected region within the image that shares some property, such as color, intensity or a relationship between pixels (i.e. a pattern), is termed segmentation. We have used some important packages, including opencv-python, tensorflow, numpy, mediapipe, imutils and scipy.

II. EXISTING SYSTEM

Gesture Recognition using Accelerometer - The author introduced an ANN application for classification and gesture recognition. A Wii remote, which rotates in the X, Y and Z directions, is used in this system. To decrease the cost and memory footprint of the system, the author implemented it in two levels. In the first level the user is authenticated for gesture recognition; the recognition method used is accelerometer-based.

In the second level of the system, the signals are processed for gesture recognition using fuzzy automata. The data is then normalized using k-means and the Fast Fourier Transform algorithm. With this, the recognition accuracy increases up to 95%.

Hand Gesture Recognition using Hidden Markov Models - In this paper the author built a system to recognize the numbers 0-9 from dynamic hand gestures, using two steps: preprocessing in the initial stage and classification in the second. There are basically two types of gestures, key gestures and link gestures; key gestures are used for spotting, and link gestures are used in continuous gestures. A Discrete Hidden Markov Model (DHMM), trained with the Baum-Welch algorithm, is used for classification. Average recognition rates using HMM range from 93.84% to 97.34%.

Robust Part-Based Hand Gesture Recognition Using Kinect Sensor - Here the author used low-cost cameras in order to make things affordable for users. A Kinect sensor has lower resolution than other cameras but can detect and capture large images and objects. To deal with noisy hand shapes, only the fingers, not the whole hand, are matched with FEMD. The system works efficiently in uncontrolled environments, and an accuracy of 93.2% was achieved in the experimental results.
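The segmentation step described in the introduction (finding a connected region by a shared property such as intensity) can be illustrated with a simple threshold mask. This is a minimal sketch, not the paper's code: the function name `threshold_mask`, the toy image, and the threshold value 120 are all illustrative.

```python
import numpy as np

def threshold_mask(gray, lo=120, hi=255):
    # Keep pixels whose intensity falls in [lo, hi]; everything else is background.
    return (gray >= lo) & (gray <= hi)

# A toy 4x4 "image": a bright 2x2 region (the "hand") on a dark background.
img = np.array([[10,  10,  10, 10],
                [10, 200, 210, 10],
                [10, 205, 220, 10],
                [10,  10,  10, 10]], dtype=np.uint8)

mask = threshold_mask(img)
print(int(mask.sum()))  # number of foreground pixels
```

In a real pipeline the mask would then be grouped into connected components to isolate the hand region.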

IJISRT22MAY250 www.ijisrt.com 203


III. RELATED WORK

In the vision community, hand gestures are an active area of research for sign language recognition and human-computer interaction. We have used several algorithms and modules to detect the gestures of a person, and these gestures are taken as input to the system. Several modules, such as opencv-python, mediapipe and numpy, are used for the purpose of tracking the gestures.

After capturing the input from the user, the image is passed to the hand tracking system to check the dimensions and shape of the gesture received by the system.

The hand tracking module plays an important role in identifying the input recorded in the system; after that, classification and segmentation are used to classify the gestures. Machine learning and deep learning are also used to learn from the training data according to the requirements of the system. The gestures are then identified from the trained data, recognized, and used to drive system functions such as increasing and decreasing the volume.

IV. SYSTEM ARCHITECTURE AND METHODOLOGY

In this project we use Python; the code is written and designed in Python using the OpenCV and NumPy modules. We first import the libraries needed for further processing of the input and output: OpenCV, mediapipe, math, ctypes, pycaw and numpy. We get video input from the primary camera.

MediaPipe is used to detect the hand in the video input from the camera, using the mpHands.Hands module to detect the gesture. Then, in order to access the speaker, we use pycaw, which provides the range of the volume from the minimum to the maximum.

The next step is to convert the input image to an RGB image to complete the processing of the captured input, and then to locate the landmark points of the thumb and fingers in the input.

The volume range is computed from the hand range; NumPy is used to perform this conversion and produce the required output. The NumPy package is the fundamental package for computing in Python. It consists of several things like:
 a powerful N-dimensional array object
 broadcasting functions
 tools for integrating C/C++ code
 Fourier transform and random number capabilities
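The steps above (capture video, detect hand landmarks with MediaPipe, measure the thumb-index distance, map it onto the pycaw volume range with NumPy) can be sketched as follows. This is a minimal sketch, not the paper's exact code: `distance_to_volume`, `fingertip_distance`, and the 30-250 pixel hand range are illustrative names and values; the pycaw calls assume a Windows machine, and camera index 0 assumes the primary webcam. The hardware-dependent imports are kept inside `main()` so the numeric mapping can be used on its own.

```python
import math
import numpy as np

def fingertip_distance(p1, p2):
    # Euclidean distance between two (x, y) pixel coordinates.
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def distance_to_volume(dist, vol_min, vol_max, hand_min=30.0, hand_max=250.0):
    # Linearly map the thumb-index distance (pixels) onto the speaker's dB range;
    # np.interp clamps values outside [hand_min, hand_max] to the endpoints.
    return float(np.interp(dist, [hand_min, hand_max], [vol_min, vol_max]))

def main():
    # Platform-specific imports: cv2/mediapipe need a camera, pycaw is Windows-only.
    import cv2
    import mediapipe as mp
    from ctypes import cast, POINTER
    from comtypes import CLSCTX_ALL
    from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

    # Access the speaker endpoint through pycaw and read its volume range.
    devices = AudioUtilities.GetSpeakers()
    interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
    volume = cast(interface, POINTER(IAudioEndpointVolume))
    vol_min, vol_max, _ = volume.GetVolumeRange()

    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)  # primary camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB, OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            h, w, _ = frame.shape
            thumb = (lm[4].x * w, lm[4].y * h)   # landmark 4: thumb tip
            index = (lm[8].x * w, lm[8].y * h)   # landmark 8: index fingertip
            d = fingertip_distance(thumb, index)
            volume.SetMasterVolumeLevel(distance_to_volume(d, vol_min, vol_max), None)
        cv2.imshow("volume", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

A small hand distance thus drives the volume toward the minimum and a wide thumb-index spread toward the maximum.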

Fig. 1: System Architecture

Here we have built the hand gesture recognition system to produce better output. The webcam is enabled while the program executes; the gesture type used is static, so the system recognizes the shape of the hand and provides the required output. In this project the volume is controlled based on the shape of the hand: the system takes the input, captures and detects the object, and then performs hand gesture recognition.

A. OPEN CV
OpenCV is a Python library that tackles computer vision problems. It is used, for example, to detect faces, which is done using machine learning. It is a very important library, used in several projects to detect faces and process frames, and it supports several programming languages. It also performs object detection and motion detection, supports several operating systems, and can even be used to detect the faces of animals.
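One small OpenCV detail the pipeline relies on: OpenCV stores frames with channels in BGR order, while MediaPipe and most other tools expect RGB. The conversion is just a reversal of the channel axis, shown here with plain NumPy as a sketch of what `cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)` does (the helper name `bgr_to_rgb` is illustrative).

```python
import numpy as np

def bgr_to_rgb(frame):
    # Reverse the last (channel) axis: BGR becomes RGB.
    return frame[..., ::-1]

pixel = np.array([[[255, 0, 0]]], dtype=np.uint8)  # pure blue in BGR order
print(bgr_to_rgb(pixel)[0, 0].tolist())  # → [0, 0, 255]
```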



B. NUMPY
NumPy is a Python module whose name stands for Numerical Python. It is mostly written in C and is described as an extension module, which guarantees remarkable execution speed. NumPy is mostly used for performing calculations, using functions it provides such as multiply, divide and power.

C. IMAGE FILTERING - HISTOGRAM
A histogram is a type of graph that represents the distribution of pixel intensities in the image. We use the histogram to filter the images and convert them to RGB in order to process them in our system. Consequently, the intensity of a pixel lies in the range [0, 255].

D. MEDIAPIPE
MediaPipe is a module for processing video, audio and related data across platforms such as Android, iOS, the web and edge devices, and for several applied ML pipelines. Many kinds of functions can be performed with this module; we have used it in our project to recognize the hand gesture and detect the input from it. Among others, it provides:
 Face Detection
 Multi-hand Tracking
 Segmentation
 Object Detection and Tracking
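The intensity histogram described in Section C can be computed directly with NumPy: one bin per possible 8-bit intensity, covering the full [0, 255] range. The helper name `intensity_histogram` and the toy image are illustrative, not the paper's code.

```python
import numpy as np

def intensity_histogram(gray):
    # One count per possible 8-bit intensity value, 0..255.
    return np.bincount(gray.ravel(), minlength=256)

img = np.array([[0, 0, 128],
                [255, 128, 0]], dtype=np.uint8)
hist = intensity_histogram(img)
print(hist[0], hist[128], hist[255])  # → 3 2 1
```

The resulting 256-bin vector is what a histogram plot of the image would display, and it can be thresholded or equalized as a filtering step.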

V. RESULTS

Fig. 2

Fig. 3

Fig. 4




Fig. 5

VI. CONCLUSION

This project presents a program that allows the user to perform hand gestures as a convenient and easier way to control software. A gesture-based volume controller does not require any specific type of markers and can be operated in real life on simple personal computers with very low-cost cameras, as it does not need high-definition cameras to detect or record the hand gestures. Specifically, the system tracks the tip positions of the thumb and index finger of each hand. The main motive of this type of system is to automate things in order to make them easier to control, and we have used this system to make the computer easier to control with the help of these applications.

REFERENCES

[1.] ResearchGate; Google.
[2.] C. L. Nehaniv, K. J. Dautenhahn, M. Kubacki, M. Haegele, C. Parlitz,
[3.] R. Alami, "A methodological approach relating the classification of gesture to identification of human intent in the context of human-robot interaction", pp. 371-377, 2005.
[4.] M. Krueger, Artificial Reality II, Addison-Wesley, Reading (MA), 1991.
[5.] H. A. Jalab, "Static hand gesture recognition for human computer interaction", pp. 1-7, 2012.
[6.] J. C. Manresa, J. Varona, R. Mas, F. Perales, "Hand tracking and gesture recognition for human-computer interaction", 2005.
[7.] Intel Corp., "OpenCV Wiki," OpenCV Library [Online]. Available: http://opencv.willowgarage.com/wiki
[8.] Z. Zhang, Y. Wu, Y. Shan, S. Shafer, "Visual panel: Virtual mouse, keyboard and 3D controller with an ordinary piece of paper," in Proceedings of Perceptual User Interfaces, 2001.
[9.] W. T. Freeman and M. Roth, "Orientation histograms for hand gesture recognition," International Workshop on Automatic Face and Gesture Recognition, 1995, 12: 296-301.
[10.] G. R. S. Murthy, R. S. Jadon, "A Review of Vision Based Hand Gestures Recognition," International Journal of Information Technology and Knowledge Management, vol. 2(2), 2009.
[11.] Mokhtar M. Hasan, Pramod K. Mishra, "Brightness Factor Matching For Gesture Recognition System Using Scaled Normalization", 2011.

IJISRT22MAY250 www.ijisrt.com 206
