
2024 International Conference on Computing, Power, and Communication Technologies (IC2PCT)

Gesture-Driven Virtual Mouse: Empowering Accessibility through Hand Movements


Anitha Julian, Danush Suresh, Ramyadevi R
Department of Computer Science and Engineering, Saveetha Engineering College, Chennai, India

Abstract— The mouse has been a vital tool for human-computer interaction, with wired, wireless, and Bluetooth variants requiring power or a dongle connected to a PC. The proposed work uses current computer vision and machine learning techniques to recognize hand movements without any extra equipment. Built on a MediaPipe implementation compatible with CNN models, it enables a range of actions through distinct hand movements, all on a single computer. Utilizing a camera as the input device, the system relies on Python and OpenCV. The camera's output is displayed on the system's screen, allowing users to fine-tune their interactions.

Keywords— Virtual mouse, interaction, computer vision, gesture recognition, hand gestures.

I. INTRODUCTION

The modern world revolves around technology, and computer technology continues to advance globally. It enables tasks beyond human capabilities, essentially influencing human lives by handling functions that people cannot accomplish on their own. Human-computer interaction largely relies on input devices such as the mouse, which facilitates actions like pointing and scrolling within graphical user interfaces (GUIs). However, traditional hardware mice and touchpads are cumbersome and vulnerable to damage when carried around. Over time, technology transformed the conventional wired mouse into wireless versions for enhanced functionality and hassle-free mobility. Speech recognition also emerged, facilitating voice-based searches and translations, although it is slower for executing mouse functions. Subsequently, eye-tracking techniques entered the realm of human-computer interaction to control mouse cursors, albeit with accuracy challenges caused by wearables such as contact lenses or by long eyelashes.

Developers have experimented with different kinds of models for recognizing human gestures, often involving expensive gloves, sensors, and colored markers for fingertip positions. Amidst these innovations, AI plays a pivotal role across industries, expediting tasks and enhancing comfort. Efforts to tackle the existing issues involve leveraging the latest AI algorithms and tools.

One breakthrough technology in this domain is the hand-gesture-controlled virtual mouse, driven by AI. It allows users to control the computer mouse by moving their hands instead of using a traditional device. The system follows a camera-based approach, tracking the user's hand movements and translating them into mouse actions on the screen via computer vision methods and trained machine learning models designed to recognize particular hand movements such as pointing or swiping.

This technology offers multiple advantages, notably improving accessibility and delivering a more intuitive user experience. It is beneficial in scenarios where a physical mouse or touchpad is not feasible, revolutionizing interaction with computers by eliminating the need for tangible devices. Its applications span gaming, virtual reality, and enhanced accessibility for various user groups.

II. LITERATURE REVIEW

The authors of [1, 7] suggest virtual mouse control through hand gesture recognition; their experimental setup uses a low-cost web camera with high-definition recording capability installed in a fixed position. The work of [2] illustrates the concept of a virtual mouse by creating a mask from a red object, implemented in Python with various modules and functions. The research in [3] introduced hand gesture recognition for human-computer interaction, enhancing interaction through background extraction and contour detection and exploring multiple sensor modes. Additionally, they suggested a system using a blue-colored hand pad, a webcam, and Visual C++ 2008 with the OpenCV library.

The authors of [4] present the design of an intangible interface for mouse-less computer handling, utilizing a convex hull algorithm, Microsoft Visual Studio, and a red glove to detect the user's hand. The authors of [5] introduced cursor control using hand gestures, relying on camera-captured hand gestures with colored fingertips, implemented using MATLAB, the MATLAB Image Processing Toolbox, and the OpenCV library.

Research in [6, 9] suggests mouse cursor control based on hand gestures using an external camera and color strips attached to the fingers, implemented with C programming and the OpenCV library.


The authors of [7] illustrate vision-based computer mouse control, employing a vision-based system to enhance mouse activities via hand gestures, implemented in MATLAB. The technology built in [10] exhibits great promise for improving human-computer interaction in non-traditional situations, including industrial settings, space stations, undersea research, extreme weather, and distant locales.

III. METHODOLOGY

Google's MediaPipe framework simplifies building real-time computer vision applications across platforms. It provides pre-built tools for video and audio processing, object detection, pose estimation, hand tracking, and facial feature recognition. Some of the common hand landmark coordinates are displayed in Figure 1. Developers can create complex pipelines by combining algorithms and running them in real time on CPUs, GPUs, and accelerators such as Google's Edge TPU. MediaPipe supports TensorFlow, PyTorch, C, Java, and Python.
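The hand-tracking part of this pipeline can be reproduced with only a webcam. The following is a minimal Python sketch, not the code used in the paper: it opens the default camera with OpenCV, runs MediaPipe's hand-landmark solution on each frame, and draws the 21 landmarks of Figure 1. The confidence thresholds, camera index, and exit key are illustrative assumptions.

# Minimal sketch (not the paper's code): detect and draw hand landmarks with MediaPipe.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

# Thresholds and camera index are illustrative assumptions.
hands = mp_hands.Hands(max_num_hands=1,
                       min_detection_confidence=0.7,
                       min_tracking_confidence=0.7)

cap = cv2.VideoCapture(0)  # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                    # mirror for natural interaction
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    result = hands.process(rgb)
    if result.multi_hand_landmarks:
        for hand in result.multi_hand_landmarks:
            # Draw the 21 landmarks and their connections (see Figure 1).
            mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
    cv2.imshow("Hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:               # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()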
Key features include:
1. Video/audio processing: real-time stream processing.
2. Face recognition: detects and identifies facial features to recognize expressions and emotions and to enable AR experiences.
3. Hand movement tracking: monitors hand movements for gestures and virtual interaction.
4. Object detection: identifies and tracks objects in real time for AR, robotics, and surveillance.
5. Pose estimation: real-time estimation of human body poses for tracking fitness activities and enhancing augmented reality experiences.

MediaPipe also enables training and deploying ML models for tasks such as object identification and facial recognition. Overall, it empowers developers to build advanced real-time computer vision and machine learning applications with little effort.

Fig. 1. Hand Coordinates

OpenCV, a versatile computer vision and machine learning library, is a go-to tool for newcomers in this field. It manages tasks involving the processing of audio and video, including filtering, feature identification, object recognition, and tracking. Supporting Python, Java, and MATLAB, and primarily coded in C++, OpenCV finds application in self-driving cars, AR, robotics, and more. The library equips programmers with the tools needed to build advanced computer vision programs.

OpenCV's functionalities encompass:
1. Managing and preparing images/videos: easily loads and modifies images and videos from various sources.
2. Identifying and describing features: identifies edges, corners, and blobs for object identification or motion tracking.
3. Detecting and recognizing objects: effectively identifies and recognizes objects using diverse techniques.
4. Tracking: follows objects in video streams, estimating their movements over time using algorithms such as optical flow or mean-shift.
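Combining the first of these capabilities with the MediaPipe landmarks described above, a bounding box around the detected hand (as used in step 4 of the algorithm in the next section) can be drawn with a few lines of OpenCV. This helper is an illustrative assumption, not the authors' code, and the padding value is arbitrary.

# Illustrative helper (not the paper's code): draw a box around a detected hand
# using its normalized MediaPipe landmarks.
import cv2

def draw_hand_box(frame, hand_landmarks, pad=20):
    # `pad` is an assumed pixel margin around the hand.
    h, w, _ = frame.shape
    xs = [int(lm.x * w) for lm in hand_landmarks.landmark]
    ys = [int(lm.y * h) for lm in hand_landmarks.landmark]
    cv2.rectangle(frame,
                  (min(xs) - pad, min(ys) - pad),
                  (max(xs) + pad, max(ys) + pad),
                  (0, 255, 0), 2)
    return frame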
IV. PROPOSED ALGORITHM

The proposed algorithm follows these steps:
1. Begin the process.
2. Initialize the system and activate webcam video capture.
3. Capture frames using the webcam.
4. Employ OpenCV and MediaPipe to identify the hand and fingertips, and draw the landmarks and a box around the hand.
5. Create a box around the part of the screen where the mouse will be used.
6. Identify the raised fingers:
   • Neutral gesture: all five fingers up, no action; proceed.
   • Middle and index fingers up: move the cursor; return to step 2.
   • Index and middle fingers joined: perform a double click; return to step 2.
   • Both index and middle fingers down: perform a left click; return to step 2.
   • Middle finger down, index finger up: perform a right click; return to step 2.
   • Volume control: raise and lower the volume by joining the thumb and index fingers and moving them up and down.
7. Press the EXIT key to end the process.

This sequential algorithm shows how mouse control can be replicated with hand gestures. The aim is to seamlessly replace a traditional physical mouse with a virtual counterpart, converting hand gestures into mouse functions. A sketch of this control loop is given below.
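The loop in steps 3 through 7 can be sketched as follows. This is a hedged illustration rather than the authors' implementation: it assumes the pyautogui library for issuing cursor and click events (the paper does not name an automation library), detects raised fingers by comparing fingertip and knuckle landmark positions, and maps the index fingertip from the camera frame to screen coordinates. The double-click, drag-and-drop, and volume gestures, as well as click debouncing, are omitted for brevity.

# Hedged sketch of the gesture-to-mouse loop; pyautogui is an assumed choice.
import cv2
import mediapipe as mp
import numpy as np
import pyautogui

mp_hands = mp.solutions.hands
hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
screen_w, screen_h = pyautogui.size()
MARGIN = 100  # assumed margin of the active region inside the camera frame

def fingers_up(hand):
    # A finger counts as raised when its tip lies above its middle joint
    # (image y grows downward); the thumb test assumes a mirrored right hand.
    lm = hand.landmark
    thumb = lm[4].x < lm[3].x
    others = [lm[tip].y < lm[tip - 2].y for tip in (8, 12, 16, 20)]
    return [thumb] + others  # [thumb, index, middle, ring, pinky]

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)
    h, w, _ = frame.shape
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        hand = result.multi_hand_landmarks[0]
        thumb, index, middle, ring, pinky = fingers_up(hand)
        if index and middle and not ring and not pinky:
            # Cursor movement: map the index fingertip (landmark 8) to the screen.
            x = np.interp(hand.landmark[8].x * w, (MARGIN, w - MARGIN), (0, screen_w))
            y = np.interp(hand.landmark[8].y * h, (MARGIN, h - MARGIN), (0, screen_h))
            pyautogui.moveTo(x, y)
        elif index and not middle:
            pyautogui.click(button='right')   # middle finger down, index up
        elif not index and not middle:
            pyautogui.click(button='left')    # index and middle fingers down
    cv2.imshow("Virtual mouse", frame)
    if cv2.waitKey(1) & 0xFF == 27:           # EXIT key (Esc here)
        break
cap.release()
cv2.destroyAllWindows()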
V. RESULTS AND DISCUSSION

The culmination of these efforts is a gesture recognition system that translates hand movements into meaningful actions. The system showed commendable accuracy and responsiveness, making controlling devices a seamless and intuitive experience. The gestures used to control the mouse are shown in Figures 2 through 7.


Fig. 2. Gesture for Neutral

Figure 2 illustrates the gesture that does not perform any action.

Fig. 3. Gesture for moving cursor

Figure 3 shows the gesture used for moving the cursor around the screen.

Fig. 4. Gesture for double click

Figure 4 shows the gesture used to trigger the double-click action.

Fig. 5. Gesture for right click

The gesture used for triggering a right click is shown in Figure 5.

Fig. 6. Gesture for left click

Figure 6 shows the gesture used for triggering a left click.

Fig. 7. Gesture for selecting multiple items and for drag and drop


Figure 7 shows the gesture used for selecting multiple items and for performing the drag-and-drop operation.

TABLE I. ACCURACY OF THE VIRTUAL MOUSE
Defined mouse functions: 6
Count of tests for each function: 50
Total tests performed: 300

A virtual mouse operated through hand gestures is a groundbreaking solution for individuals with disabilities who encounter difficulties with conventional mouse or keyboard setups. This technology enables users to navigate computers without a traditional mouse, relying instead on defined hand movements.

The suggested virtual mouse could potentially outperform regular mice in terms of accuracy and speed, depending on its design. User acceptance hinges on ease of use, functionality, and overall user experience. If the virtual mouse proves intuitive, efficient, and user-friendly, it may gain widespread adoption; conversely, if it is hard to learn, malfunctions, or causes confusion, users may abandon it quickly. Table 1 and Figure 8 present an experiment assessing the accuracy of the virtual mouse.

Fig. 8. Experiment results

VI. CONCLUSION

The primary objective of the virtual mouse is to control mouse cursor actions through hand gestures, eliminating the need for a physical mouse. The system uses a webcam or an integrated camera to recognize and interpret hand gestures and fingertips, enabling the execution of various mouse functions based on these detections.

The evaluation of the model's performance indicates that the proposed virtual mouse exhibits superior accuracy compared to existing models. The neutral gesture, cursor movement, left click, and double click show 100% accuracy, whereas right click and drag-and-drop exhibit 80% accuracy. The proposed solution effectively addresses numerous limitations of current systems. This enhanced accuracy and functionality make the virtual mouse viable for practical applications, offering a more precise and versatile alternative to the conventional mouse through hand gesture control.
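The overall figure implied by Table 1 and the per-gesture rates above can be derived directly from the stated numbers; the short calculation below is only that derivation, not an additional experiment reported here.

# Overall accuracy implied by Table 1 and the stated per-gesture rates.
tests_per_function = 50
per_gesture_accuracy = {
    "neutral": 1.00, "cursor movement": 1.00, "left click": 1.00,
    "double click": 1.00, "right click": 0.80, "drag and drop": 0.80,
}
successes = sum(rate * tests_per_function for rate in per_gesture_accuracy.values())
total = tests_per_function * len(per_gesture_accuracy)   # 300 tests in total
print(f"overall accuracy = {successes:.0f}/{total} = {successes / total:.1%}")  # 280/300 = 93.3%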
REFERENCES

[1] G. N. Srinivas, S. Sanjay Pratap, V. S. Subrahmanyam, K. G. Nagapriya, and A. Venkata Srinivasa Rao, "Virtual Mouse Control Using Hand Gesture Recognition," International Research Journal of Engineering and Technology (IRJET), vol. 10, no. 2, 2023, pp. 307-320.
[2] K. H. Shibly, S. Kumar Dey, M. A. Islam, and S. Iftekhar Showrav, "Design and Development of Hand Gesture Based Virtual Mouse," 2019 1st International Conference on Advances in Science, Engineering and Robotics Technology (ICASERT), Dhaka, Bangladesh, 2019, pp. 1-5.
[3] I. Ralev and G. Krastev, "Components and model of implementation of a hand gesture recognition system," 2022 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), Ankara, Turkey, 2022, pp. 1-4.
[4] A. Pradhan and B. B. V. L. Deepak, "Design of Intangible Interface for Mouseless Computer Handling using Hand Gestures," Procedia Computer Science, vol. 79, 2016, pp. 287-292.
[5] T. Shukla et al., "Cursor Control Using Hand Gestures," International Journal for Multidisciplinary Research, vol. 5, no. 6, 2023, pp. 1-5.
[6] B. J. Boruah, A. K. Talukdar, and K. K. Sarma, "Development of a Learning-aid tool using Hand Gesture Based Human Computer Interaction System," 2021 Advanced Communication Technologies and Signal Processing (ACTS), Rourkela, India, 2021, pp. 1-5.
[7] S. Thakur, R. Mehra, and B. Prakash, "Vision based computer mouse control using hand gestures," International Conference on Soft Computing Techniques and Implementations (ICSCTI), 2015, pp. 85-89.
[8] M. Mishra, A. Gupta, V. Mehta, and R. Ranjan, "Virtual Mouse Input Control using Hand Gestures," 2023 7th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 2023, pp. 171-175.
[9] M. Ranawat, M. Rajadhyaksha, N. Lakhani, and R. Shankarmani, "Hand Gesture Recognition Based Virtual Mouse Events," 2021 2nd International Conference for Emerging Technology (INCET), Belagavi, India, 2021, pp. 1-4.
[10] N. S. TK and A. Karande, "Real-Time Virtual Mouse using Hand Gestures for Unconventional Environment," 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 2023, pp. 1-6.

