Cursor Movement Using Hand Gesture

The document discusses developing a gesture recognition system for controlling cursor movement. It covers background research, objectives, system design, hardware and software requirements, and algorithms for hand gesture recognition and control. The algorithm utilizes computer vision and machine learning to map hand landmarks to gestures and execute corresponding commands in real-time.


A SYNOPSIS ON

CURSOR MOVEMENT USING HAND GESTURE

Submitted in partial fulfilment of the requirement for the award of the degree of

BACHELOR OF COMPUTER APPLICATIONS

Submitted by:

ABHISHEK SINGH MANRAL University Roll No. 2102575

Under the Guidance of

Mr. JAI SHANKAR BHATT


Assistant Professor

Department of Computer Applications


Graphic Era (Deemed to be University)
Dehradun, Uttarakhand
APRIL-2024

CANDIDATE’S DECLARATION

I/we hereby certify that the work which is being presented in the Synopsis entitled “Cursor Movement Using
Hand Gesture” in partial fulfillment of the requirements for the award of the Degree of Bachelor of Computer
Applications in the Department of Computer Applications of the Graphic Era (Deemed to be University),
Dehradun shall be carried out by the undersigned under the supervision of Mr. Jai Shankar Bhatt, Assistant
Professor, Department of Computer Applications, Graphic Era (Deemed to be University), Dehradun.

Abhishek Singh Manral          University Roll No. 2102575          Signature:

Signature Signature
Supervisor Head of the Department

Internal Evaluation (By DPRC Committee)

Status of the Synopsis: Accepted / Rejected


Any Comments:

Name of the Committee Members: Signature with Date


1.
2.

Table of Contents

Chapter No.   Description

Chapter 1     Introduction and Problem Statement
Chapter 2     Background and Literature Survey
Chapter 3     Objective
Chapter 4     Designing the User Interface
Chapter 5     Algorithm for Hand Gesture Recognition and Control

Chapter 1
Introduction and Problem Statement
1.1 Introduction: Gesture recognition is a field of computer science and language technology that
aims to interpret human gestures via mathematical algorithms. This type of recognition involves
capturing and interpreting hand movements or facial expressions as commands. The significance of
gesture-based interfaces has grown with the advent of touchless technology, providing a more natural
way for humans to interact with machines.

1.2 Problem Statement: Despite the advancements in gesture recognition technology, there remain
significant challenges such as achieving high accuracy in various lighting conditions, handling
occlusions, and interpreting gestures in real-time. Current systems often fail to provide seamless
interaction, which hinders their widespread adoption. This project aims to address these issues by
developing a robust gesture recognition system specifically for controlling cursor movements.

1.3 Objectives: The primary objectives of this project are to:

• Develop a gesture recognition system that accurately interprets hand movements.


• Ensure real-time processing to provide immediate feedback and control.
• Create an intuitive user interface that can be easily used by individuals with minimal technical
knowledge.

Chapter 2
Background and Literature Survey

2.1 Historical Background: The journey of gesture recognition began in the early 1980s with
research focused on sign language interpretation. Over the decades, significant milestones include the
development of various algorithms for hand tracking, facial recognition, and full-body motion capture.

2.2 Related Work: Several studies and systems have been developed for gesture recognition. For
instance, Microsoft Kinect uses depth-sensing technology to capture body movements. Leap Motion
focuses on capturing hand and finger gestures using infrared sensors. A comparative analysis of these
systems reveals strengths such as high precision in controlled environments and weaknesses like
performance degradation under variable lighting.

2.3 Technological Advancements: Advancements in camera technology, such as depth sensors and high-resolution cameras, have greatly enhanced gesture recognition capabilities. Software advancements include the use of machine learning algorithms and neural networks to improve accuracy and adaptability.

2.4 Gaps in Existing Research: Despite these advancements, current systems struggle with issues
like high computational requirements, limited gesture vocabularies, and lack of robustness in diverse
environments. Addressing these gaps is crucial for developing a more reliable and user-friendly gesture
recognition system.

Chapter 3
Objective

3.1 Primary Objectives:


The primary objectives of this project are:
• To achieve high accuracy in gesture recognition irrespective of environmental conditions.
• To ensure the system processes gestures in real-time for immediate cursor control.
• To enhance user experience through intuitive and responsive interaction mechanisms.

3.2 Secondary Objectives:


Secondary objectives include:
• Developing a cost-effective solution that can be widely adopted.
• Ensuring the system is scalable and adaptable for various applications beyond cursor control.
• Simplifying the integration process with existing hardware and software systems.

Chapter 4
Designing the User Interface

4.1 Hardware Requirements:

The system requires a high-definition camera capable of capturing detailed hand movements. The processing
unit should have a multi-core CPU and a dedicated GPU for efficient handling of image processing tasks.
Additional peripherals might include infrared sensors to improve accuracy in low-light conditions.

4.2 Software Requirements:

The development environment will primarily use Python and C++ for programming. Essential libraries
include OpenCV for image processing, TensorFlow for machine learning, and PyAutoGUI for controlling
cursor movements. The system should be compatible with major operating systems like Windows, macOS,
and Linux.
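
As a quick check of this toolchain, the minimal Python sketch below (assuming the packages opencv-python, mediapipe, and pyautogui are installed) imports the core libraries and moves the cursor to the centre of the screen:

import cv2
import mediapipe as mp
import pyautogui

print("OpenCV version:", cv2.__version__)

# Query the screen size and move the cursor to its centre as a simple sanity check.
screen_width, screen_height = pyautogui.size()
pyautogui.moveTo(screen_width // 2, screen_height // 2, duration=0.25)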

4.3 System Configuration:

Minimum system configuration includes:

• A quad-core CPU
• 8GB of RAM
• A standard GPU

Recommended configuration includes:

• An octa-core CPU
• 16GB of RAM
• A high-end GPU with at least 4GB of VRAM

Chapter 5
Algorithm for Hand Gesture Recognition and Control

This chapter details the algorithm implemented in the hand gesture recognition and control system. The
provided code utilizes several libraries and custom classes to achieve this functionality. Below are the main
components and their roles:

Pseudocode
5.1 Imports and Initial Setup:
Import necessary libraries
Set pyautogui.FAILSAFE to False
Initialize MediaPipe hands and drawing utilities
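
A minimal Python sketch of this setup is shown below; the confidence thresholds and maximum hand count are illustrative assumptions rather than values taken from the original code.

import pyautogui
import mediapipe as mp

# Disable PyAutoGUI's corner fail-safe so gesture-driven cursor moves are never aborted.
pyautogui.FAILSAFE = False

# MediaPipe hand-tracking solution and its drawing helpers.
mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

hands = mp_hands.Hands(
    max_num_hands=2,             # track both hands so major/minor roles can be assigned
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
)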

5.2 Gesture Encodings and Handedness Labels:


Define Gest enumeration for gesture encodings
Define HLabel enumeration for handedness labels
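
The sketch below shows one way to express these encodings in Python; the member names and numeric values are illustrative assumptions (only the class names Gest and HLabel come from the pseudocode). Finger-pattern gestures use one bit per raised finger so combinations can be composed, while pinch and V gestures receive separate codes.

from enum import IntEnum

class Gest(IntEnum):
    # Bit-pattern gestures: one bit per raised finger.
    FIST = 0
    PINKY = 1
    RING = 2
    MID = 4
    INDEX = 8
    FIRST2 = 12      # index + middle raised
    LAST4 = 15       # all four non-thumb fingers raised
    THUMB = 16
    PALM = 31
    # Distance-based gestures that are not plain finger patterns.
    V_GEST = 33
    PINCH_MAJOR = 35
    PINCH_MINOR = 36

class HLabel(IntEnum):
    MINOR = 0        # non-dominant hand
    MAJOR = 1        # dominant hand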

5.3 HandRecog Class (Converts Landmarks to Gestures)

1. Initialization:
o Initialize attributes: finger, ori_gesture, prev_gesture, frame_count, hand_result, hand_label
2. update_hand_result:
o Update hand_result with provided landmarks
3. get_signed_dist:
o Calculate signed Euclidean distance between two points
4. get_dist:
o Calculate Euclidean distance between two points
5. get_dz:
o Calculate absolute difference on z-axis between two points


6. set_finger_state:
o Determine finger state by calculating distance ratios for each finger
o Update finger attribute based on the ratios
7. get_gesture:
o Determine current gesture based on finger state and distance calculations
o Handle noise by checking if the current gesture matches the previous gesture for a few frames
o Return the stabilized gesture
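
The following Python sketch illustrates the distance helpers, the finger-state bit pattern, and the frame-based debouncing described above. Landmark indices, thresholds, and the exact gesture-mapping rules are assumptions made for illustration, and the class relies on the Gest and HLabel encodings sketched earlier.

import math

class HandRecog:
    def __init__(self, hand_label):
        self.finger = 0
        self.ori_gesture = Gest.PALM
        self.prev_gesture = Gest.PALM
        self.frame_count = 0
        self.hand_result = None
        self.hand_label = hand_label

    def update_hand_result(self, hand_result):
        self.hand_result = hand_result

    def get_dist(self, point):
        # Euclidean distance between two landmark indices in normalized image coordinates.
        lm = self.hand_result.landmark
        return math.hypot(lm[point[0]].x - lm[point[1]].x,
                          lm[point[0]].y - lm[point[1]].y)

    def get_signed_dist(self, point):
        # Same distance, negated when the first landmark lies below the second,
        # so folded and extended fingers give ratios of opposite sign.
        lm = self.hand_result.landmark
        sign = 1 if lm[point[0]].y < lm[point[1]].y else -1
        return sign * self.get_dist(point)

    def get_dz(self, point):
        # Absolute depth difference between two landmarks.
        lm = self.hand_result.landmark
        return abs(lm[point[0]].z - lm[point[1]].z)

    def set_finger_state(self):
        # Compare tip-to-knuckle with knuckle-to-wrist distance for each non-thumb finger;
        # a large positive ratio marks the finger as raised and sets its bit.
        points = [[8, 5, 0], [12, 9, 0], [16, 13, 0], [20, 17, 0]]  # index, middle, ring, pinky
        self.finger = 0
        for point in points:
            dist_tip = self.get_signed_dist(point[0:2])
            dist_base = self.get_signed_dist(point[1:3]) or 0.01    # avoid division by zero
            self.finger <<= 1
            if dist_tip / dist_base > 0.5:       # threshold chosen for illustration
                self.finger |= 1

    def get_gesture(self):
        # A small thumb-to-index distance counts as a pinch; otherwise use the finger bits.
        if self.get_dist([4, 8]) < 0.05:
            current = Gest.PINCH_MAJOR if self.hand_label == HLabel.MAJOR else Gest.PINCH_MINOR
        elif self.finger == Gest.FIRST2:
            current = Gest.V_GEST                # index + middle raised: cursor-move pose
        else:
            try:
                current = Gest(self.finger)
            except ValueError:
                current = Gest.PALM
        # Debounce: only adopt a new gesture after it persists for a few frames.
        if current == self.prev_gesture:
            self.frame_count += 1
        else:
            self.frame_count = 0
        self.prev_gesture = current
        if self.frame_count > 4:
            self.ori_gesture = current
        return self.ori_gesture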

5.4 Controller Class (Executes Commands Based on Gestures):

1. Attributes:
o Initialize various flags and attributes for controlling gestures
2. getpinchylv:
o Calculate y-axis distance for pinch gesture
3. getpinchxlv:
o Calculate x-axis distance for pinch gesture
4. changesystembrightness:
o Adjust system brightness based on pinch gesture
5. changesystemvolume:
o Adjust system volume based on pinch gesture
6. scrollVertical:
o Perform vertical scroll based on pinch gesture
7. scrollHorizontal:
o Perform horizontal scroll based on pinch gesture
8. get_position:
o Calculate stabilized cursor position based on hand landmarks
9. pinch_control_init:
o Initialize attributes for pinch gesture control
10. pinch_control:
o Determine if the pinch gesture is horizontal or vertical
o Call the appropriate control function based on the direction
11. handle_controls:
o Execute commands based on the detected gesture
o Implement various gesture functionalities such as mouse movement, clicks, and pinch controls
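
Building on the classes sketched above, the following Python fragment shows how a Controller of this kind might stabilize the cursor position and dispatch a few of the listed commands through PyAutoGUI. The gesture-to-command mapping, the damping factor, and the landmark used for positioning are illustrative assumptions; brightness and volume control require additional platform-specific libraries and are omitted here.

import pyautogui

class Controller:
    prev_hand = None

    @staticmethod
    def get_position(hand_result):
        # Map a reference landmark (index 9, the middle-finger knuckle, is used here)
        # from normalized camera coordinates to screen coordinates, damped against the
        # previous position to reduce jitter.
        point = hand_result.landmark[9]
        screen_w, screen_h = pyautogui.size()
        x, y = int(point.x * screen_w), int(point.y * screen_h)
        if Controller.prev_hand is None:
            Controller.prev_hand = (x, y)
        px, py = Controller.prev_hand
        x = int(px + (x - px) * 0.7)             # damping factor chosen for illustration
        y = int(py + (y - py) * 0.7)
        Controller.prev_hand = (x, y)
        return x, y

    @staticmethod
    def scrollVertical():
        pyautogui.scroll(120)                    # positive values scroll up

    @staticmethod
    def scrollHorizontal():
        pyautogui.hscroll(120)                   # horizontal scroll is platform-dependent

    @staticmethod
    def handle_controls(gesture, hand_result):
        # Dispatch on the stabilized gesture; only part of the command set is sketched.
        if gesture in (Gest.PALM, Gest.LAST4):
            pyautogui.mouseUp()                  # open hand: release any drag, otherwise idle
        elif gesture == Gest.V_GEST:
            pyautogui.moveTo(*Controller.get_position(hand_result), duration=0.1)
        elif gesture == Gest.FIST:
            pyautogui.mouseDown()                # closed fist: press and hold for dragging
        elif gesture == Gest.MID:
            pyautogui.click()                    # left click
        elif gesture == Gest.INDEX:
            pyautogui.click(button='right')      # right click
        elif gesture == Gest.PINCH_MINOR:
            Controller.scrollVertical()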

5.5 GestureController Class (Main Entry Point for Gesture Control):

1. Attributes:
o Initialize camera, frame dimensions, and hand recognition objects
2. Initialization:
o Set gc_mode to 1
o Initialize camera and frame dimensions
3. classify_hands:
o Classify hands as left or right based on MediaPipe results
o Set major and minor hands based on dominant hand
4. start:
o Capture video frames
o Process frames to detect hands and landmarks
o Update hand results and finger states for major and minor hands
o Handle controls based on detected gestures
o Display the processed video with hand landmarks
o Break loop and release resources on exit
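
A compact Python sketch of this main loop, consistent with the classes sketched above, is shown below. It tracks a single (major) hand for brevity, whereas the described design also classifies and handles a minor hand; the camera index, key codes, and confidence values are assumptions.

import cv2
import mediapipe as mp

class GestureController:
    gc_mode = 0

    def __init__(self):
        GestureController.gc_mode = 1
        self.cap = cv2.VideoCapture(0)
        self.cam_width = self.cap.get(cv2.CAP_PROP_FRAME_WIDTH)
        self.cam_height = self.cap.get(cv2.CAP_PROP_FRAME_HEIGHT)

    def start(self):
        hr_major = HandRecog(HLabel.MAJOR)
        with mp.solutions.hands.Hands(max_num_hands=2,
                                      min_detection_confidence=0.5,
                                      min_tracking_confidence=0.5) as hands:
            while self.cap.isOpened() and GestureController.gc_mode:
                ok, frame = self.cap.read()
                if not ok:
                    break
                frame = cv2.flip(frame, 1)       # mirror so on-screen motion matches the hand
                results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                if results.multi_hand_landmarks:
                    # classify_hands would assign major/minor here; the first hand is used instead.
                    hand = results.multi_hand_landmarks[0]
                    hr_major.update_hand_result(hand)
                    hr_major.set_finger_state()
                    gesture = hr_major.get_gesture()
                    Controller.handle_controls(gesture, hr_major.hand_result)
                    mp.solutions.drawing_utils.draw_landmarks(
                        frame, hand, mp.solutions.hands.HAND_CONNECTIONS)
                cv2.imshow('Gesture Controller', frame)
                if cv2.waitKey(5) & 0xFF == 13:  # press Enter to exit
                    break
        self.cap.release()
        cv2.destroyAllWindows()

# Typical entry point:
# gc = GestureController()
# gc.start()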
