Smart Hand Gloves
BELAGAVI-590018
Project Report on
“IoT-Based Messaging Device for Deaf and Dumb People”
Submitted in partial fulfilment of the requirements for the award of the Degree of
Bachelor of Engineering in Electronics and Communication Engineering
THE OXFORD COLLEGE OF ENGINEERING
(NAAC Accredited)
BOMMANAHALLI, HOSUR ROAD, BENGALURU-560 068
(Affiliated to VTU and approved by AICTE)
CERTIFICATE
Certified that the Project work entitled “IoT-Based Messaging Device for Deaf
and Dumb People” carried out by BHUVANESHWARI V (1OX20EC011),
CHANDAN BV (1OX20EC012), DEVATI SWARUP (1OX20EC019), K NAVYA
MADHURI (1OX20EC027), bonafide students of The Oxford College of Engineering,
Bengaluru in partial fulfilment for the award of the Degree of Bachelor of Engineering in
Electronics and Communication Engineering of the Visvesvaraya Technological
University, Belagavi during the year 2022-2023. It is certified that all corrections/suggestions
indicated for Internal Assessment have been incorporated in the report deposited in the
departmental library. The Project report has been approved as it satisfies the academic
requirements in respect of Project work prescribed for the said Degree.
1.______________________ _______________________
2._______________________ _______________________
ACKNOWLEDGEMENT
A project is a task of great magnitude and cannot be accomplished by an individual alone.
We are therefore grateful to a number of individuals whose professional guidance, assistance,
and encouragement made this mini project a pleasant endeavour to undertake.
It gives us great pleasure in expressing our deep sense of gratitude to our respected Founder
Chairman Late. Sri S. Narasa Raju, and to our respected Chairman Sri S.N.V.L Narasimha
Raju, for having provided us with great infrastructure and well-furnished labs.
We take this opportunity to express our profound gratitude to our respected Principal Dr. N
KANNAN for his support.
We are grateful to the Head of the Department, Dr. Manju Devi, for her unfailing
encouragement and the suggestions given to us in the course of our mini project work. Guidance
and deadlines play a very important role in the successful completion of a mini project on time.
We also convey our gratitude to our internal project guide Dr. A Chrispin Jiji, for having
constantly guided and monitored the development of the mini project. Finally, a note of thanks
to the Department of Electronics and Communication, both teaching and non-teaching staff for
their co-operation extended to us.
We thank our parents for their constant support and encouragement. Last, but not the least, we
would like to thank our peers and friends.
DECLARATION
Bhuvaneshwari V (1OX20EC011)
Chandan BV (1OX20EC012)
Devati Swarup (1OX20EC019)
K Navya Madhuri (1OX20EC027)
Place: Bengaluru
Date:
ABSTRACT
Hand gestures play a crucial role in facilitating communication for individuals who are deaf
and dumb, enabling them to express their thoughts, emotions, and needs. This abstract
introduces a novel hand gesture recognition system designed to bridge the communication gap
between deaf and dumb individuals and the general population. The system aims to provide an
intuitive and efficient means of communication, empowering deaf and dumb individuals to
interact with others effectively. The system's flexibility and adaptability make it a valuable
tool for people with a wide range of communication challenges, providing them with greater
independence and social participation.
To achieve robust and accurate recognition, a machine learning model is trained using a large
dataset of labeled hand gesture samples. Various machine learning techniques, such as
convolutional neural networks (CNNs) or recurrent neural networks (RNNs), are employed to
learn the complex patterns and associations between hand gestures and their
corresponding meanings.
Through rigorous testing and evaluation, the hand gesture recognition system has shown
promising results, achieving high accuracy in recognizing a diverse range of hand gestures
commonly used in sign language. It has the potential to significantly improve the quality of life
for dumb and deaf individuals, empowering them to express themselves more efficiently and
interact with the world around them.
Overall, this abstract highlights the development of a hand gesture recognition system tailored
to the unique needs of the dumb and deaf community. The system's accurate recognition and
translation of hand gestures have the potential to bridge the communication gap, opening up
new avenues for effective interaction and inclusion of dumb and deaf individuals in society.
TABLE OF CONTENTS
1. Acknowledgement iii
2. Abstract v
3. Table of Contents vi
5. Chapter 1: Introduction
   1.3 Objectives 3
7. Chapter 3: Proposed Methodology
   3.1 Introduction 6
   3.2 Block diagram of the model 6-9
   4.2.4 Speaker 16
   4.2.5 LCD 16-17
   4.2.6 HC-05 17
   6.1 Conclusion 27
List of figures
Figure no. Figure name Page no.
1. Communication difficulties 2
4. Arduino UNO 11
6. Flex sensors 13
7. APR9600 module 14
8. 2x16 LCD 15
List of tables
Table number Table name Page no.
1. Various hand gestures 22-24
IoT-Based Messaging Device for Deaf and Dumb People
CHAPTER-1 INTRODUCTION
1.1 Overview
How often do we come across mute people communicating with hearing people?
Compared with communication between a blind person and a sighted person,
communication between a deaf person and a hearing person is a serious problem. Among
deaf people worldwide, sign language is the primary nonverbal form of communication.
Sign language has no common origin, which makes it difficult for hearing people to
understand and translate.
According to the World Health Organization, about 1 million people in the world are dumb
and 300 million are deaf. The power of communication can be either a blessing or a curse:
it helps to express thoughts and feelings. At times God denies human beings the ability to
listen and speak, and so-called normal people label them DEAF AND DUMB. They are
normal in all respects except that they cannot communicate like others, and this inability
always sets them apart in society. They use sign language as their only medium of
communication. Sign language uses both facial expressions and hand gestures to convey
the essence of what an individual is trying to express. Each country generally has its own
native sign language, and some have more than one. To achieve human-computer
interaction for disabled people, the human hand can serve as an input device, and various
approaches have been proposed for enabling hand gesture recognition.
With the massive influx of computers into society, human-computer interaction (HCI) has
become an increasingly important part of daily life. Current interaction devices such as the
keyboard, mouse, and pen are not sufficient for physically challenged people or for virtual
environments (VE), which introduce many new forms of representation and interaction.
Gesture, speech, and touch inputs are a few possible methods of meeting such users' needs
and solving this problem.
Dumb and deaf individuals face significant challenges in communicating with others due to
their inability to speak or hear. While sign language is commonly used as a visual means of
communication, it requires both parties to be proficient in it, limiting effective communication
in certain situations. Hand gestures, on the other hand, provide a more intuitive and universal
form of communication. However, the lack of standardized gestures and the difficulty in
accurately interpreting them often hinder effective communication.
By addressing these challenges, the proposed solution aims to empower dumb and deaf
individuals by providing them with a reliable and efficient means of communication through
hand gestures. This would enhance their ability to express themselves, engage in social
interactions, and access information, ultimately fostering greater inclusivity and
participation in society.
1.3 Objectives
1. To develop a portable and affordable speaking system that can be used by individuals
with communication difficulties in a variety of settings, including at home, at school, and
in the community.
2. To develop a wireless Bluetooth module with sensor-data transmission and wearable
technology that can enhance or restore an individual's ability to hear and communicate
effectively.
1.4 Motivation
Communication: Sign language allows deaf and dumb individuals to express themselves,
engage in conversations, and convey their thoughts and feelings to others. By using hand
gestures, they can communicate effectively without relying solely on spoken language or
written communication.
Inclusivity: Hand gestures and sign language help create a more inclusive society by breaking
down barriers between people with different abilities. By learning and using sign language,
individuals with normal hearing can communicate directly with deaf and dumb individuals,
fostering a sense of equality and understanding.
Independence: Hand gestures empower deaf and dumb individuals to become more
independent in their daily lives. They can communicate their needs, participate in social
interactions, and access information without relying on an interpreter or
written communication.
This project has the potential to significantly enhance the lives of many people, providing them
with the tools they need to connect with others and participate in society.
In [2], the motivation is to help speech-impaired communities by developing an electronic
speaking system. An Arduino is the main control unit of the system and was programmed
in such a way that configuration settings can readily be changed without changing the
entire program code.
In [3], the mute person goes through the complete sentence which he wants to communicate
to others. The sentence read by the person is then translated into speech that is audible and
understandable to everyone.
In [4], a database-driven hand gesture recognition scheme based on a skin color model and
a thresholding approach, together with effective template matching, is presented, which can
be used for human-robotics and similar applications. Initially, the hand region is segmented
by applying the skin color model in the YCbCr color space. In the next stage, thresholding
is applied to separate foreground and background. Finally, Principal Component Analysis
is used for template-based matching and recognition.
In [5], research is carried out to analyze and evaluate how the device can reduce the
difficulty of communication among people with hearing and speech disabilities, and to find
out the limitations of the device in comparison with other technologies and devices working
towards a similar objective. Since such users communicate with others only through hand
motions and expressions, an artificial speaking mouth was designed for dumb people,
which also helps other people understand impaired people.
In [6], communication between impaired people and normal people is generally achieved
through synthesized speech derived from sign language. Using a flex sensor and an Arduino
Mega 2560 microcontroller, the information is converted into voice commands so that an
impaired person can communicate with normal people [1].
3.1 Introduction
The system uses flex sensors to detect hand movements and convert them into
signals. These signals are then processed by an Arduino Uno microcontroller to recognize the
corresponding hand gesture. The recognized gesture is then mapped to a specific word or
phrase stored in the APR9600 voice recording module. The module plays the recorded sound
of the word or phrase through a speaker. Additionally, an LCD screen is used to display the
recognized gesture and the corresponding word or phrase. This system aims to provide a
versatile and easy-to-use communication solution for people with speech impairments or
communication difficulties.
The figure below shows the basic block diagram of the proposed model, in which the HC-05
Bluetooth module links the system to a mobile device.
Hardware Setup:
Arduino Uno: an Arduino Uno board is needed as the main microcontroller for this project.
Flex Sensors: suitable flex sensors that can measure the bending of the fingers (an
accelerometer or a dedicated gesture-recognition sensor could also serve as the gesture
sensor).
Resistors: resistors to create a voltage-divider circuit for each flex sensor.
Connecting Wires: wires to connect the flex sensors to the Arduino Uno.
Power Supply: a power supply for the Arduino board and the sensors.
Software Setup:
Ensure that the connections are secure and that there are no loose wires.
3.2.5 Speaker
The speaker is an essential component as it is used to play back the audio messages stored in
the APR9600 module. When a particular hand gesture is recognized by the system, the
corresponding audio message is played through the speaker, allowing individuals with
communication difficulties to convey their thoughts and needs effectively. The speaker also
allows for the system to be used in various environments, such as noisy environments, where
the audio output needs to be loud and clear. Therefore, the speaker plays a crucial role in
making the speaking system more effective and user-friendly.
3.2.6 LCD
The LCD is used to display the text converted from the hand gestures made by the user. The
system recognizes the hand gestures made by the user using flex sensors and converts them
into text using an algorithm. The converted text is then displayed on the LCD screen for the
user and the listener to see, making communication easier. The LCD screen can display
characters, numbers, and symbols, which makes it an ideal component to display the text output
from the system. Additionally, the LCD screen is easy to read and is low-power, which makes
it suitable for use in a portable and versatile speaking system.
3.2.7 HC-05
Communication Assistance: The hand gesture recognition system can be used to assist deaf
and mute individuals in communicating with others. By recognizing and interpreting their hand
gestures, the system can convert them into corresponding text or speech output, enabling
effective communication with hearing individuals.
Education and Learning: The methodology can be integrated into educational tools for deaf
and mute individuals. It can help them learn sign language more effectively by providing real-
time feedback on the accuracy of their gestures. This can be particularly useful for self-learning
or practicing sign language outside of formal classroom settings
The flow chart above shows the basic working flow of the proposed model.
• Initialization: When the system is turned on, it starts by initializing all the components,
such as the Arduino, flex sensors, speaker, and LCD.
• Calibration: The flex sensors are calibrated to map the analog output to the
corresponding finger movement. This step involves measuring the minimum and
maximum resistance values of the flex sensors to determine the range of motion for
each finger.
• The Arduino detects the analog signal from the flex sensor by using its built-in
analog-to-digital converter (ADC).
• Flex sensors are variable resistors that change their resistance based on the amount of
bending or flexing they experience. The resistance of the flex sensor is measured by
applying a small voltage across its two terminals and measuring the resulting voltage
drop using the ADC of the Arduino. The ADC converts the analog voltage signal from
the flex sensor into a digital value that can be processed by the Arduino. The digital
value represents the relative position or degree of bending of the flex sensor.
• Gesture Recognition: After calibration, the system is ready to recognize hand gestures.
The user wears the flex sensors on their fingers and moves their fingers in different
patterns to form various gestures. The flex sensors measure the resistance values of the
fingers, which are then converted into digital signals by the Arduino.
• Signal Processing: The digital signals are processed by the Arduino using signal
processing techniques to remove noise and improve the accuracy of the gesture
recognition.
• Gesture Mapping: The digital signals are then mapped to specific gestures based on the
pre-defined mapping table. The mapping table is a lookup table that maps the digital
signals to their corresponding hand gestures.
• Text-to-Speech Conversion: Once the hand gesture is recognized, the corresponding
text is displayed on the LCD screen, and the system converts the text into speech using
the APR9600 module.
• Audio Output: The speech is then output through the speaker, allowing the user to
communicate with others.
• Shutdown: Once the communication is complete, the system shuts down, and all the
components are turned off.
The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital
input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz crystal
oscillator, a USB connection, a power jack, an ICSP header, and a reset button. It contains
everything needed to support the microcontroller; simply connect it to a computer with a USB
cable, or power it with an AC-to-DC adapter or battery, to get started. "Uno" means one in
Italian and was chosen to mark the release of Arduino 1.0.
The Uno and version 1.0 are the reference versions of Arduino. The Uno is the latest in a
series of USB Arduino boards and the reference model for the Arduino platform.
The Arduino Uno can be powered via the USB connection or with an external power supply;
the power source is selected automatically. The board can operate on an external supply of
6 to 20 volts. If supplied with less than 7 V, however, the 5 V pin may supply less than five
volts and the board may be unstable. If more than 12 V is used, the voltage regulator may
overheat and damage the board. The recommended range is 7 to 12 volts.
Fig (5): Pin Configuration of Arduino Uno (ATmega328)
The power pins are as follows:
• VIN: The input voltage to the Arduino board when it’s using an external power source
(as opposed to 5 volts from the USB connection or other regulated power source). You
can supply voltage through this pin, or, if supplying voltage via the power jack, access
it through this pin.
• 5V: The regulated power supply used to power the microcontroller and other
components on the board. This can come either from VIN via an on-board regulator, or
be supplied by USB or another regulated 5V supply.
• 3V3: A 3.3-volt supply generated by the on-board regulator. Maximum current draw is
50mA.
• GND: Ground pin
The ATmega328 has 32 KB of flash memory for storing code (of which 0.5 KB is used by the
bootloader); it also has 2 KB of SRAM and 1 KB of EEPROM (which can be accessed using
the EEPROM library).
Each of the 14 digital pins on the Uno can be used as an input or output using the pinMode(),
digitalWrite(), and digitalRead() functions. They operate at 5 volts. Each pin can provide or
receive a maximum of 40 mA and has an internal pull-up resistor (disconnected by default) of
20-50 kΩ. In addition, some pins have specialized functions:
Serial: 0 (RX) and 1 (TX), used to receive (RX) and transmit (TX) TTL serial data. These pins
are connected to the corresponding pins of the ATmega8U2 USB-to-TTL serial chip.
External interrupts: 2 and 3. These pins can be configured to trigger an interrupt on a low
value, a rising or falling edge, or a change in value.
PWM: 3, 5, 6, 9, 10, and 11. These provide 8-bit PWM output with the analogWrite() function.
TYPES OF INPUT
1. Direct manner: directly connected to the UNO board.
2. Semi-manual manner: provided by a switch between 0 V and +5 V.
A flex sensor is a kind of sensor used to measure the amount of deflection or bending. The
sensor is made from materials such as plastic and carbon: a carbon surface is arranged on a
plastic strip, and when the strip is bent, the sensor's resistance changes. It is therefore also
called a bend sensor. Because its resistance varies in direct proportion to the amount of bend,
it can also be employed as a goniometer.
The pin configuration of the flex sensor is shown below. It is a two-terminal device, with
terminals P1 and P2. The sensor has no polarized terminals, unlike a diode or a capacitor,
meaning there are no positive and negative terminals. The voltage required to activate the
sensor ranges from 3.3 V to 5 V DC, which can be obtained from any type of interfacing.
4.2.3 APR9600
The APR9600 device offers true single-chip voice recording, non-volatile storage, and
playback capability for 40 to 60 seconds. The device supports both random and sequential
access of multiple messages. Sample rates are user-selectable, allowing designers to customize
their design for unique quality and storage time needs. Integrated output amplifier, microphone
amplifier, and AGC circuits greatly simplify system design. The device is ideal for use in
portable voice recorders, toys, and many other consumer and industrial applications.
4.2.4 Speaker
Speakers are one of the most common output devices used with computer systems. Some
speakers are designed to work specifically with computers, while others can be hooked up to
any type of sound system. Regardless of their design, the purpose of speakers is to produce
audio output that can be heard by the listener.
1. The 16x2 LCD is named so because it has 16 columns and 2 rows. Many combinations
are available, such as 8x1, 8x2, 10x2, 16x1, etc., but the most widely used one is the
16x2 LCD.
2. All the above-mentioned LCD displays have 16 pins, and the programming approach
is also the same. Above is the pin description of the 16x2 LCD module.
3.2.7 HC-05
The HC-05 is a popular module which can add two-way (full-duplex) wireless
functionality to your projects. You can use this module to communicate
between two microcontrollers like Arduino or communicate with any device
with Bluetooth functionality like a Phone or Laptop. There are many android
applications that are already available which makes this process a lot easier.
Power supply:
Output:
LCD connection:
• Connect the VSS pin of the LCD to GND pin of the Arduino.
• Connect the VDD pin of the LCD to 5V pin of the Arduino.
• The data pins of the LCD, i.e., D0-D7, are connected to pins A4, A5, 4, 5, 6, and 7
of the Arduino respectively.
APR9600 connection:
• Connect the VCC pin of the APR9600 to the 5V pin of the Arduino.
• Connect the GND pin of the APR9600 to the GND pin of the Arduino.
• Messages in the APR9600 module are stored on different pins; message pins M1-M7
are connected to pins 8, 9, 10, 11, 12, 2, and 3 of the Arduino.
Speaker connection:
• Connect the positive (+) terminal of the speaker to the SP+ pin of the APR9600 module.
• Connect the negative (-) terminal of the speaker to the SP- pin of the APR9600 module.
• Turn on Bluetooth on the device on which you want to see the output.
• Select the device HC-05 in the application.
• Pair with the device.
• Enter the code 1234 to complete the pairing.
Fig (11): APR voice module with REC/PLAY switch & MIC
Recording
Playing
• To play a voice message, the switch should always be in PLAY mode.
• To play any recorded message, its pin can be touched to the ground of the circuit.
• Once the wire touches ground, the voice message is heard through the speaker.
• The message pins are always kept high in the code. When the Arduino detects a hand
gesture, it pulls the matched gesture's message pin low (i.e., to ground), which acts as
the trigger, and the voice message is heard through the speaker. Once the message has
played, the Arduino pulls the pin high again.
4.3.1 Introduction
• The main code, also known as a sketch, created on the IDE platform will ultimately
generate a Hex File which is then transferred and uploaded in the controller on the
board.
• The IDE environment mainly contains two basic parts: the editor and the compiler,
where the former is used for writing the required code and the latter for compiling
and uploading the code to the given Arduino module.
• This environment supports both C and C++ languages.
1. START
2. Initialize the LiquidCrystal library for LCD display.
3. Define analog input pins fs1, fs2, fs3, and fs4 and output pins Voice1, Voice2, Voice3,
Voice4, Voice5, Voice6, and Voice7.
4. Set the output pins for Voice1-Voice7 to HIGH.
5. Clear the LCD and display a message for 2 seconds.
6. Clear the LCD and display another message for 2 seconds.
7. Enter a continuous loop.
• Read the analog value of fs1 and store it in variable a.
Display the value of a on the LCD. Delay
• Read the analog value of fs2 and store it in variable b.
Display the value of b on the LCD. Delay
• Read the analog value of fs3 and store it in variable c.
Display the value of c on the LCD. Delay
• Read the analog value of fs4 and store it in variable d.
Display the value of d on the LCD. Delay
• If a <= 100 and b >= 100 and c >= 100 and d >= 100, set Voice1 to LOW and
display "I NEED HELP!!" on the LCD for 2 seconds.
• If a >= 100 and b <= 100 and c >= 100 and d >= 100, set Voice2 to LOW and
display "I WANT WATER" on the LCD for 2 seconds.
• If a >= 100 and b >= 100 and c <= 100 and d >= 100, set Voice3 to LOW and
display "I WANT FOOD" on the LCD for 2 seconds.
• If a <= 100 and b <= 100 and c >= 100 and d >= 100, set Voice4 to LOW and
display "I AM NOT FEELING WELL" on the LCD for 2 seconds.
• If a >= 100 and b >= 100 and c <= 100 and d <= 100, set Voice5 to LOW and
display "I AM FINE" on the LCD for 2 seconds.
• If a <= 100 and b <= 100 and c <= 100 and d >= 100, set Voice6 to LOW and
display "I WANT TO GO TO TOILET" on the LCD for 2 seconds.
• If a <= 100 and b <= 100 and c <= 100 and d <= 100, set Voice7 to LOW and
display "THANK YOU" on the LCD for 2 seconds.
CHAPTER-5
RESULT ANALYSIS
Hand gestures
GESTURE         MESSAGE
Idle            -
1st gesture     "I NEED HELP"
2nd gesture     "I WANT WATER"
3rd gesture     "I WANT FOOD"
4th gesture     "I AM NOT FEELING WELL"
5th gesture     "I AM FINE"
6th gesture     "I WANT TO GO TO TOILET"
7th gesture     "THANK YOU"
CHAPTER-6
6.1 CONCLUSION
• The proposed versatile speaking system for anyone with communication difficulties
based on hand gestures has been successfully developed and tested in this project. The
system uses flex sensors to detect hand gestures and an Arduino Uno microcontroller
to process the data and generate corresponding speech signals. The APR9600 module
and speaker are used for sound output, while an LCD display provides visual feedback.
• The system has been tested with various hand gestures and has shown high accuracy in
recognizing the intended gestures and generating the corresponding speech output. This
system has wide applications in helping individuals with communication difficulties
due to speech impairment, physical disabilities, or other medical conditions.
• Field trials and user studies: Conducting field trials and user studies to evaluate the
effectiveness and user satisfaction of the system could provide valuable feedback for
further improvements.
REFERENCES
[1]. Hrushikesh R. Mundhe, Sahil D. Patil, Vedant J. Mhatre, Mithun Nair, and Jayesh Rane,
"Hand Gesture Analyzing and Speaking System for Mute People," International Journal of
Research in Engineering and Science (IJRES), vol. 10, no. 6, pp. 986-991, 2022.
[2]. L. Zhao et al., "Hand Gesture Recognition for Human-Computer Interaction: A Review of
Recent Advances," 2021.
[3]. R. K. Yadav et al., "Design and Implementation of a Hand Gesture Recognition System
using Machine Learning and Sensor Fusion Techniques," 2021.
[4]. S. Mishra and R. K. Singh, "Gesture Recognition using Machine Learning Techniques: A
Comprehensive Survey," 2020.
[5]. A. V. Mohan and S. D. Selvaraj, "Design and Development of an Assistive Hand Gesture
Recognition System for People with Disabilities," 2020.
[6]. Suri et al., "Development of Sign Language using Flex Sensors," International Conference
on Smart Electronics and Communication (ICOSEC), pp. 102-106, 2020.
[7]. R. K. Singh et al., "Hand Gesture Recognition for Sign Language Interpretation using a
3D Depth Sensor," 2020.
[8]. M. Yasen and S. Jusoh, "A Systematic Review on Hand Gesture Recognition Techniques,
Challenges and Applications," PeerJ Comput. Sci., vol. 5, p. e218, Sep. 2019.
[9]. X. Zhao et al., "Gesture Recognition for American Sign Language Using a Soft Wearable
Sensing Glove," 2019.