
VISVESVARAYA TECHNOLOGICAL UNIVERSITY

BELAGAVI-590018

Project Report on
“IoT-Based Messaging Device for Deaf, and Dumb People”

Submitted in the partial fulfilment of the requirements for the award of the Degree of
Bachelor of Engineering in Electronics and Communication Engineering

Submitted by:

Bhuvaneshwari V (1OX20EC011)
Chandan BV (1OX20EC012)
Devati Swarup (1OX20EC019)
K. Navya Madhuri (1OX20EC027)

Under the guidance of:

Dr. A Chrispin Jiji
Associate Professor
Dept of ECE, TOCE

Project carried out at

THE OXFORD COLLEGE OF ENGINEERING


(NAAC Accredited)
Bengaluru - 560 068

DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING


(NBA Accredited)

THE OXFORD COLLEGE OF ENGINEERING


BOMMANAHALLI, HOSUR ROAD, BENGALURU - 560 068
2022-23

THE OXFORD COLLEGE OF ENGINEERING
(NAAC Accredited)
BOMMANAHALLI, HOSUR ROAD, BENGALURU-560 068
(Affiliated to VTU and approved by AICTE)

Department of Electronics & Communication Engineering


(NBA Accredited)

CERTIFICATE
Certified that the Project work entitled “IoT-Based Messaging Device for Deaf,
and Dumb People” carried out by BHUVANESHWARI V (1OX20EC011),
CHANDAN BV (1OX20EC012), DEVATI SWARUP (1OX20EC019), K NAVYA
MADHURI (1OX20EC027), bonafide students of The Oxford College of Engineering,
Bengaluru in partial fulfilment for the award of the Degree of Bachelor of Engineering in
Electronics and Communication Engineering of the Visvesvaraya Technological
University, Belagavi during the year 2022-2023. It is certified that all corrections/suggestions
indicated for Internal Assessment have been incorporated in the report deposited in the
departmental library. The Project report has been approved as it satisfies the academic
requirements in respect of Project work prescribed for the said Degree.

Guide                      HOD                        Principal

Dr. A Chrispin Jiji        Dr. Manju Devi             Dr. N Kannan
Associate Professor        Professor and HOD          TOCE
Dept. of ECE, TOCE         Dept. of ECE, TOCE
External Viva

Signature with Date


Name of the Examiners

1.______________________ _______________________

2._______________________ _______________________
ACKNOWLEDGEMENT

A project is a task of great enormity, and it cannot be accomplished by an individual alone.
We are therefore grateful to a number of individuals whose professional guidance, assistance
and encouragement have made this mini project a pleasant endeavour to undertake.

It gives us great pleasure to express our deep sense of gratitude to our respected Founder
Chairman, the late Sri S. Narasa Raju, and to our respected Chairman, Sri S.N.V.L Narasimha
Raju, for having provided us with great infrastructure and well-furnished labs.

We take this opportunity to express our profound gratitude to our respected Principal Dr. N
KANNAN for his support.

We are grateful to the Head of the Department, Dr. Manju Devi, for her unfailing
encouragement and the suggestions given to us in the course of our mini project work.
Guidance and deadlines play a very important role in the successful completion of a project
on time.

We also convey our gratitude to our internal project guide, Dr. A Chrispin Jiji, for having
constantly guided and monitored the development of the mini project. Finally, a note of thanks
to the teaching and non-teaching staff of the Department of Electronics and Communication
for the co-operation they extended to us.

We thank our parents for their constant support and encouragement. Last, but not the least, we
would like to thank our peers and friends.

DECLARATION

We, BHUVANESHWARI V (1OX20EC011), CHANDAN BV (1OX20EC012),
DEVATI SWARUP (1OX20EC019), K NAVYA MADHURI (1OX20EC027), students of
6th semester B.E. in the Department of Electronics and Communication Engineering,
The Oxford College of Engineering, Bengaluru, declare that the Mini project work entitled
“IoT-Based Messaging Device for Deaf, and Dumb People” has been
carried out by us and submitted in partial fulfilment of the course requirements for the award
of degree in Bachelor of Engineering in Electronics and Communication Engineering
discipline of Visvesvaraya Technological University, Belagavi during the academic year
2022-23. Further, the matter embodied in the dissertation has not been submitted previously by
anybody for the award of any degree or diploma to any other university.

Bhuvaneshwari V (1OX20EC011)
Chandan BV (1OX20EC012)
Devati Swarup (1OX20EC019)
K Navya Madhuri (1OX20EC027)

Place: Bengaluru

Date:

ABSTRACT

Hand gestures play a crucial role in facilitating communication for individuals who are deaf
and dumb, enabling them to express their thoughts, emotions, and needs. This report
introduces a hand gesture recognition system designed to bridge the communication gap
between deaf and dumb individuals and the general population. The system aims to provide an
intuitive and efficient means of communication, empowering deaf and dumb individuals to
interact with others effectively. The system's flexibility and adaptability make it a valuable
tool for people with a wide range of communication challenges, providing them with greater
independence and social participation.

To achieve robust and accurate recognition, a machine learning model is trained using a large
dataset of labeled hand gesture samples. Various machine learning techniques, such as
convolutional neural networks (CNNs) or recurrent neural networks (RNNs), are employed to
learn the complex patterns and associations between hand gestures and their
corresponding meanings.

Through rigorous testing and evaluation, the hand gesture recognition system has shown
promising results, achieving high accuracy in recognizing a diverse range of hand gestures
commonly used in sign language. It has the potential to significantly improve the quality of life
for dumb and deaf individuals, empowering them to express themselves more efficiently and
interact with the world around them.

Overall, this report highlights the development of a hand gesture recognition system tailored
to the unique needs of the dumb and deaf community. The system's accurate recognition and
translation of hand gestures has the potential to bridge the communication gap, opening up
new avenues for effective interaction and the inclusion of dumb and deaf individuals in society.

TABLE OF CONTENTS

SL.NO TOPIC NAME PAGE NO

1. Acknowledgement iii

2. Abstract v

3. Table of Contents vi

4. List of Figures viii

5. List of Tables viii

6. Chapter 1: Introduction
   1.1 Overview 1-2
   1.2 Problem Statement 2
   1.3 Objectives 3
   1.4 Motivation 3-4
   1.5 Scope of the Project 4
   1.6 Organization of the Report 4

7. Chapter 2: Literature Review 5-6

8. Chapter 3: Proposed Methodology
   3.1 Introduction 6
   3.2 Block Diagram of the Model 6-9
   3.3 Applications of Proposed Methodology 9-10

9. Chapter 4: Implementation Details
   4.1 Flow Chart of the Proposed Model 10-12
   4.2 Hardware Components 12-20
       4.2.1 Arduino UNO 12-15
       4.2.2 Flex Sensors 15
       4.2.3 APR9600 16
       4.2.4 Speaker 16
       4.2.5 LCD 16-17
       4.2.6 HC05 17
       4.2.7 Hardware Connections 18-19
       4.2.8 Procedure to REC/PLAY Voice in APR Module 20
   4.3 Software Implementation
       4.3.1 Introduction 21-22
       4.3.2 Arduino Code Algorithm 22-23

10. Chapter 5: Result Analysis 24-26

11. Chapter 6: Conclusion and Future Scope
    6.1 Conclusion 27
    6.2 Future Scope 27-28

12. Chapter 7: References 28

List of figures
Figure no. Figure name Page no.

1. Communication difficulties 2

2. Block diagram of the model 6

3. Flow chart of Proposed model 9

4. Arduino UNO 11

5. Arduino Pin Configuration 12

6. Flex sensors 13

7. APR9600 module 14

8. 2x16 LCD 15

9. Hardware of the model 16

10. APR voice module with REC/PLAY switch & MIC 18

11. Arduino IDE 19

List of Tables
Table number Table name Page no.
1. Various hand gestures 22-24

IoT-Based Messaging Device for Deaf, and Dumb People

CHAPTER-1 INTRODUCTION

1.1 Overview

How frequently do we come across mute people communicating with hearing people?
Compared with communication between a blind person and a sighted person, communication
between a deaf person and a hearing person is a far more serious problem. Among the deaf
people of the world, sign language is the nonverbal form of intercommunication. Sign
language does not have a common origin, and hence it is difficult for hearing people to
understand and translate.

According to the World Health Organization, about 1 million people are dumb and 300 million
people are deaf in the world. The power of communication can be either a blessing or a curse:
it helps to express thoughts and feelings. Those who are deprived of the ability to listen and
speak are called deaf and dumb by so-called normal people. They are normal in all respects
except that they cannot communicate like others, and this inability always sets them apart in
society. They use sign language as their only medium of communication. Sign language uses
both facial expressions and hand gestures to convey the essence of what an individual is trying
to express. Each country generally has its own native sign language, and some have more than
one. To achieve human-computer interaction for disabled people, the human hand can serve as
an input device, and various approaches have been proposed for enabling hand gesture
recognition.

Communication is an essential human function that allows individuals to interact, express
themselves, and connect with others. However, for individuals with communication
difficulties, such as those with physical disabilities, speech impairments, or other limitations,
communication can be challenging, often resulting in social isolation and reduced quality of
life.

With the massive influx of computers into society, human-computer interaction (HCI) has
become an increasingly important part of daily life. Current interaction devices such as the
keyboard, mouse and pen are not sufficient for physically challenged people, nor for Virtual

Dept of ECE, TOCE 1



Environments (VE), which introduce many new types of representation and interaction.
Gesture, speech, and touch inputs are a few possible methods of meeting such users' needs
and solving this problem.

1.2 Problem statement


It is very difficult for mute people to convey their message to ordinary people. Since ordinary
people are not trained in hand sign language, communication becomes very difficult. In an
emergency, or when a mute person is travelling or among new people, communicating with
those nearby or conveying a message becomes very difficult.

Dumb and deaf individuals face significant challenges in communicating with others due to
their inability to speak or hear. While sign language is commonly used as a visual means of
communication, it requires both parties to be proficient in it, limiting effective communication
in certain situations. Hand gestures, on the other hand, provide a more intuitive and universal
form of communication. However, the lack of standardized gestures and the difficulty in
accurately interpreting them often hinder effective communication.

By addressing these challenges, the proposed solution aims to empower dumb and deaf
individuals by providing them with a reliable and efficient means of communication through
hand gestures. This would enhance their ability to express themselves, engage in social
interactions, and access information, ultimately fostering greater inclusivity and
participation in society.

Fig (1): Communication difficulties


1.3 Objectives

1. To develop a versatile speaking system, based on hand gestures, for anyone with
   communication difficulties.

2. To develop a portable and affordable speaking system that can be used by individuals
   with communication difficulties in a variety of settings, including at home, at school,
   and in the community.

3. To develop a system that helps the user communicate in his or her native language,
   i.e., the local language of the state; here we focus on Kannada.

4. To develop a wireless Bluetooth module with sensor data transmission and wearable
   technology that can enhance or restore an individual's ability to hear and communicate
   effectively.

1.4 Motivation
Communication: Sign language allows deaf and dumb individuals to express themselves,
engage in conversations, and convey their thoughts and feelings to others. By using hand
gestures, they can communicate effectively without relying solely on spoken language or
written communication.

Inclusivity: Hand gestures and sign language help create a more inclusive society by breaking
down barriers between people with different abilities. By learning and using sign language,
individuals with normal hearing can communicate directly with deaf and dumb individuals,
fostering a sense of equality and understanding.

Independence: Hand gestures empower deaf and dumb individuals to become more
independent in their daily lives. They can communicate their needs, participate in social
interactions, and access information without relying on an interpreter or
written communication.


This project has the potential to significantly enhance the lives of many people, providing them
with the tools they need to connect with others and participate in society.

1.5 Scope of the project

• The scope of “A Versatile Speaking System for Anyone with Communication Difficulties
Based on Hand Gestures” is to provide an innovative and universal solution for individuals
with communication limitations. The system aims to recognize a broad range of hand gestures
commonly used in sign language and translate them into spoken language using text-to-speech
technology.
• The system is based on flex sensors and a microcontroller that can accurately recognize
hand gestures in real-time. The system will be designed to be flexible and versatile,
allowing it to be used by individuals of all ages and abilities, including those with
physical disabilities, older adults, and patients who cannot speak due to illness or injury.
• The project will involve designing and developing the hardware, testing it for accuracy
and reliability, and refining the system based on user feedback. The ultimate goal is to
provide an accessible and reliable communication solution that enhances the quality of
life for individuals with communication difficulties.
• The system has potential applications in a variety of settings, including healthcare
facilities, schools, and public spaces, where individuals with communication limitations
may need to communicate with others. By developing a speaking system based on hand
gestures, we aim to provide an innovative and practical solution for individuals with
communication difficulties.

1.6 Organization of the report

The report is organized as follows:


• Chapter 1: Introduction gives the details of the project.
• Chapter 2: Presents the literature survey of the project.
• Chapter 3: Deals with the proposed methodology.
• Chapter 4: Gives the implementation details of the project.
• Chapter 5: Shows the experimental results of the project.
• Chapter 6: Gives the conclusion and future scope of the project.
• Chapter 7: Lists the references.


CHAPTER-2 LITERATURE REVIEW

In [1], gesturing is described as an instinctive way of communicating a specific meaning or
intent. The paper proposes a sign language interpretation system using a wearable hand glove.
This wearable system uses five flex sensors, two pressure sensors, and a three-axis inertial
motion sensor to differentiate the characters of the American Sign Language alphabet. The
whole system consists of three units: a wearable device with a sensor module, a processing
module, and a display unit/mobile application module. An Android-based mobile application
was developed with a text-to-voice function that converts the received text into audible output.

In [2], the motivation is to help the speech-impaired community by developing an electronic
speaking system. An Arduino is the main control unit of the system; it was programmed in
such a way that configuration settings can readily be changed without changing the entire
program code.
In [3], the mute person goes through the complete sentence which he wants to communicate
to others. The sentence read by the person is then translated into speech, which is audible
and understandable to everyone.

In [4], a database-driven hand gesture recognition scheme based on a skin-colour-model
approach and a thresholding approach, along with effective template matching, is proposed,
which can be used for human-robotics and similar applications. Initially, the hand region is
segmented by applying a skin colour model in the YCbCr colour space. In the next stage,
thresholding is applied to separate the foreground from the background. Finally, Principal
Component Analysis is used for template-based matching in recognition.

In [5], different studies are analyzed and evaluated to see how the device can reduce the
difficulty in communication among people with hearing and speech disabilities, and to find out

the limitations of the device in comparison to other technologies and devices working towards
a similar objective. Such people communicate with others only through hand motions and
expressions; the authors therefore designed an artificial speaking mouth for dumb people.
This also helps other people to understand impaired people.

In [6], communication between impaired people and normal people is generally carried out
through sign language, which the system converts into synthesized speech. Using flex sensors
and an Arduino Mega 2560 microcontroller, the information is converted into voice
commands, so that impaired people can communicate with normal people [1].

CHAPTER-3 PROPOSED METHODOLOGY

3.1 Introduction

The system involves the use of flex sensors to detect hand movements and convert them into
signals. These signals are then processed by an Arduino Uno microcontroller to recognize the
corresponding hand gesture. The recognized gesture is then mapped to a specific word or
phrase stored in the APR9600 voice recording module. The module plays the recorded sound
of the word or phrase through a speaker. Additionally, an LCD screen is used to display the
recognized gesture and the corresponding word or phrase. This system aims to provide a
versatile and easy-to-use communication solution for people with speech impairments or
communication difficulties.

3.2 Block diagram of the model

The figure shows the basic block diagram of the proposed model


Fig (2): Block diagram of the model

3.2.1 Power supply


A power supply is an essential component because all electronic components in the system
require a stable and reliable source of power to operate correctly. The power supply powers
the microcontroller, the flex sensors, and the other electronic components used to detect
gestures and generate speech output. It must be chosen carefully to meet the requirements of
the different components; for example, its voltage and current ratings must be appropriate for
the microcontroller and any other electronic components used.
3.2.2 Arduino UNO

Hardware Setup:
Arduino Uno: You'll need an Arduino Uno board as the main microcontroller for this project.
Gesture Sensor: Choose a suitable gesture sensor module, such as an accelerometer or a gesture
recognition sensor, to detect hand movements.
Power Supply: Ensure you have a power supply for the Arduino board and the gesture sensor.

Software Setup:


Install the Arduino IDE (Integrated Development Environment) on your computer.


Connect the Arduino Uno to your computer via USB.

Connect the Hardware:


Connect the gesture sensor module to the appropriate pins on the Arduino Uno, following the
sensor's documentation.
Double-check the wiring and connections to ensure everything is properly connected.

3.2.3 Flex sensors:


Hardware Setup:

Arduino Uno: You'll need an Arduino Uno board as the main microcontroller.

Flex Sensors: Choose suitable flex sensors that can measure the bending of fingers.

Power Supply: Ensure you have a power supply for the Arduino board and the flex sensors.

Resistors: Use resistors to create a voltage divider circuit for the flex sensors.

Connecting wires: Connect the flex sensors to the Arduino Uno using wires.

Software Setup:

Install the Arduino IDE (Integrated Development Environment) on your computer.

Connect the Arduino Uno to your computer using a USB cable.

Connect the Hardware:


Connect the flex sensors to the Arduino Uno following the wiring diagram and pin
assignments. Create a voltage divider circuit for each flex sensor by connecting a resistor
in series with the sensor, and connect the common point between them to one of the
Arduino's analog input pins.

Ensure that the connections are secure and there are no loose wires.


3.2.4 APR9600 module


The APR9600 is a voice recording and playback module that can store and play back audio
messages in a variety of applications. The APR9600 module is used to record and store
pre-recorded audio messages that correspond to the hand gestures recognized by the system.
When a particular hand gesture is recognized, the corresponding audio message is played back
to communicate the intended message. This makes it easier for individuals with communication
difficulties to convey their thoughts and needs.

3.2.5 Speaker
The speaker is an essential component as it is used to play back the audio messages stored in
the APR9600 module. When a particular hand gesture is recognized by the system, the
corresponding audio message is played through the speaker, allowing individuals with
communication difficulties to convey their thoughts and needs effectively. The speaker also
allows for the system to be used in various environments, such as noisy environments, where
the audio output needs to be loud and clear. Therefore, the speaker plays a crucial role in
making the speaking system more effective and user-friendly.

3.2.6 LCD
The LCD is used to display the converted text from the hand gestures made by the user. The
system recognizes the hand gestures made by the user using flex sensors and converts them
into text using an algorithm. The converted text is then displayed on the LCD screen for the
user and the listener to see, making communication easier. The LCD screen can display
characters, numbers, and symbols, which makes it an ideal component to display the text output
from the system. Additionally, the LCD screen is easy to read and is low-power, which makes
it suitable for use in a portable and versatile speaking system.
3.2.7 HC05
The HC05 is a Bluetooth serial module. In this system it transmits the recognized gesture text
wirelessly from the Arduino to a paired mobile phone, as shown in the block diagram.

3.3 Applications of proposed methodology


Assistive Technology for Special Education: The system can be used in special education
settings, where students with disabilities can use the system to communicate with teachers and
peers.

Communication Assistance: The hand gesture recognition system can be used to assist deaf
and mute individuals in communicating with others. By recognizing and interpreting their hand


gestures, the system can convert them into corresponding text or speech output, enabling
effective communication with hearing individuals.

Education and Learning: The methodology can be integrated into educational tools for deaf
and mute individuals. It can help them learn sign language more effectively by providing
real-time feedback on the accuracy of their gestures. This can be particularly useful for
self-learning or for practicing sign language outside formal classroom settings.

Human-Computer Interaction: The hand gesture recognition system can be employed as a
means of interacting with computers and other electronic devices. Deaf and mute individuals
can use hand gestures to control devices, navigate user interfaces, and perform tasks without
relying on traditional input methods like keyboards or mice.


Rehabilitation and Therapy: Hand gesture recognition can be utilized in rehabilitation
programs for individuals with speech or hearing impairments. It can assist therapists in
monitoring and assessing patients' progress by analyzing their hand gestures during
therapy sessions.

CHAPTER-4 IMPLEMENTATION DETAILS

4.1 Flow chart of Proposed Method


Fig (3): Flow chart of the Proposed model

The above flow chart shows us the basic working flow of the proposed model

• Initialization: When the system is turned on, it starts by initializing all the components,
such as the Arduino, flex sensors, speaker, and LCD.
• Calibration: The flex sensors are calibrated to map the analog output to the
corresponding finger movement. This step involves measuring the minimum and
maximum resistance values of the flex sensors to determine the range of motion for
each finger.
• The Arduino reads the analog signal from each flex sensor using its built-in
analog-to-digital converter (ADC).
• Flex sensors are variable resistors that change their resistance based on the amount of
bending or flexing they experience. The resistance of a flex sensor is measured by
applying a small voltage across its two terminals and measuring the resulting voltage
drop with the Arduino's ADC. The ADC converts the analog voltage from the flex
sensor into a digital value that can be processed by the Arduino; this value represents
the relative position, or degree of bending, of the sensor.


• Gesture Recognition: After calibration, the system is ready to recognize hand gestures.
The user wears the flex sensors on their fingers and moves their fingers in different
patterns to form various gestures. The flex sensors measure the resistance values of the
fingers, which are then converted into digital signals by the Arduino.
• Signal Processing: The digital signals are processed by the Arduino using signal
processing techniques to remove noise and improve the accuracy of the gesture
recognition.
• Gesture Mapping: The digital signals are then mapped to specific gestures based on the
pre-defined mapping table. The mapping table is a lookup table that maps the digital
signals to their corresponding hand gestures.
• Text-to-Speech Conversion: Once the hand gesture is recognized, the corresponding
text is displayed on the LCD screen, and the system converts the text into speech using
the APR9600 module.
• Audio Output: The speech is then output through the speaker, allowing the user to
communicate with others.
• Shutdown: Once the communication is complete, the system shuts down, and all the
components are turned off.

4.2 Hardware Components:

4.2.1 Arduino UNO

Fig (4): Arduino UNO


The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital
input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz crystal
oscillator, a USB connection, a power jack, an ICSP header, and a reset button. It contains
everything needed to support the microcontroller; simply connect it to a computer with a USB
cable, or power it with an AC-to-DC adapter or battery, to get started. “Uno” means one in
Italian and was chosen to mark the release of Arduino 1.0.

The Uno and version 1.0 are the reference versions of Arduino. The Uno is the latest in a
series of USB Arduino boards, and the reference model for the Arduino platform.

The Arduino Uno can be powered via the USB connection or with an external power supply;
the power source is selected automatically. The board can operate on an external supply of
6 to 20 volts. If supplied with less than 7 V, however, the 5V pin may supply less than five
volts and the board may be unstable. If more than 12 V is used, the voltage regulator may
overheat and damage the board. The recommended range is 7 to 12 volts.

Table 1: TECHNICAL SPECIFICATION OF ARDUINO UNO (ATmega328)

Fig (5): Pin Configuration of Arduino Uno (ATmega328)

The power pins are as follows:


• VIN: The input voltage to the Arduino board when it’s using an external power source
(as opposed to 5 volts from the USB connection or other regulated power source). You
can supply voltage through this pin, or, if supplying voltage via the power jack, access
it through this pin.
• 5V: The regulated power supply used to power the microcontroller and other
components on the board. This can come either from VIN via an on-board regulator, or
be supplied by USB or another regulated 5V supply.
• 3V3: A 3.3-volt supply generated by the on-board regulator. Maximum current draw is
50mA.
• GND: Ground pin
The ATmega328 has 32 KB of flash memory for storing code (of which 0.5 KB is used by the
bootloader). It also has 2 KB of SRAM and 1 KB of EEPROM (which can be read and written
using the EEPROM library).
Each of the 14 digital pins on the Uno can be used as an input or output using the pinMode(),
digitalWrite(), and digitalRead() functions. They operate at 5 volts. Each pin can provide or
receive a maximum of 40 mA and has an internal pull-up resistor (disconnected by default) of
20-50 kOhm. In addition, some pins have specialized functions:

Serial: 0 (RX) and 1 (TX). Used to receive (RX) and transmit (TX) TTL serial data. These pins
are connected to the corresponding pins of the ATmega8U2 USB-to-TTL serial chip.

External interrupts: 2 and 3. These pins can be configured to trigger an interrupt on a low
value, a rising or falling edge, or a change in value.

PWM: 3, 5, 6, 9, 10, and 11. Provide 8-bit PWM output with the analogWrite() function.

SPI: 10 (SS), 11 (MOSI), 12 (MISO), 13 (SCK). These pins support SPI communication, which,
although provided by the underlying hardware, is not currently included in the Arduino
language.
LED: 13. There is a built-in LED connected to digital pin 13. When the pin is HIGH, the LED
is on; when the pin is LOW, it is off.

TYPES OF INPUT
1. Direct manner - inputs wired directly to the UNO board.
2. Semi-manual manner - inputs provided by a switch supplying 0 V or +5 V.


3. Automatic manner - inputs provided by digital sensors.


GPIO (General purpose input/output)
GPIO is a type of pin found on an integrated circuit that does not have a specific function.
While most pins have a dedicated purpose, such as sending a signal to a certain component, the
function of a GPIO pin is customizable and can be controlled by software.
4.2.2 Flex Sensors

Fig (6): Flex sensors

A flex sensor is a kind of sensor used to measure the amount of deflection or bending. It is
made from materials such as plastic and carbon: a carbon surface is arranged on a plastic
strip, and when the strip is bent, the sensor's resistance changes. For this reason it is also
called a bend sensor. Because the change in resistance is directly proportional to the amount
of bend, it can also be employed as a goniometer.

The pin configuration of the flex sensor is shown below. It is a two-terminal device, with
terminals P1 and P2. The sensor has no polarized terminals (unlike a diode or an electrolytic
capacitor), meaning there is no positive or negative terminal. The voltage required to activate
the sensor ranges from 3.3 V to 5 V DC, which can be obtained from any type of interfacing.


4.2.3 APR9600

Fig (7): APR9600 IC

The APR9600 device offers true single-chip voice recording, non-volatile storage, and
playback capability for 40 to 60 seconds. The device supports both random and sequential
access of multiple messages. Sample rates are user-selectable, allowing designers to customize
their design for unique quality and storage-time needs. The integrated output amplifier,
microphone amplifier, and AGC circuits greatly simplify system design. The device is ideal for
use in portable voice recorders, toys, and many other consumer and industrial applications.

4.2.4 Speaker

Speakers are one of the most common output devices used with computer systems. Some
speakers are designed to work specifically with computers, while others can be hooked up to
any type of sound system. Regardless of their design, the purpose of speakers is to produce
audio output that can be heard by the listener.

4.2.5 LCD (Liquid crystal display)

A liquid-crystal display is a flat-panel display or other electronically modulated optical device
that uses the light-modulating properties of liquid crystals. Liquid crystals do not emit light
directly, instead using a backlight or reflector to produce images in colour or monochrome.


Fig (9): 16x2 LCD

1. A 16x2 LCD is named so because it has 16 columns and 2 rows. Many combinations are
available, such as 8x1, 8x2, 10x2, and 16x1, but the most used one is the 16x2 LCD.
2. All the above-mentioned LCD displays have 16 pins, and the programming approach is
also the same. The pin description of the 16x2 LCD module is shown above.

4.2.6 HC-05 Bluetooth Module

The HC-05 is a popular module which can add two-way (full-duplex) wireless
functionality to your projects. You can use this module to communicate
between two microcontrollers like the Arduino, or with any device that has
Bluetooth functionality, such as a phone or laptop. Many Android
applications are already available that make this process a lot easier.


4.2.7 Hardware Connections

Fig (10): Hardware of the model

Input:

Power supply:

• Connect the 12V adapter to the power supply


• Connect the J2 connector of the power supply to the VIN and GND pins of the Arduino.

Flex sensor connection:

• Connect one end of the flex sensor to the 5V pin of Arduino.


• Connect the other end of the flex sensor to the A0 analog input pin of Arduino.
• Connect a resistor between the A0 pin and ground pin of Arduino.
• Thus, each of the 4 flex sensors has one end at 5 V and the other end connected as an
input to the Arduino on pins A0, A1, A2, and A3 respectively, each with its own resistor
to ground.

Output:

LCD connection:

• Connect the VSS pin of the LCD to GND pin of the Arduino.
• Connect the VDD pin of the LCD to 5V pin of the Arduino.


• The control and data pins of the LCD are connected to pins A4, A5, 4, 5, 6, and 7 of
the Arduino respectively.

APR9600 connection:

• Connect the VCC pin of the APR9600 to the 5V pin of the Arduino.
• Connect the GND pin of the APR9600 to the GND pin of the Arduino.
• Messages in the APR module are stored behind different pins; message pins M1-M7 are
connected to pins 8, 9, 10, 11, 12, 2, and 3 of the Arduino.

Speaker connection:

• Connect the positive (+) terminal of the speaker to the SP+ pin of the APR9600 module.
• Connect the negative (-) terminal of the speaker to the SP- pin of the APR9600 module.

HC-05 Bluetooth connection:

• Connect the VCC pin of the HC-05 to the 3.3V pin of the Arduino.
• Connect the GND pin of the HC-05 to any GND pin of the Arduino.
• Connect the RX pin of the HC-05 to the TX pin of the Arduino.
• Connect the TX pin of the HC-05 to the RX pin of the Arduino.

TTS Bluetooth application:

• Turn on Bluetooth on the device on which you need to see the output.
• Select the device HC-05 in the application and pair with it.
• Enter the code 1234 to pair the devices.

Dept of ECE, TOCE 19


IoT-Based Messaging Device for Deaf, and Dumb People

4.2.8 Record/Play APR voice module

Fig (11): APR voice module with REC/PLAY switch & MIC

Recording

• The APR module has a REC/PLAY switch.
• To record audio, the switch is set to REC mode.
• Take any message pin wire and touch it to the ground pin of the circuit; a beep sounds,
after which you can start recording a voice message by speaking near the mic in the
circuit.
• The user can record any sort of voice message as required, in any language: English,
Kannada, Telugu, Tamil, Malayalam, etc.
• Once the voice is recorded, take the message pin wire off the ground; the message is
saved in the voice module. Then set the switch back to PLAY mode.
• The same procedure is followed to record all messages.

Playing

• To play a voice message, the switch should always be in PLAY mode.
• To play any recorded message, its pin can be touched to the ground of the circuit.
• Once the wire touches ground, the voice message is heard through the speaker.
• In the code, the message pins are always kept HIGH; when the Arduino detects a hand
gesture, it pulls the matching gesture's message pin LOW (i.e., to ground), which acts
as the trigger, and the voice message is played through the speaker. Once the message
has played, the Arduino pulls the pin HIGH again.


4.3 Software Implementation

4.3.1 Introduction

Fig (12): Arduino IDE

The Arduino Integrated Development Environment - or Arduino Software (IDE) - contains a
text editor for writing code, a message area, a text console, a toolbar with buttons for common
functions, and a series of menus. It connects to the Arduino hardware to upload programs and
communicate with them.
• The Arduino IDE is open-source software that is mainly used for writing code and
compiling it onto an Arduino module.
• It is the official Arduino software, making code compilation so easy that even a person
with no prior technical knowledge can get their feet wet with the learning process.
• It is available for operating systems such as macOS, Windows, and Linux, and runs on
the Java platform. It comes with built-in functions and commands that play a vital role
in debugging, editing, and compiling code in the environment.
• A range of Arduino modules is available, including the Arduino Uno, Arduino Mega,
Arduino Leonardo, Arduino Micro, and many more.
• Each of them contains a microcontroller on the board that is actually programmed and
accepts the information in the form of code.


• The main code, also known as a sketch, created on the IDE platform ultimately
generates a hex file, which is then transferred and uploaded to the controller on the
board.
• The IDE environment mainly contains two basic parts: the editor and the compiler. The
former is used for writing the required code, and the latter for compiling and uploading
the code to the given Arduino module.
• This environment supports both the C and C++ languages.

4.3.2 Arduino Code Algorithm

In the following, the variables represent:

• a: value of the first flex sensor (fs1)
• b: value of the second flex sensor (fs2)
• c: value of the third flex sensor (fs3)
• d: value of the fourth flex sensor (fs4)

The following is the Arduino code logic algorithm:

1. START
2. Initialize the LiquidCrystal library for LCD display.
3. Define analog input pins fs1, fs2, fs3, and fs4 and output pins Voice1, Voice2, Voice3,
Voice4, Voice5, Voice6, and Voice7.
4. Set the output pins for Voice1-Voice7 to HIGH.
5. Clear the LCD and display a message for 2 seconds.
6. Clear the LCD and display another message for 2 seconds.
7. Enter a continuous loop.
• Read the analog value of fs1 and store it in variable a. Display the value of a on
the LCD, then delay.
• Read the analog value of fs2 and store it in variable b. Display the value of b on
the LCD, then delay.
• Read the analog value of fs3 and store it in variable c. Display the value of c on
the LCD, then delay.
• Read the analog value of fs4 and store it in variable d. Display the value of d on
the LCD, then delay.


• If a <= 100 and b >= 100 and c >= 100 and d >= 100, set Voice1 to LOW and
display "I NEED HELP!!" on the LCD for 2 seconds.

• If a >= 100 and b <= 100 and c >= 100 and d >= 100, set Voice2 to LOW and
display "I WANT WATER" on the LCD for 2 seconds.

• If a >= 100 and b >= 100 and c <= 100 and d >= 100, set Voice3 to LOW and
display "I WANT FOOD" on the LCD for 2 seconds.

• If a <= 100 and b <= 100 and c >= 100 and d >= 100, set Voice4 to LOW and
display "I AM NOT FEELING WELL" on the LCD for 2 seconds.

• If a >= 100 and b >= 100 and c <= 100 and d <= 100, set Voice5 to LOW and
display "I AM FINE" on the LCD for 2 seconds.

• If a <= 100 and b <= 100 and c <= 100 and d >= 100, set Voice6 to LOW and
display "I WANT TO GO TO TOILET" on the LCD for 2 seconds.

• If a <= 100 and b <= 100 and c <= 100 and d <= 100, set Voice7 to LOW and
display "THANK YOU" on the LCD for 2 seconds.

8. If none of the conditions is met, do nothing.


9. Set all output pins (Voice1-Voice7) back to HIGH.


CHAPTER-5

RESULT ANALYSIS
Hand gestures

GESTURE          MESSAGE
Idle             (no message)
1st gesture      "I NEED HELP"
2nd gesture      "I WANT WATER"
3rd gesture      "I WANT FOOD"
4th gesture      "I AM NOT FEELING WELL"
5th gesture      "I AM FINE"
6th gesture      "I WANT TO GO TO TOILET"
7th gesture      "THANK YOU"

Table (3): Various hand gestures



A Versatile Speaking System for Anyone with Communication Difficulties Based on Hand Gestures

CHAPTER-6

CONCLUSION AND FUTURE SCOPE

6.1 CONCLUSION

• The proposed versatile speaking system for anyone with communication difficulties
based on hand gestures has been successfully developed and tested in this project. The
system uses flex sensors to detect hand gestures and an Arduino Uno microcontroller
to process the data and generate corresponding speech signals. The APR9600 module
and speaker are used for sound output, while an LCD display provides visual feedback.

• The system has been tested with various hand gestures and has shown high accuracy in
recognizing the intended gestures and generating the corresponding speech output. This
system has wide applications in helping individuals with communication difficulties
due to speech impairment, physical disabilities, or other medical conditions.

• In conclusion, the developed system provides a cost-effective, user-friendly, and
efficient solution for speech-impaired individuals, enabling them to communicate
effectively with others using hand gestures. Further improvements and enhancements
can be made to this system, such as adding more gestures, improving the accuracy of
recognition, and integrating it with other assistive technologies.

6.2 FUTURE SCOPE

• Improvement in accuracy: The accuracy of hand gesture recognition can be improved
by exploring different machine learning algorithms, deep learning techniques, and
feature extraction methods.
• Gesture set expansion: Expanding the gesture set to include more hand gestures could
improve the communication capabilities of the system.


• Multimodal integration: Combining hand gesture recognition with other modalities,
such as facial expression recognition, eye gaze tracking, or voice recognition, could
enhance the system's performance.
• Real-time gesture recognition: Developing a real-time hand gesture recognition
system that can detect and recognize gestures in real-time could make the system more
user-friendly.
• Customization: Customization of the system to accommodate different languages,
dialects, and cultures could make the system more versatile.
• Integration with other devices: Integrating the system with other devices, such as
smartphones or tablets, could make it more accessible and convenient to use.

• Field trials and user studies: Conducting field trials and user studies to evaluate the
effectiveness and user satisfaction of the system could provide valuable feedback for
further improvements.

REFERENCES

[1]. H. R. Mundhe, S. D. Patil, V. J. Mhatre, M. Nair, and J. Rane, "Hand Gesture Analyzing and Speaking
System for Mute People," International Journal of Research in Engineering and Science (IJRES), vol. 10,
no. 6, pp. 986-991, 2022.
[2]. L. Zhao et al., "Hand Gesture Recognition for Human-Computer Interaction: A Review of Recent
Advances," 2021.
[3]. R. K. Yadav et al., "Design and Implementation of a Hand Gesture Recognition System using Machine
Learning and Sensor Fusion Techniques," 2021.
[4]. S. Mishra and R. K. Singh, "Gesture Recognition using Machine Learning Techniques: A Comprehensive
Survey," 2020.
[5]. A. V. Mohan and S. D. Selvaraj, "Design and Development of an Assistive Hand Gesture Recognition
System for People with Disabilities," 2020.
[6]. Suri et al., "Development of Sign Language using Flex Sensors," International Conference on Smart
Electronics and Communication (ICOSEC), pp. 102-106, 2020.
[7]. R. K. Singh et al., "Hand Gesture Recognition for Sign Language Interpretation using a 3D Depth
Sensor," 2020.
[8]. M. Yasen and S. Jusoh, "A Systematic Review on Hand Gesture Recognition Techniques, Challenges and
Applications," PeerJ Comput. Sci., vol. 5, p. e218, Sep. 2019.
[9]. X. Zhao et al., "Gesture Recognition for American Sign Language Using a Soft Wearable Sensing
Glove," 2019.
