Artificial Intelligence Based Visually Impaired Assist System
Deepti Shinghal
Regular Member, International Association for Engineering & Technology, Singapore
[email protected]

Kshitij Shinghal* (*corresponding author)
Dept. of E&C Engg., Moradabad Institute of Technology, Moradabad, U.P., India
[email protected]

Shuchita Saxena
Dept. of E&C Engg., Moradabad Institute of Technology, Moradabad, U.P., India
[email protected]
Amit Saxena
Dept. of E&C Engg., Moradabad Institute of Technology, Moradabad, U.P., India
[email protected]

Nishant Saxena
Dean Academics, Tula's Institute, Dehradun, Uttarakhand, India
[email protected]

Amit Sharma
Dept. of E&C Engg., Teerthanker Mahaveer University, Moradabad, U.P., India
[email protected]

Abstract— In the present work, a system is proposed that addresses the requirement for visually impaired friendly buildings. In the current scenario, when a visually impaired person enters a building that is Visually Impaired (VI) friendly, an attendant hands over a braille-based navigation chart or an electronic guide system. The proposed system automatically detects a visually impaired person, makes an announcement, and generates an alert from the basket where braille-based guide maps for VI persons are kept. The system was tested and is able to detect blind persons with good accuracy.
Keywords — Visually Impaired, Artificial Intelligence, Machine Learning, Raspberry Pi-4
I. INTRODUCTION

According to the National Library of Medicine, Community Eye Health, more than 500 million people are visually impaired worldwide. Of these, 47 million are fully blind and the remaining 453 million suffer from moderate to severe visual impairment. Several organizations and NGOs are working to design buildings and premises, especially recreation zones and educational institutes, with navigation aids for people who are visually impaired. Today, with these specially designed navigational facilities in buildings, visually impaired people live increasingly independent lifestyles. With modern navigation facilities, blind people are able to go out into the community with the help of technology and assistive equipment and maneuver alongside people with normal vision. Vision is one of the most important human senses. People who do not have vision face many problems in their daily routine activities and in navigating from one place to another. Therefore, this paper proposes a technique for persons who cannot see or who have only mild vision. The system offers various functionalities in both indoor and outdoor environments and provides virtual help to the users. Most of the devices available in the market have issues such as limited performance, high cost, and complexity, so a system is needed that can overcome these issues and work efficiently and accurately. Visually impaired people face many challenges whenever they perform a task, especially in navigating their path. Thus a need arises to build systems or models that could increase the range of assistance provided to them so that they can move independently.

Figure 1. (a) Blind person navigating with the aid of a stick and guide dog; (b) blind person navigating with the help of technology and assistance devices

The present work gives the design and evaluation of an artificial intelligence and machine learning based visually impaired assist system. The proposed system has a dedicated Raspberry Pi-4 embedded in it. To deploy it in a campus that is visually impaired friendly, a Raspberry Pi based module is loaded with a TensorFlow visually impaired person detection model. This TensorFlow model is able to recognize visually impaired persons on the basis of objects carried by them, such as blind glasses, sticks, and guide dogs. It also has a custom visually impaired person detection module to recognize specific persons, for example on the basis of pre-issued ID cards, wrist bands, e-watches, or e-sticks for the blind. The proposed models are made to run faster using Coral to reduce the response time.

Figure 2. Proposed AI based visually impaired assist system
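To illustrate the Coral based acceleration mentioned above, the following minimal Python sketch loads a TensorFlow Lite detection model on a Raspberry Pi and attaches the Edge TPU delegate when a Coral accelerator is present. The model file name is an assumption for illustration, not an artifact of the reported system.

```python
# Minimal sketch: load a TFLite detector and, if available, the Coral Edge TPU delegate.
# "vi_detect.tflite" is an assumed file name, not the authors' released model.
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL_PATH = "vi_detect.tflite"  # assumed path to the converted detection model


def make_interpreter(model_path: str = MODEL_PATH) -> Interpreter:
    """Return a TFLite interpreter, accelerated by the Edge TPU when present."""
    try:
        # libedgetpu.so.1 is provided by the Coral runtime on Raspberry Pi OS;
        # delegate execution also requires an Edge TPU compiled model.
        return Interpreter(
            model_path=model_path,
            experimental_delegates=[load_delegate("libedgetpu.so.1")],
        )
    except (ValueError, OSError):
        # Fall back to CPU execution if no Coral accelerator is attached.
        return Interpreter(model_path=model_path)


if __name__ == "__main__":
    interp = make_interpreter()
    interp.allocate_tensors()
    print("Input shape:", interp.get_input_details()[0]["shape"])
```

Falling back to the plain CPU interpreter keeps the same code path usable on a Raspberry Pi without the accelerator, at the cost of a longer response time.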
Figure 2 depicts the proposed visually impaired assist system. As soon as a visually impaired person enters a VI friendly campus equipped with the proposed design, the module deployed at the entry point, operating in polling mode, continuously senses and checks each visitor and compares the input with the training data available in its database. The system module keeps training and upgrading itself, thereby improving the accuracy of the system. On detection of a visually impaired person, the proposed module announces a message asking the blind person to pick up a braille-based navigation card placed in a smart basket near the entry point. The smart basket is equipped with a buzzer that produces a sound to make it easier for the visually impaired person to locate it. Taking the card or electronic navigation system from the basket, the visually impaired person can easily navigate through large premises independently, without the help of other persons, and move through the campus or building of his own free will alongside people with normal vision.
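The following hedged Python sketch shows one way such an entry-point alert could be actuated on a Raspberry Pi: a GPIO-driven buzzer on the smart basket plus a spoken announcement. The pin number, the use of the espeak text-to-speech command, and the function names are illustrative assumptions, not details taken from the implemented system.

```python
# Hedged sketch of the entry-point alert: spoken announcement plus buzzer pulses.
# Pin number and espeak usage are assumptions for illustration only.
import subprocess
import time

import RPi.GPIO as GPIO

BUZZER_PIN = 18  # assumed BCM pin wired to the smart-basket buzzer


def setup() -> None:
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(BUZZER_PIN, GPIO.OUT, initial=GPIO.LOW)


def announce_and_buzz(message: str, beeps: int = 3) -> None:
    """Speak a guidance message and pulse the basket buzzer so it can be located by sound."""
    # espeak is a common command-line TTS tool on Raspberry Pi OS (assumed available).
    subprocess.run(["espeak", message], check=False)
    for _ in range(beeps):
        GPIO.output(BUZZER_PIN, GPIO.HIGH)
        time.sleep(0.3)
        GPIO.output(BUZZER_PIN, GPIO.LOW)
        time.sleep(0.3)


if __name__ == "__main__":
    setup()
    announce_and_buzz("Please collect a braille navigation card from the basket near the entry.")
    GPIO.cleanup()
```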
The paper is organized as follows. A brief review of the available systems is presented in Section II, Section III describes the simulation of the proposed system, the obtained results are discussed in Section IV, and the conclusions and future developments are described in Section V.

II. LITERATURE REVIEW
G. Wang et al. proposed a unique IoT based model for blind persons that helps them coordinate with local traffic and guides them along the route to their destination [1]. S. Zafar et al. discussed the various assistive devices available for blind persons; these devices work on the principles of IoT and machine learning, and a comparative study of their effectiveness is carried out so that the most suitable device can be selected for a given condition [2]. V. Bharati proposed an assistive technology for visually impaired persons in which the assistive device guides them and gives them instructions to follow the track; the system uses sensors, cameras, and the principles of artificial intelligence [3]. U. Das et al. proposed a reliable, low-cost wayfinding system for persons who cannot see to locate the track to their destination; the model uses image processing techniques to give information about the path [4]. G. Dimas et al. proposed a system based on object detection algorithms to locate the path to the destination; in simulation, high-risk areas are detected as obstacles [5]. P. Chitra et al. proposed a wearable device for blind people to navigate, using light detection sensors, vibration sensors, and an audio system for sound generation to guide the user [6]. D. M. L. V. Dissanayake et al. proposed a design for visually impaired people to track their path in indoor environments; it uses Bluetooth technology, sensors, and machine learning algorithms to interface the different units of the model [7]. Q. Wang et al. proposed a model that could replace the use of guide dogs for blind people, guiding them to their destination and helping them navigate; the model is based on image recognition, motion sensors, and artificial intelligence [8]. N. Kumar et al. proposed a system that assists visually impaired (VI) persons in navigating an environment; it uses the YOLO algorithm and image processing tools to detect the path to the destination, and the system was found to work with more accuracy and efficiency compared to a wearable mask and stick [9]. C. H. Chen et al. designed a unit comprising RNN neural network technology and WiFi, with cameras and microphones as the hardware, that guides blind people in path detection [10]. N. E. Shandu et al. presented a novel approach and assistive help for blind people using a Raspberry Pi, sensors, and GPS; the designed aid is a walking stick that helps the person navigate [11]. O. Gamal et al. proposed an outdoor navigation system for visually impaired persons using deep learning methods and ICT technologies [12]. P. S. Rajendran et al. provided a novel approach for blind persons to trace their location and move freely without any problem; the paper gives design ideas for wearable smart glasses that send audio commands to the user [13]. M. A. Khan Shishir et al. proposed an Android application that can run on any smartphone to help Visually Impaired People (VIP) trace their location; it uses a machine learning algorithm for path finding [14]. P. T. Mahida et al. presented an idea to build a system for persons who cannot see and who face problems in detecting locations; the idea includes the use of sensors and Bluetooth technology and makes the device work in dark areas as well [15]. S. M. Felix et al. designed a system in which blind persons can interact with their surroundings through artificial intelligence, audio aids, and sensors [16]. K. Yang et al. proposed a model using a deep learning architecture for pedestrians who cannot see (visually impaired); experimental analysis shows the proposed model to be more accurate and efficient than existing systems [17].
III. SIMULATION SETUP AND PROPOSED METHODOLOGY

For simulating and evaluating the performance of the proposed model, TensorFlow Lite is used. TensorFlow Lite is a library for running TensorFlow machine learning models on edge devices such as the Raspberry Pi, Android phones, or even web browsers. To run TensorFlow, the Raspberry Pi operating system is needed to load the Pi camera drivers. The algorithm for blind person detection is encoded in Python with a .py extension. On executing the Python detection script (detect.py), the module detects visually impaired persons, because the visually impaired person detection model is trained on a data set of visually impaired persons carrying one or more of the objects typically carried by blind persons. Therefore, on detection of any object such as blind goggles, a blind stick, a guide dog, an e-stick, or an e-navigation band for the visually impaired, the module detects a visually impaired person and activates the announcement system and smart basket. Figure 3 shows a Raspberry Pi with the Pi camera.

Figure 3. Raspberry Pi-4 with Pi camera
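As a rough illustration of this detection step, the sketch below captures camera frames, runs a TFLite SSD-style detector (the assumed vi_detect.tflite model from the earlier sketch), and reports when a label associated with a visually impaired person is seen with sufficient confidence. The label set, score threshold, output tensor ordering, and OpenCV capture are assumptions for illustration, not the authors' exact detect.py.

```python
# Hedged sketch of the polling-mode detection step: camera frame -> TFLite detector -> alert.
# Model path, label map, and threshold are illustrative assumptions.
import time

import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

MODEL_PATH = "vi_detect.tflite"  # assumed model file (quantized uint8 detector assumed)
LABELS = ["person", "white_cane", "guide_dog", "blind_goggles", "e_stick"]  # assumed label order
VI_LABELS = {"white_cane", "guide_dog", "blind_goggles", "e_stick"}         # VI-associated objects
SCORE_THRESHOLD = 0.6


def detect_vi_person(interpreter: Interpreter, frame: np.ndarray) -> bool:
    """Return True when any VI-associated object is detected in the frame."""
    in_detail = interpreter.get_input_details()[0]
    _, height, width, _ = in_detail["shape"]
    rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
    interpreter.set_tensor(in_detail["index"],
                           np.expand_dims(rgb, axis=0).astype(in_detail["dtype"]))
    interpreter.invoke()
    out = interpreter.get_output_details()
    # Typical SSD TFLite output ordering (boxes, classes, scores, count); may differ per model.
    classes = interpreter.get_tensor(out[1]["index"])[0]
    scores = interpreter.get_tensor(out[2]["index"])[0]
    for cls, score in zip(classes, scores):
        if score >= SCORE_THRESHOLD and LABELS[int(cls)] in VI_LABELS:
            return True
    return False


if __name__ == "__main__":
    interpreter = Interpreter(model_path=MODEL_PATH)  # Edge TPU delegate could be added here
    interpreter.allocate_tensors()
    cap = cv2.VideoCapture(0)  # Pi camera exposed through V4L2 (assumed)
    while True:
        ok, frame = cap.read()
        if ok and detect_vi_person(interpreter, frame):
            print("Visually impaired person detected: trigger announcement and smart basket")
        time.sleep(0.1)  # simple polling interval
```

On a positive detection, this loop would hand off to the announce-and-buzz routine sketched in Section I so that the smart basket can be located by sound.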
Figure 4 depicts a pictorial view of the TensorFlow shell interface. Figure 5 depicts training of the visually impaired person detection model with the TensorFlow VI person detect API.

Figure 4. TensorFlow shell interface
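Once such a model is trained, it has to be exported for on-device inference on the Raspberry Pi. The snippet below is a hedged sketch of a standard TensorFlow Lite conversion step; the SavedModel directory and the vi_detect.tflite output name are assumptions for illustration, not paths reported by the authors.

```python
# Hedged sketch: convert a trained detection SavedModel to TensorFlow Lite for the Raspberry Pi.
# Directory and file names are illustrative assumptions.
import tensorflow as tf

SAVED_MODEL_DIR = "exported_model/saved_model"  # assumed export directory of the trained detector
TFLITE_PATH = "vi_detect.tflite"                # assumed output file deployed to the Pi

converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # weight quantization to shrink the model
tflite_model = converter.convert()

with open(TFLITE_PATH, "wb") as f:
    f.write(tflite_model)
print(f"Wrote {len(tflite_model)} bytes to {TFLITE_PATH}")
```

Running the result on a Coral accelerator would typically require a further pass through Google's Edge TPU compiler.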
IV. RESULTS AND DISCUSSIONS

The evaluation metrics for the visually impaired person detect API are installed and loaded for evaluation in TensorFlow. The VI person detect API needs a set of metrics suited to visually impaired person detection: mean average precision, precision, and recall.

Figure 5. Training model of the VI person detect API

Figure 6. Detection boxes mean average precision plot

Figure 6 (a)-(e) shows the TensorBoard plots of detection-box mean average precision. A change in the curve can be observed in each plot when any of the objects associated with a visually impaired person is detected. Figure 7 shows real-time detection of objects associated with a visually impaired person.
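For reference, the following self-contained Python sketch shows how precision, recall, and average precision of such detections can be computed from predicted and ground-truth boxes at a fixed IoU threshold. It is a generic illustration of these metrics, not the exact evaluation code used by the TensorFlow API.

```python
# Hedged sketch: precision, recall and average precision for one object class,
# from score-ranked predicted boxes and ground-truth boxes at a fixed IoU threshold.
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (xmin, ymin, xmax, ymax)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0


def average_precision(preds: List[Tuple[float, Box]], gts: List[Box], iou_thr: float = 0.5) -> float:
    """AP for one class: greedy matching of score-sorted predictions to ground truth."""
    preds = sorted(preds, key=lambda p: p[0], reverse=True)
    matched = [False] * len(gts)
    tp, fp, ap, prev_recall = 0, 0, 0.0, 0.0
    for _score, box in preds:
        best, best_j = 0.0, -1
        for j, gt in enumerate(gts):
            overlap = iou(box, gt)
            if not matched[j] and overlap > best:
                best, best_j = overlap, j
        if best >= iou_thr:
            matched[best_j] = True
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / len(gts) if gts else 0.0
        ap += precision * (recall - prev_recall)  # step-wise area under the P-R curve
        prev_recall = recall
    return ap


if __name__ == "__main__":
    gts = [(10, 10, 50, 50)]
    preds = [(0.9, (12, 11, 49, 52)), (0.4, (100, 100, 140, 140))]
    print("AP:", average_precision(preds, gts))  # prints 1.0 for this toy case
```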
Figure 7. Real-time detection of objects identifying visually impaired persons

V. CONCLUSION AND FUTURE WORK

The proposed work is designed and evaluated to help visually impaired persons and to make modern buildings visually impaired friendly, so that they are able to navigate through a building without the help of special attendants. The proposed system was designed and evaluated for performance, and it is able to detect visually impaired persons with good accuracy. In future, instead of braille-based navigation guide maps, electronic gadgets with a voice alert system may be used in conjunction with the proposed system.

REFERENCES

[1] G. Wang, L. Li, J. Fan, S. Shi, Y. Xu and Y. Wang, "Active Guide System for The Blind Based on The Internet of Things and Collaborative Perception," 2022 11th International Conference of Information and Communication Technology (ICTech), 2022, pp. 22-27, doi: 10.1109/ICTech55460.2022.00012.
[2] S. Zafar et al., "Assistive Devices Analysis for Visually Impaired Persons: A Review on Taxonomy," in IEEE Access, vol. 10, pp. 13354-13366, 2022, doi: 10.1109/ACCESS.2022.3146728.
[3] V. Bharati, "LiDAR + Camera Sensor Data Fusion On Mobiles With AI-based Virtual Sensors to Provide Situational Awareness for the Visually Impaired," 2021 IEEE Sensors Applications Symposium (SAS), 2021, pp. 1-6, doi: 10.1109/SAS51076.2021.9530102.
[4] U. Das, V. Namboodiri and H. He, "PathLookup: A Deep Learning-Based Framework to Assist Visually Impaired in Outdoor Wayfinding," 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 2021, pp. 111-116, doi: 10.1109/PerComWorkshops51409.2021.9431007.
[5] G. Dimas, E. Cholopoulou and D. K. Iakovidis, "Self-Supervised Soft Obstacle Detection for Safe Navigation of Visually Impaired People," 2021 IEEE International Conference on Imaging Systems and Techniques (IST), 2021, pp. 1-6, doi: 10.1109/IST50367.2021.9651326.
[6] P. Chitra, V. Balamurugan, M. Sumathi, N. Mathan, K. Srilatha and R. Narmadha, "Voice Navigation Based Guiding Device for Visually Impaired People," 2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS), 2021, pp. 911-915, doi: 10.1109/ICAIS50930.2021.9395981.
[7] D. M. L. V. Dissanayake, R. G. M. D. R. P. Rajapaksha, U. P. Prabhashawara, S. A. D. S. P. Solanga and J. A. D. C. A. Jayakody, "Navigate-Me: Secure Voice Authenticated Indoor Navigation System for Blind Individuals," 2021 21st International Conference on Advances in ICT for Emerging Regions (ICter), 2021, pp. 219-224, doi: 10.1109/ICter53630.2021.9774790.
[8] Q. Wang, K. Zhang, K. Zhao and M. Liao, "Smart Seeing Eye Dog Wheeled Assistive Robotics," 2021 3rd International Symposium on Robotics & Intelligent Manufacturing Technology (ISRIMT), 2021, pp. 104-108, doi: 10.1109/ISRIMT53730.2021.9596792.
[9] N. Kumar and A. Jain, "Smart Navigation Detection using Deep-learning for Visually Impaired Person," 2021 IEEE 2nd International Conference On Electrical Power and Energy Systems (ICEPES), 2021, pp. 1-5, doi: 10.1109/ICEPES52894.2021.9699479.
[10] C. H. Chen and M.-F. Shiu, "RNN-based Dialogue Navigation System for Visually Impaired," 2020 International Conference on Pervasive Artificial Intelligence (ICPAI), 2020, pp. 140-143, doi: 10.1109/ICPAI51961.2020.00033.
[11] N. E. Shandu, P. A. Owolawi, T. Mapayi and K. Odeyemi, "AI Based Pilot System for Visually Impaired People," 2020 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD), 2020, pp. 1-7, doi: 10.1109/icABCD49160.2020.9183857.
[12] O. Gamal, S. Thakkar and H. Roth, "Towards Intelligent Assistive System for Visually Impaired People: Outdoor Navigation System," 2020 24th International Conference on System Theory, Control and Computing (ICSTCC), 2020, pp. 390-397, doi: 10.1109/ICSTCC50638.2020.9259682.
[13] P. S. Rajendran, P. Krishnan and D. J. Aravindhar, "Design and Implementation of Voice Assisted Smart Glasses for Visually Impaired People Using Google Vision API," 2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA), 2020, pp. 1221-1224, doi: 10.1109/ICECA49313.2020.9297553.
[14] M. A. Khan Shishir, S. Rashid Fahim, F. M. Habib and T. Farah, "Eye Assistant: Using Mobile Application to Help the Visually Impaired," 2019 1st International Conference on Advances in Science, Engineering and Robotics Technology (ICASERT), 2019, pp. 1-4, doi: 10.1109/ICASERT.2019.8934448.
[15] P. T. Mahida, S. Shahrestani and H. Cheung, "Indoor Positioning Framework for Visually Impaired People Using Internet of Things," 2019 13th International Conference on Sensing Technology (ICST), 2019, pp. 1-6, doi: 10.1109/ICST46873.2019.9047704.
[16] S. M. Felix, S. Kumar and A. Veeramuthu, "A Smart Personal AI Assistant for Visually Impaired People," 2018 2nd International Conference on Trends in Electronics and Informatics (ICOEI), 2018, pp. 1245-1250, doi: 10.1109/ICOEI.2018.8553750.
[17] K. Yang, R. Cheng, L. M. Bergasa, E. Romera, K. Wang and N. Long, "Intersection Perception Through Real-Time Semantic Segmentation to Assist Navigation of Visually Impaired Pedestrians," 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2018, pp. 1034-1039, doi: 10.1109/ROBIO.2018.8665211.