
2023 International Conference on Computer Communication and Informatics (ICCCI-2023), Jan. 23–25, 2023, Coimbatore, India

Smart Autonomous Voice Control Obstacle Avoiding Robot
Ashok Reddy Kandula, Assistant Professor, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]
Kalyanapu Srinivas, Professor, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]
Sri Rama Krishna Datta Movva, UG Student, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]
Poojitha Pagolu, UG Student, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]
Sindhu Pasupuleti, UG Student, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]
Sai Charan Nancharla, UG Student, Dept. of AI&DS, Seshadri Rao Gudlavalleru Engineering College, Gudlavalleru, India, [email protected]

Abstract—The study aims to build voice-controlled robots with autonomous braking, obstacle avoidance, and speed reduction capabilities. This research will enable the development of robots that can accurately and autonomously assess and respond to environmental conditions, such as obstacles in their path. The primary objective of the robot is to use voice commands to steer the robotic vehicle and analyze speech input to carry out the necessary actions to avoid obstacles. The research is expected to significantly improve the safety of autonomous vehicles by allowing them to detect and respond in real time to changes in their environment. The robot is voice-activated and connected to an Android smartphone through Bluetooth. The robot will also be able to recognize obstacles with the help of an ultrasonic sensor. The physically disabled and the primarily blind will benefit from further developments in this technology, among many other groups of people. This technology can fundamentally alter how people view mobility and provide many opportunities to those who would otherwise need to rely on other modes of transportation.

Keywords- Obstacle Avoiding, Voice Control, Speech Recognition, Autonomous System, Ultrasonic Sensor, Bluetooth Control System.

I. INTRODUCTION

Modern automation technology and the field of robotics have advanced to the point where every firm has begun to automate its processes with machines and robots. Automation systems, commonly known as robots, are built by combining multiple disciplines, such as electrical, mechanical, and electronic engineering [1]. The expansion of these sectors is a significant factor in the efficiency of every industry, promising us a safer environment in difficult and perceptive situations by making human jobs more straightforward, wiser, and precise. Automation has a substantial presence in most major industries due to its high accuracy, including education, medicine, agriculture, software, and many more.

A. Obstacle Avoidance

Obstacle avoidance is a scenario in which the intended system identifies the obstacles or barriers around it and strives to avoid them using the knowledge it has acquired. In the designed robotic system, we employ ultrasonic sensors to identify obstructions. The ultrasonic sensor measures distance and is directly linked to the L293D motor driver shield, which is subsequently linked to the Arduino UNO. Ultrasonic sensors detect objects at a specific distance around them using sonar. The ultrasonic sensor comprises a transmitter and a receiver: the transmitter emits a sonar wave toward the object, the receiver picks up the echo reflected from it, and the total time between emission and reception is measured.

B. Speech Recognition

A machine's ability to recognize and interpret a human voice, or to understand and carry out spoken commands, is referred to as speech recognition [2]. Everyone wants to use speech recognition, which is becoming increasingly vital. Many tasks are performed by speech recognition systems such as Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google. Speech recognition systems are built on algorithms.

A small obstacle-avoiding voice-control robot system was introduced as part of this automation. It was introduced primarily because of the drawbacks of a few existing systems and to create a highly efficient autonomous system with lower cost and greater accuracy for physically disabled people. This system was designed with a combination of sensors, including an ultrasonic sensor and infrared sensors, to measure distances and detect obstacles from different directions. The system also incorporated an intelligent voice recognition system to enable the robot to recognize a range of spoken commands, allowing the user to navigate it in an automated manner. This voice recognition system allowed for greater accuracy and efficiency than any of the existing manual systems, providing users with greater control over their mobility and allowing greater autonomy for people with physical disabilities.
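To make the sensing principle of Section I-A concrete, the short Arduino sketch below shows one common way to read a trigger/echo ultrasonic sensor. It is only an illustrative sketch, and the pin numbers are assumptions rather than the wiring used by the authors.

// Illustrative only: reading a trigger/echo ultrasonic sensor on assumed pins.
const int trigPin = 9;   // assumed trigger pin
const int echoPin = 10;  // assumed echo pin

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

long readDistanceCm() {
  digitalWrite(trigPin, LOW);               // settle the trigger line
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);              // a short pulse starts the sonar burst
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);   // echo round-trip time in microseconds
  return duration * 0.034 / 2;              // sound travels about 0.034 cm per microsecond
}

void loop() {
  Serial.println(readDistanceCm());         // print the measured distance in cm
  delay(100);
}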



This technique is not only beneficial to the physically handicapped and disabled, but it can also be employed in a variety of industries such as the toy industry, health care, the armed forces, education, domestic chores, agriculture, aquaculture, and many more.

As autonomy is the trending technology that every firm or industry wants to adopt, this type of system works well because it offers many useful capabilities that can be employed in multiple disciplines rather than just one. This system primarily improves the performance of self-driving automobiles and allows for additional research into improving the technology by creating several breakthroughs in the future.

II. LITERATURE REVIEW

Speech-to-text technology is used to help disabled people who cannot drive themselves. They will eventually be able to drive their car safely and avoid unexpected hit-and-runs due to automatic braking or a slowing feature [3]. This study's primary focus is speech recognition technology, which converts speech into text messages. Previously, controlling hardware with speech was impractical. This investigation will assist us in realizing this innovation for the disabled, who cannot drive a vehicle independently. It also makes using speech to control mobile devices more feasible [4]. A Bluetooth module establishes a communication connection between the vehicle and human voice commands.

Bluetooth's RF transmitter can accept human voice commands converted into encoded digital data within an acceptable range of the car. The receiver decodes the data before passing it to the microcontroller, which uses it to power the DC motors via the L298D motor driver. An Arduino Uno has been programmed to read voice commands and respond appropriately, and ultrasonic sensors linked to the Arduino can aid in obstacle detection. An obstacle-avoiding robot was also designed and developed in which a few mechanical components add two new functions to the main body: a laptop holder and a camera holder [5]. The relatively expensive cameras are fixed and adjusted on the camera holder for proper computer vision calibration. Users establish serial communication between the upper laptop and the lower development board via the USB port, and the laptop indicates the motor's condition to the development board [6]. The vehicle was successfully navigated using simple algorithms to steer and reduce the turning radius. Finally, the group successfully interfaced all the initially planned components. A timer prevents the generation of IR pulses.

A robotic car using Bluetooth and sensor technology was created and developed; it can be remotely controlled using GPS and steered without needing a driver [7]. The ultrasonic ranging sensor avoided collisions with obstacles in the opposite direction, and images captured by the camera were saved in the database and analyzed.

In [8], an autonomous robotic car was created with an Arduino Uno R3 as the brain; a Bluetooth module and an ultrasonic sensor were also used. By scanning QR codes, the robot could move along the road autonomously, and a text-to-speech feature allowed voice communication with the Android device.

A military robot was created in [9]; it was instrumental in detecting explosives, and the surroundings were visible thanks to the camera on the Android device. The robot system in [10] comprises an Android device, a Bluetooth module, a microcontroller (the Arduino Uno), DC motors, a motor driver, a wireless camera, and a metal detector. The use of multiple ultrasonic sensors was proposed to improve obstacle detection precision [11], which improved obstacle avoidance efficiency.

III. PROPOSED SYSTEM

In this work, an autonomous and intelligent system called an "obstacle-avoidance voice control robot" is built. It is capable of detecting obstructions and avoiding them on its own. The system is developed using the Arduino UNO, an ultrasonic sensor, and the HC-05 Bluetooth control module.

Various system sensors are coupled to the Arduino Uno, which primarily serves as the system's brain. Here, an ultrasonic sensor detects obstructions in front of the rover, while an IR sensor detects obstacles behind it. The HC-05 Bluetooth control module, which is connected to the Android application, controls the entire system [12].

The rover operates by transmitting vocal commands from the Android application to the Bluetooth module, which then transforms them into digital signals and transmits them to the Arduino UNO. The system responds to the signal that the Arduino UNO receives, and at the same time, the ultrasonic sensor begins detecting obstructions. The ultrasonic sensor contains a transmitter and receiver, sending sonar waves and receiving their echoes. Using these, it can detect the presence of an obstruction and prevent the rover from colliding with it. This system is designed to support the lives of people who are disabled and of those who are eager to move about but are unable to do so on their own.

Fig. 1. Block Diagram of The Complete System


IV. METHODOLOGY
The methodology mainly consists of four parts:
1. Arduino IDE (Integrated Development Environment)
2. Android Application
3. Arduino UNO
4. HC-05 Bluetooth Module
A. Arduino IDE (Integrated Development Environment)

Similar to a text editor, the Arduino software's integrated development environment (IDE) refers to each file as a "sketch" and saves it with the ".ino" extension. The Arduino IDE includes many examples to test the crucial hardware parts. It allows users to select the kind of Arduino board on which the application should operate. The upper left corner of the window has a check mark for verifying (compiling) the sketch and an arrow for uploading it. The Arduino software includes a built-in console that automatically displays errors as they occur.

Code can be written for the open-source Arduino platform on any operating system, including macOS, Linux, and Windows, and then uploaded to the board. The Arduino IDE, a Java program, supports a C-like syntax.

Fig. 2. Arduino UNO IDE

The code is originally written in the Arduino IDE and uploaded to the Arduino Uno to enable the system to run as specified by the code. The code for the ultrasonic sensor is written here, with a base detection range of 30 cm. Code is also written for the L298D motor driver shield to drive the DC motors with a maximum speed setting of 255 (the full PWM value). Finally, the Servo library is used to rotate the servomotor to angles of 90 and 180 degrees.
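As a rough illustration of the behaviour this paragraph describes, the sketch below drives one motor channel at the full speed value of 255 and moves a servo between 90 and 180 degrees. The pin assignments and the direct driver-pin control are assumptions made for illustration, not the authors' exact code.

#include <Servo.h>

// Assumed wiring for one channel of an L298/L293-type motor driver.
const int enablePin = 5;   // PWM speed input (assumed)
const int in1Pin = 6;      // direction input 1 (assumed)
const int in2Pin = 7;      // direction input 2 (assumed)
const int servoPin = 3;    // servo signal pin (assumed)

Servo steerServo;

void setup() {
  pinMode(enablePin, OUTPUT);
  pinMode(in1Pin, OUTPUT);
  pinMode(in2Pin, OUTPUT);
  steerServo.attach(servoPin);
}

void driveForward(int speedValue) {
  digitalWrite(in1Pin, HIGH);            // set the forward direction
  digitalWrite(in2Pin, LOW);
  analogWrite(enablePin, speedValue);    // 255 corresponds to the maximum speed setting
}

void stopMotor() {
  analogWrite(enablePin, 0);             // cut the PWM signal to stop the motor
}

void loop() {
  driveForward(255);                     // run at the maximum speed value used in the paper
  steerServo.write(90);                  // servo at 90 degrees
  delay(1000);
  steerServo.write(180);                 // servo at 180 degrees
  delay(1000);
  stopMotor();
  delay(1000);
}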
B. Android Application

The Android application communicates with the HC-05 Bluetooth module and conveys voice commands. The application first requests that the HC-05 Bluetooth module be connected before opening a tab for receiving input in the form of voice commands. Here, we primarily use a select number of commands, such as move forward, move backward, turn left, and turn right.

Fig. 3. Bluetooth Controller Android Application

C. Arduino UNO

The Arduino UNO serves as the system's brain. The Arduino UNO provides the following power pins: GND (ground), VIN, and 5 V. Its digital pins D0 through D13 accept digital signals as inputs and produce binary outputs, with pins 0 and 1 serving as the serial receiver and transmitter, respectively. The analog pins A0 to A5 receive analog signals as inputs.

Fig. 4. Arduino UNO

The Arduino Uno receives instructions from the Bluetooth module and the ultrasonic sensor. It then verifies that the signals operate per the code programmed into the Arduino Uno.

D. HC-05 Bluetooth Module

The Bluetooth module, also known as the HC-05 module, operates on a 5-volt supply. It functions as the communication link between the communicator and the system. The transmitter, receiver, voltage (VCC), and ground (GND) pins are the four connecting pins that make up most of the HC-05 module. These are directly linked to the Arduino, and the receiver and transmitter pins are wired in the opposite order: pin 0, the Arduino's RX, is attached to the HC-05's transmitter, and pin 1, the Arduino's TX, is connected to the HC-05's receiver.

Fig. 5. HC-05 Bluetooth Module

In this instance, the Bluetooth module is connected to the Arduino Uno using the VCC, GND, TX, and RX pins. At the same time, it is connected to an Android application, from which it receives voice commands and transmits them to the Arduino Uno over the TX and RX pins to make the system operate as instructed.


Fig. 6. Flow Chart Of The System Execution

The flowchart above illustrates how the entire system functions. Initially, the HC-05 Bluetooth module was connected to the Android application, and voice commands were sent. While voice commands were being sent, the ultrasonic sensor connected to the Arduino UNO transmitted sonar waves and received their echo. As soon as these sonar waves hit an obstruction, they were reflected back as an echo, and this echo was converted into a distance, which the system then used to perform its task. If the distance exceeded 30 cm, the system carried out all the remaining jobs. If the distance was less than 30 cm, the system came to a halt and looked for an alternate route to travel, namely the obstruction-free path.
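In code, the decision in the flowchart amounts to a simple guard around command execution, roughly as sketched below. Here readDistanceCm(), executeLastCommand(), and searchClearPath() are stand-ins for the routines described elsewhere in this paper; the names are assumptions, not the authors' own.

const int THRESHOLD_CM = 30;            // base detection range used in the paper

long readDistanceCm()      { return 100; /* placeholder for the ultrasonic read */ }
void executeLastCommand()  { /* carry out the most recent voice command */ }
void haltRover()           { /* stop the motors */ }
void searchClearPath()     { /* look for an obstruction-free direction */ }

void setup() {}

void loop() {
  long distance = readDistanceCm();
  if (distance >= THRESHOLD_CM) {
    executeLastCommand();               // path is clear: keep following the command
  } else {
    haltRover();                        // obstacle closer than 30 cm: come to a halt
    searchClearPath();                  // then look for an alternate, obstruction-free path
  }
}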

TABLE I. WORKING WITH DIFFERENT COMMANDS

Command (sent)    Output
Move Forward      Rover moves forward
Move Backward     Rover moves backward
Turn Left         Rover turns left
Turn Right        Rover turns right
Stop              Rover stops

E. Simulation

The project uses the free and open-source Tinkercad software to test and simulate the system by incorporating the code. Both 2D and 3D models of circuits can be designed using Tinkercad. The Arduino and the software test the individual modules and the complete system.

Fig. 7 below illustrates how an ultrasonic sensor installed on an Arduino Uno and an L298D motor driver is used to identify impediments. The ultrasonic sensor's four separate connections are the echo, trig, VCC, and ground pins. The ultrasonic sensor's echo and trig are connected to the Arduino Uno's digital pin ports, its VCC is connected to the board's 5-volt output, and its ground is connected to the Arduino Uno's ground. When the system is turned on, this makes the DC motors operate.

The motor works as planned since the ultrasonic sensor, through which the input travels, is used to send and receive sonar and echo data. For locating and avoiding barriers, a base distance of 30 cm is supplied in this instance. The system moves in response to the order supplied if the distance is equal to or greater than 30 cm. The system halts and notifies the user of the obstruction if the distance is less than 30 cm.

V. RESULTS AND DISCUSSIONS

During the testing procedure, the robot would perform real-time recognition and carry out the relevant action. The recognition result is accurate if the associated movement occurs after the voice instruction. The two testing environments are the outdoors and the noisy environment. The speech recognition system works flawlessly in a normal environment. It shows how dependable the technology is that the recognition accuracy reaches 100%.

TABLE II. DATA IN A NORMAL ENVIRONMENT

Command          No. of trials    No. of successes
Move Forward     8                8
Move Backward    8                8
Turn Left        8                8
Turn Right       8                8
Stop             8                8


Fig. 7. Simulation Circuit Diagram of the System

TABLE III. DATA IN A NOISY ENVIRONMENT

Command          No. of trials    No. of successes
Move Forward     8                7
Move Backward    8                7
Turn Left        8                6
Turn Right       8                7
Stop             8                8

The results of the experiment in a typical setting are shown in Table II. The voice recognition system's accuracy drops substantially in a noisy environment; the results of the experiment in the noisy environment are shown in Table III. The investigation's findings demonstrated that noise is the primary factor affecting recognition performance, and it continues to be among the most difficult problems to address.

TABLE IV. POSSIBLE CONDITIONS EXECUTED

Condition            Obstacle Presence    Commands Possible to Execute
Distance < 30 cm     Yes                  Go Back, Turn Right, Turn Left, Stop
Distance >= 30 cm    Yes / No             Go Ahead, Go Back, Turn Right, Turn Left, Stop

Table IV above depicts the conditions and the possible commands that can be executed when an obstacle is present and when it is not; these were mainly classified into two categories:
1. When the distance is less than 30 cm.
2. When the distance is greater than or equal to 30 cm.

In the first case, a distance of less than 30 cm: when the command Move Forward is given, the system finds the obstacle in front and moves back. While moving back, if it finds another obstacle, it checks its left and right and moves accordingly.

In the second case, a distance greater than or equal to 30 cm, there are two sub-cases: either an obstacle is present or it is not. When the system detects an obstacle, it acts according to the first scenario, whereas in the no-obstacle condition, it continues to execute the command as given.
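The first case can be pictured as a short avoidance routine: back away, point the sensor to each side with the servo, and steer toward the clearer side. The sketch below is only an illustration of that behaviour; the angles, delays, and helper routines are assumptions, not the authors' code.

#include <Servo.h>

Servo scanServo;                        // rotates the ultrasonic sensor (assumed on pin 3)

long readDistanceCm() { return 100; }   // placeholder for the ultrasonic measurement
void moveBackward()   { /* reverse both motors */ }
void turnLeft()       { /* pivot the rover to the left */ }
void turnRight()      { /* pivot the rover to the right */ }
void stopRover()      { /* stop both motors */ }

void setup() {
  scanServo.attach(3);
  scanServo.write(90);                  // sensor facing forward
}

void avoidObstacleAhead() {
  stopRover();
  moveBackward();                       // first back away from the obstacle
  delay(500);
  stopRover();
  scanServo.write(180);                 // look to the left
  delay(300);
  long leftDist = readDistanceCm();
  scanServo.write(0);                   // look to the right
  delay(300);
  long rightDist = readDistanceCm();
  scanServo.write(90);                  // face forward again
  if (leftDist > rightDist) turnLeft(); // steer toward the side with more clearance
  else                      turnRight();
}

void loop() {
  if (readDistanceCm() < 30) avoidObstacleAhead();
}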
A. Benefits

Obstacle avoidance is a crucial requirement of any autonomous robot. Robots can traverse unfamiliar territory by avoiding collisions with the help of obstacle-avoidance technology. A running robot identifies impediments in its route, avoids them, and then keeps running. The project's voice-controlled obstacle-avoiding robot can travel farther than wireless-antenna robots because we used a Bluetooth module, which is more effective than a wireless antenna and delivers our speech input data from the Android app to the robot's brain (the Arduino UNO).

I. Increased Safety


Fully automated vehicles can offer a high level of safety due to their adaptability, architecture, and error-free object detection. Automated vehicles can sense their surroundings and act quickly. Self-driving vehicles are programmed to strictly follow traffic regulations, making for a safer and more effective driving experience. Avoiding collisions with large objects secures the area and saves lives. Additionally, the incorporation of artificial intelligence (AI) in autonomous vehicles enables them to pick up on and anticipate the movements of other vehicles and roadside obstacles.

II. Increased Efficiency

Due to their extensive training with various algorithms, fully automated vehicles can perform tasks with a high level of efficiency. The likelihood of accidents caused by human error is decreased because these autonomous vehicles can use the information from their sensors to build an image of the environment around them and make decisions with a high degree of accuracy. They can use sophisticated onboard systems to help lower the risk of human error while accurately and quickly detecting objects. Furthermore, the AI in autonomous vehicles can quickly recognize and process environmental changes and react to unexpected events more efficiently than human drivers. This allows autonomous vehicles to navigate various conditions, including unpredictable ones, accurately and safely, since they use various sensors to increase accuracy, including high-definition cameras, radars, GPS modules for determining location, and numerous other safety measurement modules.

III. Reduced Cost

Ride-sharing services will also deploy autonomous vehicles because they will be less expensive than hiring drivers. Self-driving cars minimize transportation expenses since computers do not require vacation days, paid time off, overtime, social security, or sick leave. Autonomous vehicles will be far more autonomous than conventional vehicles. Furthermore, the cost reductions connected with driverless technology are substantial thanks to the many automatic monitoring systems and analytical functions.

VI. CONCLUSION

It will be an autonomous robot with the ability to avoid any barriers in its way. It will make use of a servo motor and an ultrasonic distance sensor. The robot will determine the next course of action after determining how far away the nearest obstacle is in each direction. The distance sensor's direction is controlled by a servo, which rotates the sensor in different directions if the robot encounters an obstacle. Once the robot is satisfied that a particular direction is free of obstacles, it will turn in that direction and move straight along it until it encounters the next obstacle. This process of decision-making and navigation allows the robot to determine which direction is the most viable path forward; if none is available, it can execute a full 180° turn and determine whether a different path is available.

VII. IMPLICATIONS FOR FUTURE ROBOTICS

Despite all the advantages of reducing traffic, the visual perception issue continues to be the biggest obstacle to self-driving technology. While you are driving, the human brain uses billions of neurons to process information at a supercomputer level while using less energy than a light bulb. Specifically, the robots need a solution that can operate at 75 tera-operations-per-second (TOPS) per watt to be practical.

Current automotive solutions rely on general-purpose technology, like the GPU. As a result, they have limited processing power and can only enable some degree of autonomy. A novel, purpose-built platform must be incorporated into cars to enable self-driving and reduce traffic. In contrast to GPUs, which use a tremendous amount of energy, this solution must be optimized for computing and power, given the recent trend toward electric vehicles.

REFERENCES

[1] P. Narendra Ilaya Pallavan, S. Harish, and C. Dhachinamoorthi, "Voice Controlled Robot with Real Time Barrier Detection and Advertising", International Research Journal of Engineering and Technology (IRJET), vol. 6, issue 1, January 2019.
[2] L. Zhizeng and Z. Jingbing, "Speech recognition and its application in a voice-based robot control system", 2004 International Conference on Intelligent Mechatronics and Automation Proceedings, 2004.
[3] V. Lumelsky and T. Skewis, "Incorporating Range Sensing in the Robot Navigation Function", IEEE Transactions on Systems, Man, and Cybernetics, vol. 20, pp. 1058–1068, 1990.
[4] T. Ishida, Y. Kuroki, J. Yamaguchi, M. Fujita, and T. T. Doi, "Motion Entertainment by a Small Humanoid Robot Based on OPEN-R", Int. Conf. on Intelligent Robots and Systems (IROS-01), pp. 1079–1086, 2001.
[5] Katsushi Sakai and Tsutomu Asada, "Developing a Service Robot with Communication Abilities", International Workshop on Robot and Human Interactive Communication, pp. 91–96, 2005.
[6] Rahmadi Kurnia, Md. Altab Hossain, Akio Nakamura, and Yoshinori Kuno, "Object Recognition through Human-Robot Interaction by Speech", International Workshop on Robot and Human Interactive Communication, pp. 619–624, 2004.
[7] Bojan Kulji, Simon János, and Szakáll Tibor, "Mobile robot controlled by voice", International Symposium on Intelligent Systems and Informatics, pp. 189–192, 2007.
[8] G. Bauzil, M. Briot, and P. Ribes, "A navigation sub-system using ultrasonic sensors for the mobile robot HILAIRE", Proc. 1st Int. Conf. on Robot Vision and Sensory Controls, pp. 47–698, Apr. 1981.
[9] J. A. Boyle, "Robotic steering", Internal Project of the Trent Polytechnic, Nottingham, England, 1982.
[10] R. A. Brooks, "Solving the find-path problem by good representation of free space", Proc. Nat. Conf. Artificial Intelligence AAAI-82, pp. 381–386, Aug. 1982.
[11] G. Hoffstatter, "Using the Polaroid ultrasonic ranging system", Robotics Age, pp. 35–37, Sept. 1984.
[12] M. Fujita and K. Kageyama, "An Open Architecture for Robot Entertainment", Proc. International Conference on Autonomous Agents, pp. 435–440, 1997.
[13] S. N., R. N. S., N. D. P., and V. V., "Congestion Control early warning system using Deep Learning", International Journal of Computer Communication and Informatics, vol. 3, no. 2, pp. 35–50, 2021.

