Low-cost multi-sensing fire-fighting robot with obstacle avoidance mechanism
Corresponding Author:
Adekunle Taofeek Oyelami
Department of Mechatronics Engineering, Federal University of Agriculture
Abeokuta, Nigeria
Email: [email protected]
1. INTRODUCTION
Fire is a chemical reaction that produces heat and light as it consumes a fuel source in the presence
of oxygen. A firefighter is, simply put, a person who puts out fires, using tools capable of extinguishing and containing them, thereby preventing the loss of lives and property [1]. Firefighting is a physically demanding, critical, and hazardous task that often puts firefighters' lives at risk [2]–[4]. Advances in technology have made firefighting easier, bridging the gap between firefighters and machines and thereby creating a more efficient and effective method of firefighting [5]–[7].
A robot is an automated, intelligent mechanical agent, i.e., a machine designed to behave like a human or another entity and to carry out complex tasks through programmed physical movement [8]–[11].
Oyelami et al. [12] discussed the significant rise in robot usage across several fields. Robots can be divided into various groups, some based on their mode of operation. These include android robots, which are designed to act like humans and mimic their actions, and autonomous robots, which are capable of
acting on their own or independently [13]–[15]. Mobile robots, unlike fixed robots, have a movable base and
can navigate using instructions from human beings. Tele-robots and telepresence robots are quite similar in operation; the technical difference between them is that the latter gives feedback in digital formats such as video, sound clips, and other media data.
The main challenges for the development of autonomous walking robots as summarized by Nonami
et al. [16] are i) the need for energy-efficient actuators with optimal weight-to-torque and volume-to-torque
ratios; ii) the availability of reliable and economical sensors; iii) the use of lightweight but mechanically
strong materials for construction; iv) the requirement for small but high computing power in onboard
computers; and v) the necessity of lightweight power sources for extended operational duration.
Autonomous firefighting robots are not entirely new; earlier research has shown progress using different technologies, though with notable constraints. The firefighting robot developed by Aliff et al. [8] uses an Arduino Uno as its microcontroller and is equipped with a webcam for visual feedback, an ultrasonic sensor for obstacle avoidance, a flame sensor, a water pump, a direct-current motor, and a transmitter with remote control for operating the robot remotely. The robot has a limited flame-sensing range of 40 cm and is monitored manually through a camera connected to a smartphone or other remote device, from which it is steered to the site of the flame. Archana and Suma [5] incorporated an LM35 temperature sensor, the major limitation of that design being its sensing range. The intelligent wireless fire-extinguishing robot by Islam et al. [9], however, took advantage of the Internet of Things to make an internet-controlled robot: an Arduino Uno controls the robot, while an Arduino Yun with built-in Ethernet and Wi-Fi handles external communication and video feedback through the webcam. There was also an instance of utilizing a PID controller, as opposed to plain Arduino-based control, to achieve more precise control of the robot's movements and responses, as well as more accurate fire-extinguishing capabilities [14].
It can be noted that some of the designs reviewed above feature a limited flame-sensing range, while others rely on remote control or internet connectivity, which may not be practical in all firefighting scenarios. This
project addresses these limitations by developing an autonomous firefighting robot with an extended sensing
range and enhanced autonomous capabilities. The proposed solution includes a static sensing unit equipped
with multiple flame sensors and a transmitter to relay fire location information to the robot. The robot itself is
designed with a proportional motor control system for bi-directional movement, an ultrasonic sensor for
obstacle avoidance, and three infrared flame sensors for comprehensive fire detection. These components
interact with a microcontroller that operates a water pump to extinguish detected fires [17], aiming to provide
a more effective and autonomous firefighting solution with an extended sensing range.
2. METHOD
The construction of this project utilized several key components, including the Arduino Mega and
Arduino Uno microcontrollers, flame sensors, servo motors, L298N motor driver, water pump, water tank,
ultrasonic sensors, lithium-ion batteries, 433 MHz transmitter and receiver, and the QMC5883L
magnetometer. These components were meticulously selected based on their reliability, cost-effectiveness,
and compatibility with the design specifications of the autonomous firefighting robot. These components
were further divided into two separate units: static and mobile (robot).
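The behavior of the static unit can be illustrated with a minimal Arduino-style sketch. The listing below is only an illustrative sketch, assuming the RadioHead ASK driver for the 433 MHz link; the pin assignments and the one-character message format are assumptions rather than the exact firmware used in the prototype.

// Static sensing unit (Arduino Uno): poll the four corner flame sensors and
// broadcast the index of any detected fire over the 433 MHz ASK link.
// Assumes the RadioHead library (RH_ASK); pins and message format are illustrative.
#include <RH_ASK.h>
#include <SPI.h>                         // required by RadioHead to compile

RH_ASK rf;                               // defaults: 2000 bps, TX on pin 12
const int flamePins[4] = {2, 3, 4, 5};   // digital outputs of the four sensors

void setup() {
  rf.init();
  for (int i = 0; i < 4; i++) pinMode(flamePins[i], INPUT);
}

void loop() {
  for (int i = 0; i < 4; i++) {
    // Typical IR flame modules pull their digital output LOW on detection.
    if (digitalRead(flamePins[i]) == LOW) {
      char msg[2] = { char('1' + i), '\0' };   // "1".."4" = corner location
      rf.send((uint8_t *)msg, 1);
      rf.waitPacketSent();
    }
  }
  delay(200);
}

On the robot side, a matching receiver would decode the corner index and pass it to the navigation routine.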
The motion of the robot is controlled by the Arduino Mega through the L298N motor driver module, based on the information received from the flame sensors, the ultrasonic sensor, and the radio-frequency receiver. Each flame sensor has a sensing range of about 1 m and a viewing angle of 60°, thereby making it possible to combine three flame sensors to give a total coverage angle of 180° [18], [19]. In this way, the sensors not only widen the overall sensing coverage but also give the robot a sense of direction: left, forward, and right.
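The resulting direction logic can be summarized by the following illustrative fragment; the pin numbers and the active-LOW convention are assumptions that may differ from the actual wiring.

// Robot unit: resolve fire direction from the three on-board flame sensors,
// which together cover roughly 180 degrees in front of the robot.
const int FLAME_LEFT = 30, FLAME_FRONT = 31, FLAME_RIGHT = 32;   // assumed Mega pins

enum FireDirection { FIRE_NONE, FIRE_LEFT, FIRE_FRONT, FIRE_RIGHT };

void setup() {
  pinMode(FLAME_LEFT, INPUT);
  pinMode(FLAME_FRONT, INPUT);
  pinMode(FLAME_RIGHT, INPUT);
}

FireDirection readFireDirection() {
  if (digitalRead(FLAME_FRONT) == LOW) return FIRE_FRONT;   // prefer a head-on approach
  if (digitalRead(FLAME_LEFT) == LOW)  return FIRE_LEFT;
  if (digitalRead(FLAME_RIGHT) == LOW) return FIRE_RIGHT;
  return FIRE_NONE;
}

void loop() {
  FireDirection dir = readFireDirection();
  // dir steers the drive motors through the L298N: turn left, go straight,
  // or turn right until the front sensor faces the flame.
  (void)dir;
}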
The ultrasonic sensor is positioned on the chassis at the front of the robot. Its main function is to sense the presence of obstacles in the robot's path [20]–[23]. Obstacles may include a wall or a solid object that could throw the robot off balance. When the static unit senses a fire, the robot proceeds towards it in the direction of the triggered sensor, using its ultrasonic sensor to avoid obstacles along the way. It then extinguishes the fire by powering the water pump and the servo motor that sweeps the nozzle, spraying water across the flame.
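A minimal sketch of the obstacle check and the sweep-spray routine is given below, assuming an HC-SR04-style ultrasonic sensor, a relay-switched pump, and illustrative pins and thresholds rather than the exact values used in the prototype.

#include <Servo.h>

const int TRIG_PIN = 22, ECHO_PIN = 23;   // ultrasonic sensor pins (assumed)
const int PUMP_RELAY = 24;                // water pump switched via a relay (assumed)
Servo nozzle;                             // servo that sweeps the water nozzle

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(PUMP_RELAY, OUTPUT);
  nozzle.attach(9);                       // nozzle servo signal pin (assumed)
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);   // time out after ~30 ms
  return echoUs / 58;                               // microseconds to centimetres
}

bool obstacleAhead() {
  long d = readDistanceCm();
  return d > 0 && d < 20;                 // treat anything closer than 20 cm as an obstacle
}

void sprayInSweep() {
  digitalWrite(PUMP_RELAY, HIGH);         // start pumping
  for (int angle = 60; angle <= 120; angle += 5) {
    nozzle.write(angle);                  // sweep the nozzle across the flame
    delay(100);
  }
  digitalWrite(PUMP_RELAY, LOW);          // stop pumping
}

void loop() {
  if (!obstacleAhead()) {
    // drive towards the reported fire location (motor commands omitted)
  }
  // sprayInSweep() is called once the on-board flame sensors confirm the fire.
}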
The environment for implementation was carefully mapped out into five key locations: 0, the initial
starting point, and 1, 2, 3, and 4, which correspond to the four corners of the environment and the positions of
the flame sensors connected to the static unit. This strategic arrangement ensures comprehensive coverage
and accurate fire detection throughout the entire area. The precise bearings and coordinates of all locations
relative to each other were carefully encoded into the robot's navigation system, allowing it to autonomously
and efficiently travel to any specified location within the mapped area [24]–[26]. This setup not only
enhances the robot's operational efficiency but also maximizes its ability to respond quickly and effectively
to fire incidents. A visual representation of this mapping is shown in Figure 3, and the robot prototype is shown in Figure 4.
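The heading-based travel between mapped locations can be sketched as follows, assuming the commonly used QMC5883LCompass Arduino library; the bearing table holds placeholder values standing in for the surveyed bearings actually encoded in the robot.

// Robot unit: turning towards a pre-mapped location using the QMC5883L heading.
#include <Wire.h>
#include <QMC5883LCompass.h>

QMC5883LCompass compass;
const int bearingOf[5] = {0, 45, 135, 225, 315};   // degrees from start point 0 (illustrative)

void setup() {
  Wire.begin();
  compass.init();
}

// Signed shortest-path error between the current heading and a target bearing.
int headingError(int targetDeg) {
  compass.read();
  int err = targetDeg - compass.getAzimuth();
  if (err > 180)  err -= 360;
  if (err < -180) err += 360;
  return err;                              // positive = turn one way, negative = the other
}

void loop() {
  int err = headingError(bearingOf[2]);    // e.g. head for location 2
  // err feeds the proportional motor control: rotate until |err| is small,
  // then drive forward while the ultrasonic sensor watches for obstacles.
  (void)err;
  delay(100);
}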
4. CONCLUSION
The developed robot is a prototype of a semi-advanced firefighting robot created on a limited
budget. Despite financial constraints, the robot demonstrates accurate and efficient obstacle avoidance and
fire detection capabilities. It can detect fire within a wavelength range of 760 nm to 1100 nm and at distances of 10 cm to 100 cm using the sensors on the main (mobile) unit. The static unit extends the effective sensing range by a further 100 cm, marking a significant improvement and an additional advantage compared to other similar setups.
Building an autonomous mobile robot navigation system is challenging, particularly in unknown or
unmapped environments. The system successfully integrates multiple sensors to gather comprehensive
environmental data, enabling it to generate appropriate behaviors and achieve its objectives autonomously.
This integration is crucial for the robot to operate effectively in environments that are out of human reach or
too dangerous for human intervention.
In conclusion, the successful development and testing of this autonomous firefighting robot
prototype demonstrate its potential as a practical and innovative solution for fire disaster management. By
addressing key challenges in fire detection and navigation, this research paves the way for future
advancements in autonomous firefighting technology, promising safer and more efficient operations. The
project's outcomes suggest a transformative impact on firefighting practices, significantly reducing the risks
faced by human firefighters and enhancing overall fire management strategies.
REFERENCES
[1] “Firefighter - job profile, 13 September 2017,” National Careers Service. https://nationalcareers.service.gov.uk/job-profiles/firefighter (accessed Nov. 12, 2021).
[2] A. Heydari, A. Ostadtaghizadeh, A. Ardalan, A. Ebadi, I. Mohammadfam, and D. Khorasani-Zavareh, “Exploring the criteria and
factors affecting firefighters’ resilience: A qualitative study,” Chinese Journal of Traumatology - English Edition, vol. 25, no. 2,
pp. 107–114, Mar. 2022, doi: 10.1016/j.cjtee.2021.06.001.
[3] B. Yang, X. Xu, T. Zhang, Y. Li, and J. Tong, “An indoor navigation system based on stereo camera and inertial sensors with
points and lines,” Journal of Sensors, vol. 2018, pp. 1–14, Jul. 2018, doi: 10.1155/2018/4801584.
[4] V. Kumar, A. Hussain, and R. Pv, “Fire fighting robot with gsm technology and GPS,” International Journal of Creative
Research Thoughts, vol. 10, no. 5, 2022.
[5] B. T. Archana and K. Suma, “Design and fabrication of an autonomous fire fighting robot with obstacle detection and fire
detection using Arduino,” International Research Journal of Engineering and Technology, 2019.
[6] I. D. Lawrence, J. Agnishwar, and R. Vijayakumar, “Revolutionizing firefighting: an experimental journal on the design and
performance of drones in fire suppression,” European Chemical Bulletin, vol. 12, no. 12, pp. 1238–1252, 2023.
[7] R. Rossi, “Fire fighting and its influence on the body,” Ergonomics, vol. 46, no. 10, pp. 1017–1033, Aug. 2003, doi:
10.1080/0014013031000121968.
[8] M. Aliff, M. Yusof, N. S. Sani, and A. Zainal, “Development of fire fighting robot (QRob),” International Journal of Advanced
Computer Science and Applications, vol. 10, no. 1, pp. 142–147, 2019, doi: 10.14569/IJACSA.2019.0100118.
[9] A. Islam, N. Kaur, F. Ahmad, and P. Sathya, “Intelligent wireless fire extinguishing robot,” International Journal of Current
Engineering and Technology, vol. 6, no. 2, pp. 520–526, 2016.
[10] Y. Su, T. Wang, S. Shao, C. Yao, and Z. Wang, “GR-LOAM: LiDAR-based sensor fusion SLAM for ground robots on complex
terrain,” Robotics and Autonomous Systems, vol. 140, Jun. 2021, doi: 10.1016/j.robot.2021.103759.
[11] K. Zhang, C. Shen, Q. Zhou, H. Wang, Q. Gao, and Y. Chen, “A combined GPS UWB and MARG locationing algorithm for
indoor and outdoor mixed scenario,” Cluster Computing, vol. 22, no. S3, pp. 5965–5974, Jan. 2019, doi: 10.1007/s10586-018-
1735-9.
[12] A. T. Oyelami, E. W. Fisayo, and A. O. Emmanuel, “4-degree-of-freedom voice-controlled robotic arm,” IAES International
Journal of Robotics and Automation (IJ-RA), vol. 12, no. 4, pp. 341–351, Dec. 2023, doi: 10.11591/ijra.v12i4.pp341-351.
[13] H. J. Hyung, B. K. Ahn, B. Cruz, and D. W. Lee, “Analysis of android robot lip-sync factors affecting communication,” in
ACM/IEEE International Conference on Human-Robot Interaction, Mar. 2016, pp. 441–442, doi: 10.1109/HRI.2016.7451796.
[14] T. Rakib and M. A. R. Sarkar, “Design and fabrication of an autonomous fire fighting robot with multisensor fire detection using
PID controller,” in 2016 5th International Conference on Informatics, Electronics and Vision (ICIEV), May 2016, pp. 909–914,
doi: 10.1109/ICIEV.2016.7760132.
[15] A. T. Oyelami, O. M. Bamgbose, and O. A. Akintunlaji, “Mission-planner mapped autonomous robotic lawn mower,” Journal
Europeen des Systemes Automatises, vol. 56, no. 2, pp. 253–258, Apr. 2023, doi: 10.18280/jesa.560210.
[16] K. Nonami, R. K. Barai, A. Irawan, and M. R. Daud, “Historical and modern perspective of walking robots,” in Intelligent
Systems, Control and Automation: Science and Engineering, vol. 66, Springer Japan, 2014, pp. 19–40.
[17] N. R. Nazar Zadeh, A. H. Abdulwakil, M. J. R. Amar, B. Durante, and C. V. N. Reblando Santos, “Fire-fighting UAV with
shooting mechanism of fire extinguishing ball for smart city,” Indonesian Journal of Electrical Engineering and Computer
Science (IJEECS), vol. 22, no. 3, pp. 1320–1326, Jun. 2021, doi: 10.11591/ijeecs.v22.i3.pp1320-1326.
[18] S. Li et al., “An indoor autonomous inspection and firefighting robot based on SLAM and flame image recognition,” Fire, vol. 6,
no. 3, Feb. 2023, doi: 10.3390/fire6030093.
[19] B. Madhevan, R. Sakkaravarthi, G. Mandeep Singh, R. Diya, and D. K. Jha, “Modelling, simulation and mechatronics design of a
wireless automatic fire fighting surveillance robot,” Defence Science Journal, vol. 67, no. 5, pp. 572–580, Sep. 2017, doi:
10.14429/dsj.67.10237.
[20] N. A. L. Sathiabalan et al., “Autonomous robotic fire detection and extinguishing system,” Journal of Physics: Conference Series,
vol. 2107, no. 1, Nov. 2021, doi: 10.1088/1742-6596/2107/1/012060.
[21] B. O. Akinloye and G. O. Uzedhe, “Development of a dual-mode fire fighting robot,” FUPRE Journal of Scientific and Industrial
Research (FJSIR), vol. 7, no. 1, pp. 23–30, 2023.
[22] B. AL-Madani, F. Orujov, R. Maskeliūnas, R. Damaševičius, and A. Venčkauskas, “Fuzzy logic type-2 based wireless indoor
localization system for navigation of visually impaired people in buildings,” Sensors, vol. 19, no. 9, May 2019, doi:
10.3390/s19092114.
[23] Z. Wang, Y. Wu, and Q. Niu, “Multi-sensor fusion in automated driving: a survey,” IEEE Access, vol. 8, pp. 2847–2868, 2020,
doi: 10.1109/ACCESS.2019.2962554.
[24] I. Belkin, A. Abramenko, and D. Yudin, “Real-time lidar-based localization of mobile ground robot,” Procedia Computer
Science, vol. 186, pp. 440–448, 2021, doi: 10.1016/j.procs.2021.04.164.
[25] H. Du, Q. Li, T. Chen, Y. Liu, H. Zhang, and Z. Guan, “Research on active firefighting robot navigation based on the improved
AUKF algorithm,” 2023, pp. 96–105.
[26] A. Gupta and X. Fernando, “Simultaneous localization and mapping (SLAM) and data fusion in unmanned aerial vehicles: recent
advances and challenges,” Drones, vol. 6, no. 4, Mar. 2022, doi: 10.3390/drones6040085.
BIOGRAPHIES OF AUTHORS