New Real Time (M-Bug) Algorithm for Path Planning
Abstract- This paper proposes a new sensor-based path planning and collision avoidance algorithm for a non-holonomic robot travelling in an unknown 2D environment, named the (M-Bug) algorithm, in order to reduce the overall traveled path. It uses the difference between slope lines to find the smallest angle between the slope of the straight line from the mobile robot to the goal point and the slopes from the mobile robot to the two endpoints of the obstacle's edges; choosing the smallest one selects the nearest leave point for reaching the goal point, and this action is repeated with other obstacles until the mobile robot reaches the goal point. The algorithm is based on using on-board sensors to generate information about the environment and the obstacle located in front of the mobile robot. This algorithm gives better performance when compared to other traditional algorithms. It provides real-time obstacle avoidance, is compatible with a LIDAR sensor, and is fast enough to be implemented on embedded microcontrollers. The algorithm is based on a modification of existing path planning and obstacle avoidance algorithms (Bug1, Bug2, Tangent-Bug, K-Bug, Dist-Bug, etc.) and the Vector Field Histogram (VFH) algorithm in an unknown environment. The main advantage of this algorithm is that it is not limited to the origin-destiny line and changes the main line at every leave point. MATLAB 2018a is used in simulation to validate the effectiveness of the proposed algorithm, and it is implemented on multiple scenarios to show the ability of the mobile robot to follow a path, detect obstacles, and navigate around them to avoid collision.

Keywords- Path planning, Collision avoidance, Bug algorithm, Autonomous mobile robots, LIDAR, non-holonomic.

I. INTRODUCTION

Over the recent decades, the field of robotics, and especially mobile robotics, has made great progress in technological advancement and degree of autonomy. Autonomous mobile robots are used in a wide range of fields: factory production lines, research and rescue missions, surveillance, and exploration tasks such as forest fire monitoring, etc. Autonomous mobile robots can be simply defined as intelligent machines that are capable of performing certain tasks in their environment without explicit human interaction, decreasing operating costs and complexity [1]. For the purpose of this research, only algorithms that allow navigation in an unknown environment using on-board range sensors, and that are suitable for real-time, embedded applications with acceptable performance, are considered.

One of the fundamental fields of research is path planning [2]. Path planning in robotics is defined as navigation that is collision free and optimal for the autonomous vehicle maneuvering from a start point to a goal point. The purpose of a path-planning method for a mobile robot is to find a continuous collision-free path from the start point to the goal point. The path planning problem is divided into two main categories: known environment and unknown environment. In the known environment, the information about the location and size of obstacles is sufficiently provided; the opposite holds for the unknown environment [3].

There are many algorithms used in the unknown environment. One of the simplest and most effective is the Bug family of algorithms, which is discussed here. Bug algorithms are a kind of autonomous navigation algorithm for unknown environments, proposed by Lumelsky and Stepanov [4]. The method tries to find a route with the fewest path points. The advantages of this principle are that it is simple and easy to realize. The assumptions are that the on-board sensors can get complete obstacle information and that the robot is a particle with perfect self-positioning capability, and according
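The Bug-family behavior just described, heading straight for the goal and, on contact with an obstacle, following its boundary until a leave condition holds, can be sketched as a loop. This is an illustrative Python sketch, not the paper's implementation; the `blocked` and `leave_point` callbacks are hypothetical placeholders for the sensing and boundary-following logic.

```python
import math

def toward(p, q, step=1.0):
    """One straight-line step of length `step` from p toward q."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    dist = math.hypot(dx, dy)
    if dist <= step:
        return q
    return (p[0] + step * dx / dist, p[1] + step * dy / dist)

def bug_navigate(start, goal, blocked, leave_point, max_iter=1000):
    """Generic Bug-family loop: head straight for the goal; when the
    next step is blocked, divert to a leave point supplied by the
    boundary-following policy, then resume goal-directed motion."""
    pos, path = start, [start]
    for _ in range(max_iter):
        if pos == goal:
            break
        nxt = toward(pos, goal)
        if blocked(nxt):
            nxt = leave_point(pos, goal)  # boundary following abstracted away
        pos = nxt
        path.append(pos)
    return path
```

With no obstacles (`blocked` always false) the loop degenerates to straight-line motion toward the goal; concrete Bug variants differ mainly in how the leave point is chosen.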
ICCTA 2019, 29-31 October, Alexandria, Egypt
Fig.4. The polar histogram used in the VFH algorithm [9]

Algorithm Description

To implement this algorithm, first the 2-D Cartesian co-ordinates of the goal point and start point are put into the algorithm (or obtained from GPS). There are three main steps for the algorithm in order to implement the autonomous navigation from the start point to the goal point, as shown in the flow chart in Fig.5. The three steps are summarized as follows:
1) motion towards the target;
2) scanning the obstacle surface (boundary following) until a leaving point is found, depending on the relation between the main slope and the edge slope;
3) making a new start point and going forward to the goal point.

Fig.5. Flow chart for the proposed (M-Bug) algorithm

The mobile robot starts moving from the start point to the goal point in a straight line until the LIDAR sensor rays hit the first obstacle in this path. The algorithm scans the first obstacle, defining the two boundary points (Fig.6), and calculates the slope angles θm, θ1 and θ2 from equations 3, 4 and 5, where θ1 is the angle of the slope line between the center of the mobile robot (the LIDAR sensor) and the upper edge of the obstacle.

Then the two angles α1 and α2 shown in Fig.7, located between the mobile robot and the two boundaries with respect to the origin-destiny line (the dashed line), are calculated from equations 6 and 7:

α1 = |θm − θ1|    (6)
α2 = |θm − θ2|    (7)

where α1 is the angle between the upper edge and the origin-destiny line, and α2 is the angle between the lower edge and the origin-destiny line.

After the smaller angle is chosen by the algorithm, the mobile robot moves in the direction of the corresponding boundary edge, which will be the leave point.

For a non-holonomic mobile robot with a front width (L), an offset distance (L/2) is added to avoid hitting the obstacle and to allow smooth rotation; the resulting point will be the start point for the next step (Fig.8).
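The selection rule of equations (6) and (7) can be sketched as follows. This is an illustrative Python sketch; the function names, the Cartesian inputs, and the direction in which the L/2 offset is applied are our assumptions, since the paper only states that the offset is added.

```python
import math

def slope_angle(p, q):
    """Angle (radians) of the line from point p to point q."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def choose_leave_point(robot, goal, upper_edge, lower_edge, width_l=0.5):
    """Pick the boundary edge whose slope deviates least from the
    origin-destiny line (eqs. 6 and 7) and offset it by L/2."""
    theta_m = slope_angle(robot, goal)        # main (origin-destiny) line
    theta_1 = slope_angle(robot, upper_edge)  # line to upper boundary edge
    theta_2 = slope_angle(robot, lower_edge)  # line to lower boundary edge
    alpha_1 = abs(theta_m - theta_1)          # eq. (6)
    alpha_2 = abs(theta_m - theta_2)          # eq. (7)
    edge = upper_edge if alpha_1 <= alpha_2 else lower_edge
    # Offset the chosen edge by L/2 so the robot front clears the corner.
    # The offset direction (past the edge, along the robot-edge ray) is
    # an assumption; the paper only states that L/2 is added.
    dx, dy = edge[0] - robot[0], edge[1] - robot[1]
    d = math.hypot(dx, dy)
    return (edge[0] + (width_l / 2) * dx / d,
            edge[1] + (width_l / 2) * dy / d)
```

For a robot at (0,0) heading to (10,0) with boundary edges at (5,1) and (5,−3), the upper edge wins (α1 ≈ 0.20 rad < α2 ≈ 0.54 rad), so the leave point lies just beyond (5,1).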
Fig.9. Repeating the algorithm steps to reach the goal point

The main advantage of this algorithm is that it is not limited to the origin-destiny line; it changes the main line at every hit point until reaching the goal point through the trajectory created by the algorithm (Fig.10).

Fig.11. Gap between two obstacles ≤ the width of the mobile robot front (L)

2) In the case of an obstacle with a U shape (the local minima effect), the LIDAR will detect the two boundary edges, and the mobile robot will not enter inside, reducing time, distance and energy cost (Fig.12).

Fig.12. Obstacle with U shape (local minima effect)

3) The worst case: if the LIDAR sensor scans 360 degrees and there is no visible path with width > W, the vehicle will enter a sleep mode until the conditions change.

IV. SIMULATION AND RESULTS ANALYSIS

In this work, a two-differential-wheel non-holonomic mobile robot model was used [15], as shown in Fig.13, with the following specifications: look-ahead distance for LIDAR = 0.5 m, linear velocity = 0.75 m/s, angular velocity = 1.5 rad/s, wheel radius R = 0.1 m, wheelbase L = 0.5 m. A controller was created in MATLAB 2018a for this experiment, as shown in the block diagram in Fig.14, and the environment grid is (10×10) m. Some simulations are shown to illustrate the proposed algorithm. All simulations were programmed using MATLAB 2018a and are shown in Fig.15, 16, 17, 18 and 19.

First, the robot starts sensing the environment with the LIDAR sensor and follows the steps of the algorithm to reach the goal point and avoid collision based on the
proposed algorithm stated above. When the robot detects an object with the LIDAR sensor, it navigates around it based on the smallest angle between the slope of the major line and the slopes of the boundary lines, which corresponds to a shorter distance and therefore a shorter time. The robot will follow the same steps each time it detects an obstacle and creates its path again.

1) Environment #1
The start and goal points are represented by (1,1) and (9,9). The robot can move from the start point to the goal point independently, safely and without collision, and the robot can cross long and narrow channels and complex environments and avoid local minima.

Fig.16. Simulation for environment #2

3) Environment #3

In order to verify the algorithm, Fig.20 represents the distance taken by the mobile robot to reach the goal point with the M-Bug, Bug2 and VFH algorithms, and Fig.21 represents the time taken by the mobile robot to reach the goal point with the M-Bug, Bug2 and VFH algorithms.

Table.1 shows the rate of distance and time decrease between M-Bug and the two other algorithms, Bug2 and the vector field histogram (VFH). The distances (in metres) recoverable from the comparison chart for environments #1–#5 are:

M-Bug: 11.775, 12.225, 11.925, 12.15, 11.475
Bug2: 13.12, 13.425, 12.375, 12.375, 11.7
VFH: 14.85, 18.3, 13.8, 13.425, 15.4
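As a consistency check, the distance-decrease rates in Table.1 can be reproduced from the distances above (an illustrative Python sketch; variable names are ours):

```python
distances = {  # metres per environment #1..#5, from the comparison data
    "M-Bug": [11.775, 12.225, 11.925, 12.15, 11.475],
    "Bug2":  [13.12, 13.425, 12.375, 12.375, 11.7],
    "VFH":   [14.85, 18.3, 13.8, 13.425, 15.4],
}

def rate_of_decrease(baseline, improved):
    """Percentage reduction of `improved` relative to `baseline`."""
    return [round(100 * (b - m) / b, 1) for b, m in zip(baseline, improved)]

vs_bug2 = rate_of_decrease(distances["Bug2"], distances["M-Bug"])
vs_vfh = rate_of_decrease(distances["VFH"], distances["M-Bug"])
# vs_bug2 → [10.3, 8.9, 3.6, 1.8, 1.9]
# vs_vfh  → [20.7, 33.2, 13.6, 9.5, 25.5]
```

These values match the distance-decrease columns of Table.1.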
Table.1. Rate of distance and time decrease

Environment Number | Operations Number | Distance decrease, M-Bug vs Bug2 | Distance decrease, M-Bug vs VFH | Time decrease, M-Bug vs Bug2 | Time decrease, M-Bug vs VFH
#1 | 10 | 10.3% | 20.7% | 10.4% | 20.7%
#2 | 10 | 8.9% | 33.2% | 9.2% | 32.7%
#3 | 10 | 3.6% | 13.6% | 3.9% | 13.5%
#4 | 10 | 1.8% | 9.5% | 1.9% | 9.6%
#5 | 10 | 1.9% | 25.5% | 17.6% | 23.2%

V. CONCLUSION

In this paper, a new sensor-based algorithm is introduced to resolve autonomous navigation in an unknown 2D static environment, considering the kinematical characteristics of the mobile robot. It depends on the use of a LIDAR sensor for generating information about the environment and obstacles.

The proposed algorithm was compared with the Bug2 and VFH algorithms. The simulation results proved that the proposed (M-Bug) algorithm decreases the distance and time taken by the mobile robot, and the algorithm outperforms the other methods in path length and time.

This research could be further developed for a 3D unknown environment with static obstacles and/or a dynamic environment.

REFERENCES

[1] B. Paden, M. Cap, S. Z. Yong, D. Yershov, and E. Frazzoli, "A Survey of Motion Planning and Control Techniques for Self-driving Urban Vehicles," pp. 1–27, 2016.
[2] Q. M. Nguyen, L. N. M. Tran, and T. C. Phung, "A Study on Building Optimal Path Planning Algorithms for Mobile Robot," Proc. 2018 4th Int. Conf. Green Technol. Sustain. Dev. (GTSD 2018), pp. 341–346, 2018.
[3] J. Giesbrecht, B. Touchton, T. Galluzzo, D. Kent, and C. D. Crane, "Global Path Planning for Unmanned Ground Vehicles," pp. 1–41, 2004.
[4] V. Lumelsky and A. Stepanov, "Dynamic path planning for a mobile automaton with limited information on the environment," IEEE Trans. Automat. Contr., vol. 31, no. 11, pp. 1058–1063, 1986.
[5] J. Ng and T. Bräunl, "Performance comparison of Bug navigation algorithms," J. Intell. Robot. Syst., vol. 50, no. 1, pp. 73–84, 2007.
[6] I. Kamon and E. Rivlin, "Sensory-based motion planning with global proofs," IEEE Trans. Robot. Autom., vol. 13, no. 6, pp. 814–822, 1997.
[7] N. Buniyamin, W. A. J. Wan Ngah, and Z. Mohamad, "PointsBug versus TangentBug algorithm, a performance comparison in unknown static environment," Proc. 2014 IEEE Sensors Appl. Symp. (SAS 2014), pp. 278–282, 2014.
[8] A. Yufka and O. Parlaktuna, "Performance Comparison of Bug Algorithms for Mobile Robots," 2009, doi:10.13140/RG.2.1.2043.7920.
[9] J. Borenstein and Y. Koren, "The Vector Field Histogram – Fast Obstacle Avoidance for Mobile Robots," vol. 7, no. 3, pp. 278–288, 1991.
[10] I. Ulrich and J. Borenstein, "Reliable Obstacle Avoidance for Fast Mobile Robots," pp. 1572–1577, 1998.
[11] I. Ulrich and J. Borenstein, "VFH*: Local Obstacle Avoidance with Look-Ahead Verification," pp. 2505–2511, 2000.
[12] A. Pfrunder, P. V. K. Borges, A. R. Romero, G. Catt, and A. Elfes, "Real-time autonomous ground vehicle navigation in heterogeneous environments using a 3D LiDAR," IEEE Int. Conf. Intell. Robot. Syst. (IROS), pp. 2601–2608, 2017.
[13] S. Ricardo, D. Bein, and A. Panagadan, "Low-cost, real-time obstacle avoidance for mobile robots," 2017 IEEE 7th Annu. Comput. Commun. Workshop Conf. (CCWC), 2017.
[14] M. M. Almasri, A. M. Alajlan, and K. M. Elleithy, "Trajectory Planning and Collision Avoidance Algorithm for Mobile Robotics System," IEEE Sens. J., vol. 16, no. 12, pp. 5021–5028, 2016.
[15] M. A. A. Hemmat, Z. Liu, and Y. Zhang, "Real-time path planning and following for nonholonomic unmanned ground vehicles," Int. Conf. Adv. Mechatron. Syst. (ICAMechS), pp. 202–207, 2018.