BSMS3
Mohammad Khoshnevisan
Physics Department
College of Science
Northeastern University
Boston, USA
[Link]@[Link]
Abstract— Sensor selection is an essential aspect of blind spot detection (BSD) systems. Indeed, we must choose the sensors appropriately to accomplish high accuracy and performance in any driving condition. Each sensor has exclusive properties which are suitable for a few specific circumstances. Therefore, a comprehensive study is warranted to determine the optimum number and type of sensors for BSD. Although sensors have some deficiencies which can deteriorate the whole system's performance, a combination of multiple types of sensors together with data fusion methods can, in most cases, substantially compensate for sensor imperfections. In this paper, we concentrate on multi-sensor fusion in the BSD system and its advantages, which cannot be achieved in a single-sensor BSD system. A sensing structure for the BSD system is proposed, considering the indispensable factors in BSD coupled with sensor constraints, features, and specifications.

Keywords—Blind Spot Detection, Sensors, Radar, Camera, Ultrasound, LIDAR, Object Detection.

I. INTRODUCTION
Nowadays, vehicles are on the way to becoming fully automated due to the remarkable improvement in technology and the increasing need for automation. As shown in Fig. 1, a few steps must be taken before reaching the full automation level, and advanced driver assistance systems (ADAS) are one of these steps. Many accidents are caused by blind spots, drivers' abnormal behavior, and severe driving conditions. For these irregularities, ADAS is employed to increase the safety of driving. ADAS has many applications such as adaptive cruise control (ACC), lane change assist (LCA), park assist, and blind spot detection (BSD).

Figure 1. Levels of automation in vehicles [8].

BSD is one of the necessities of autonomous vehicle (AV) applications such as localization and planning. BSD consists of five parts: software, sensors, object detection, data fusion, and control. The central part of BSD comprises the sensors; therefore, the choice of sensors must be taken seriously in order to achieve the desired performance in any condition with the least number of sensors. The aim of BSD systems is to warn whenever there is a vehicle in the driver's blind spot in ADAS applications. In the case of AVs, however, the blind spot includes all of the vehicle's surrounding areas. The conventional sensors for BSD are radar, Lidar, camera, and ultrasonic sensors. Each of these sensors has particular merits and demerits, which makes it challenging to discriminate between them. Therefore, a comprehensive study of these sensors is warranted.

The current BSD systems of renowned manufacturers such as Tesla, Toyota, Ford, Mercedes-Benz, and Lexus consist of only radar sensors, which are frequently installed on both corners of the rear bumper. Radar sensors cannot function well in rainy or snowy weather, and they have low accuracy due to clutter. Consequently, the current BSD systems cannot deliver the desired performance and have deficiencies.

BSD has many challenges, including the choice of the number and type of sensors employed, coupled with object detection using the sensor information. The constraints of sensors and their intrinsic properties make sensor selection difficult. For example, radar can directly measure the speed of objects while the other sensors cannot; however, radar performance degrades significantly in harsh weather. On the other hand, the ultrasonic sensor can detect the distance of objects in any weather condition, but it has a short detection range. Consequently, we have to overcome these difficulties posed by sensor constraints in BSD.

Furthermore, detecting vehicles at night and in adverse weather conditions is a challenge for camera-based BSD systems. We can extract edges, shadows, and the bright lamps of vehicles to detect them in the daytime and nighttime [1]. The challenging task in object detection with Lidar sensors is their enormous amount of data, which makes it arduous to maintain a low processing time. The BirdNet framework can be applied to Lidar data for vehicle detection with low processing time [4]. Sensor data fusion helps to handle these challenges by taking advantage of each sensor's exclusive properties, which results in fewer misdetections and the elimination of possible defects [6].

In this paper, we propose a structure of sensors for the BSD system by examining the constraints and features of each sensor.
There are many factors to be considered when designing a BSD system, including the field of view (FOV) of the sensors, various driving scenarios, the overall system response time, and environmental effects. It should be noted that it is imperative to cover the whole blind spot with overlapping sensor FOVs in order to increase the detection ratio and accuracy. Also, the BSD system must function properly in any driving scenario, such as exiting from a parking lot or turning on a road curvature. The overall system response time will increase the possibility of an accident if it exceeds a limit, which should be less than 300 ms according to ISO 17387. Consequently, the processing time of the sensors is one of the most important factors; it is around 100 ms for Lidar sensors in high-accuracy settings. It can be concluded that single-sensor-type BSD systems cannot meet all the requirements for the detection of vehicles in various conditions. Therefore, it is necessary to use at least two different types of sensors, which means that sensor data fusion methods should be included in BSD.

The rest of this paper is organized as follows. Section II reviews the previous studies and the performance of the proposed methods. Section III covers the constraints, requirements, specifications, and features of the sensors, and a comparison between them. Section IV describes our proposed sensor structure for the BSD system and object detection methods using the sensor data. Finally, conclusions and future works are presented in Section V.
II. REVIEW

Object detection is one of the essential tasks in BSD. In order to achieve a good detection ratio with low computational cost, we have to overcome the existing challenges in object detection using each sensor's data. The constraints of the sensors, environmental effects such as light intensity, and maintaining a low processing time in spite of heavy computations are the challenges that we should expect in object detection.

Cameras are susceptible to different light and adverse weather conditions, which makes object detection a challenging task. Examples of these challenges are vehicle detection at nighttime or in rainy weather. During the daytime, the edges and shadows of vehicles can be used as features to detect them. Nevertheless, on rainy days, the shadows of vehicles can be elongated, which contributes to misdetections and lower accuracy. At nighttime, the brightness of vehicle lamps is the main feature for vehicle detection. This approach to vehicle detection achieves a reasonably good detection ratio with a low false alarm rate in urban areas and highways [1], although its performance deteriorates significantly as the weather and light conditions get worse. This degradation is due to the cameras' poor vision in bad weather and low light intensity. Furthermore, due to the limited FOV of cameras, it is not possible to detect approaching vehicles when turning on a road curvature or to cover all of the vehicle's blind spots.
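As a minimal illustration of the nighttime bright-lamp cue (a sketch of the idea, not the actual detector of [1]), the following Python snippet thresholds a grayscale frame for near-saturated blobs, whose centroids can then be paired as candidate head/tail lamps; the threshold and minimum blob size are assumed values.

# Sketch: nighttime "bright lamp" cue -- threshold the frame for very
# bright blobs and report their centroids as lamp candidates.
import numpy as np
from scipy import ndimage

def bright_lamp_candidates(gray: np.ndarray, thresh: int = 230) -> list:
    """Return centroids of bright blobs in an 8-bit grayscale frame."""
    mask = gray >= thresh                       # keep near-saturated pixels
    labels, n = ndimage.label(mask)             # connected components
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    # Keep blobs big enough to be lamps rather than sensor noise.
    return [c for c, s in zip(centroids, sizes) if s >= 20]

frame = (np.random.rand(240, 320) * 255).astype(np.uint8)  # stand-in frame
print(bright_lamp_candidates(frame))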
As stated in the introduction, we cannot accomplish the defined goals of BSD with only one sensor type. In [10], Lidar and radar sensors are used for object detection, covering the whole area surrounding the vehicle. This structure is better than the camera-based BSD system. An occupancy map and an object module are used for localization and mapping of obstacles around the vehicle; their processing time is about 5 ms [10]. The problem with this structure is that the radar sensors only get activated above a speed of 10 kph, which simply means that the system cannot detect vehicles in scenarios like exiting a parking lot. Moreover, radar and Lidar sensors cannot work well in adverse weather conditions.

Data fusion can compensate for sensor defects in order to obtain better performance and reduce uncertainties. The fusion approach used in [6] and its promising results have drawn our attention to focusing on sensor data fusion in our proposed sensor structure for the BSD system. Fusion at the detection level can lead to better accuracy and lower false alarm rates [6]. In the sensor configuration of [6], radars provide data about a vehicle's radial velocity and its range, Lidar sensors provide a region of interest (ROI) for the cameras and give information about the shape of objects, and the cameras detect the objects in the ROI provided by the Lidar sensors and then classify them. The data obtained from these three sensors can be fused to achieve the desired aim of the BSD system.
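The following minimal Python sketch illustrates this detection-level fusion flow; the class names, fields, and the nearest-neighbor association are our illustrative assumptions, not the actual implementation of [6].

# Sketch of detection-level fusion: Lidar proposes ROIs, the camera
# classifies them, and the nearest radar track adds range and velocity.
import math
from dataclasses import dataclass

@dataclass
class RadarTrack:
    position: tuple   # (x, y) in meters, ego frame
    velocity: float   # radial velocity, m/s

@dataclass
class LidarROI:
    center: tuple     # (x, y) in meters, ego frame

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def fuse(lidar_rois, radar_tracks, classify):
    """Fuse per-sensor detections into (position, velocity, confidence) tuples."""
    detections = []
    for roi in lidar_rois:
        label, conf = classify(roi)          # camera classifies the Lidar ROI
        if label != "vehicle":
            continue
        # Associate the ROI with the nearest radar track for kinematics.
        track = min(radar_tracks, key=lambda t: dist(t.position, roi.center))
        detections.append((track.position, track.velocity, conf))
    return detections

# Example with stub data: one ROI that the (stubbed) camera calls a vehicle.
rois = [LidarROI(center=(-2.0, -3.5))]
tracks = [RadarTrack(position=(-2.1, -3.4), velocity=4.2)]
print(fuse(rois, tracks, classify=lambda roi: ("vehicle", 0.9)))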
Lidar sensors are costly, and using them in a BSD system would not be cost-effective. Furthermore, Lidar sensors are suitable for mapping and localization, which are not the main goals of BSD systems. In our proposed structure, we have focused on detecting objects' range from the ego vehicle and their velocity, assuming that the vehicle is autonomous. After detecting objects and gathering the features provided by each sensor, we fuse the sensor data to reach better accuracy and fewer misdetections.

Table 1. Comparison of the performance of methods used for object detection.

Method            DR (%), urban   DR (%), highway   FAR (%)   Processing time
B. Wu [1]         95.45           100               0         17 ms
O. Chavez [6]     93.6            97.8              2.2       40 ms
K. Schueler [10]  -               -                 -         5 ms

III. SENSORS CONSTRAINTS AND SPECIFICATIONS

In this section, we discuss the constraints, specifications, and features of the introduced sensors, together with their corresponding formulas. The key features of each sensor include detection range, FOV, processing time, and accuracy. Fig. 2 shows the detection range and FOV of the sensors.
A. Radar

For an FMCW radar, the beat frequency f_b between the transmitted and the received signal is proportional to the distance R of the object:

f_b = \frac{2 B R}{c T} \quad (2)

where B is the sweep bandwidth and T is the sweep period.
So we can calculate the distance of the object from the sensor. If we take the Doppler effect into account, the Doppler frequency shift can be expressed as:

f_D = \frac{2 v_r}{\lambda} \quad (7)

where v_r is the radial velocity of the detected object. The Doppler frequency shift divides the beat frequency into two components, which can be expressed as:

f_1 = f_b - f_D, \quad f_2 = f_b + f_D \quad (8)

By substituting equation (7), we can calculate the radial speed of the object and, from the sum of the two components, its range:

v_r = \frac{\lambda}{4} (f_2 - f_1), \quad R = \frac{c T}{4 B} (f_1 + f_2) \quad (9)
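To make equations (7)–(9) concrete, the following Python sketch recovers the range and radial speed from the two beat-frequency components; the carrier frequency, sweep period T, and bandwidth B are illustrative assumptions, not values from the paper.

# Sketch: range and radial speed from the two beat-frequency components
# of a triangular FMCW chirp, per equations (7)-(9).
C = 3e8             # speed of light, m/s
F_CARRIER = 77e9    # assumed carrier frequency, Hz; lambda = c / f_c
WAVELENGTH = C / F_CARRIER
T_SWEEP = 40e-6     # assumed chirp period T, s
BANDWIDTH = 300e6   # assumed sweep bandwidth B, Hz

def range_and_speed(f1: float, f2: float):
    """f1 = f_b - f_D and f2 = f_b + f_D are the two beat components, Hz."""
    v_r = WAVELENGTH / 4.0 * (f2 - f1)                  # eq. (9), radial speed
    rng = C * T_SWEEP / (4.0 * BANDWIDTH) * (f1 + f2)   # eq. (9), range
    return rng, v_r

# Example: beat components of 190 kHz and 210 kHz.
rng, v_r = range_and_speed(190e3, 210e3)
print(f"range = {rng:.1f} m, radial speed = {v_r:.2f} m/s")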
B. Ultrasonic

Ultrasonic sensors are cheap and have good accuracy in distance measurement. The most important property of these sensors is their ability to work in any weather condition. However, ultrasonic sensors suffer from bad angular resolution, a low sampling rate due to the low wave speed, and a short detection range. Table 3 shows typical values of the ultrasonic sensor parameters.

Table 3. Typical values of ultrasonic sensor parameters.

Parameter            Value
Detection range      0.3...6 m
Resolution           0.01 m
Dissemination angle  90° (azimuth)
Detection interval   typ. < 100 ms

The dissemination angle of ultrasonic sensors is inversely affected by the wave frequency. Ultrasonic sensors can detect vehicles with speeds of up to 40 kph. The strength of ultrasonic waves attenuates as they travel through the air due to diffusion, diffraction, and absorption losses. The distance of an object from the sensor can be calculated as:

d = \frac{v_s t}{2} \quad (10)

where t is the round-trip travel time of the wave and v_s is the ultrasonic wave speed, which depends on environmental factors such as temperature; their correlation can be formulated as:

v_s = 331.5 + 0.61 T \quad (11)

where T is the air temperature in degrees Celsius.
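A minimal Python sketch of temperature-compensated ultrasonic ranging per equations (10) and (11); the echo time in the example is illustrative.

# Sketch: object distance from ultrasonic round-trip time, eqs. (10)-(11).
def sound_speed(temp_c: float) -> float:
    """Speed of sound in air (m/s) as a function of temperature, eq. (11)."""
    return 331.5 + 0.61 * temp_c

def ultrasonic_distance(echo_time_s: float, temp_c: float = 20.0) -> float:
    """Object distance (m) from round-trip echo time, eq. (10)."""
    return sound_speed(temp_c) * echo_time_s / 2.0

# Example: a 23.3 ms round trip at 20 °C corresponds to roughly 4 m.
print(f"{ultrasonic_distance(23.3e-3):.2f} m")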
C. Lidar

Lidar sensors can provide information about the distance of objects with high accuracy, as well as their size and shape. The noticeable disadvantages of these sensors are their vast amount of data and their refresh rate, which can have a profound effect on the overall response time of the BSD system. Table 4 shows typical values of Lidar sensor parameters.

Table 4. Typical values of Lidar sensor parameters.

Lidar type             2D solid         2D rotary     3D rotary
Lidar model example    Leddartech Vu8   Sick LMS1xx   Velodyne VLP-16
Azimuth                100°             270°          360°
Elevation              0.3°             0.15°         30°
Horizontal resolution  2.5°             0.25°         0.1°
Vertical resolution    N/A              N/A           2°
Detection range        215 m            50 m          100 m
Refresh rate           10 ms            50 ms         100 ms

The distance of an object from the Lidar sensor can be calculated as:

d = \frac{c t}{2} \quad (12)

where c is the speed of light and t is the round-trip time of the laser pulse.
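As a quick worked instance of equation (12), with an illustrative echo time: a laser pulse whose echo returns t = 0.5 µs after emission places the object at d = (3×10⁸ m/s × 0.5×10⁻⁶ s) / 2 = 75 m.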
D. Camera

Cameras are useful sensors for classification and can measure objects' distance and angle. Nonetheless, cameras pose challenges in the detection of vehicles at nighttime or in adverse weather conditions. Furthermore, due to their limited FOV and intrinsic properties, we cannot detect objects in some cases, such as when exiting from a parking lot. There are two types of vision systems:

- Monovision: Monovision can indicate the presence of vehicles. Since it has less accuracy in distance measurement, it is more suitable for lane detection than for BSD [7].

- Stereo vision: The depth of objects can be determined from the images provided by stereo vision [7]. We can also extract features like edges, corners, shadows, and bright objects from stereo vision images [7].

The features can be extracted using the histogram of oriented gradients (HOG), color histograms, and deep learning methods; a brief HOG sketch follows below. Cameras can detect vehicles with a speed range of up to 10 kph when passing and 70 kph when being passed [3]. Table 5 shows typical values of camera parameters.

Table 5. Typical values of camera sensor parameters.

Parameter                Value
Maximum detection range  251 m
Maximum FOV              120°
Detection interval       typ. < 60 ms
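The sketch below shows HOG feature extraction for a candidate image patch using scikit-image; the patch size and HOG parameters are illustrative choices, not values from the paper.

# Sketch: HOG descriptor for a grayscale vehicle-candidate patch.
import numpy as np
from skimage.feature import hog

def hog_features(patch: np.ndarray) -> np.ndarray:
    """Compute a HOG descriptor for a 64x64 grayscale candidate patch."""
    return hog(
        patch,
        orientations=9,          # gradient-direction bins
        pixels_per_cell=(8, 8),  # local cell size
        cells_per_block=(2, 2),  # block normalization window
    )

# Example: a random stand-in for a 64x64 grayscale ROI patch.
patch = np.random.rand(64, 64)
features = hog_features(patch)
print(features.shape)  # 1764-dimensional descriptor for these parameters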