2019 International Conference on Control, Automation and Information Sciences (ICCAIS)

Sensing Structure for Blind Spot Detection System in Vehicles
Shayan Shirahmad Gale Bagi, School of Electrical & Computer Engineering, College of Engineering, University of Tehran, Tehran, IRAN ([Link]@[Link])
Hossein Gharaee Garakani, IRAN Telecom Research, Tehran, IRAN (gharaee@[Link])
Behzad Moshiri, School of Electrical & Computer Engineering, College of Engineering, University of Tehran, Tehran, IRAN (moshiri@[Link])
Mohammad Khoshnevisan, Physics Department, College of Science, Northeastern University, Boston, USA ([Link]@[Link])

Abstract— Sensor selection is an essential aspect of blind spot detection (BSD) systems. Indeed, we must choose the sensors appropriately to accomplish high accuracy and performance in any driving condition. Each sensor has exclusive properties which are suitable for a few specific circumstances. Therefore, a comprehensive study is warranted to determine the optimum number and type of sensors for BSD. Although sensors have deficiencies which can deteriorate the whole system's performance, a combination of multiple types of sensors together with data fusion methods can, in most cases, substantially compensate for sensor imperfections. In this paper, we concentrate on multi-sensor fusion in the BSD system and its advantages, which cannot be achieved in a single-sensor BSD system. A sensing structure for the BSD system is proposed considering the indispensable factors in BSD coupled with sensor constraints, features, and specifications.

Keywords—Blind Spot Detection, Sensors, Radar, Camera, Ultrasound, LIDAR, Object Detection.

I. INTRODUCTION

Nowadays, vehicles are on the path to full automation due to the remarkable improvement in technology and the increasing need for automation. As shown in fig. 1, a few steps must be taken before reaching the full automation level, and advanced driver assistance systems (ADAS) are one of these steps. Many accidents are caused by blind spots, drivers' abnormal behavior, and severe driving conditions. For these irregularities, ADAS is employed to increase the safety of driving. ADAS has many applications, such as adaptive cruise control (ACC), lane change assist (LCA), park assist, and blind spot detection (BSD).

Figure 1. Levels of automation in vehicles [8].

BSD is one of the necessities of autonomous vehicle (AV) applications such as localization and planning. BSD consists of five parts: software, sensors, object detection, data fusion, and control. The central part of the BSD comprises the sensors; so, the choice of sensors must be taken seriously in order to achieve the desired performance in any condition with the least number of sensors. In ADAS applications, the aim of BSD systems is to warn whenever there is a vehicle in the driver's blind spot, while in the case of AVs, the blind spot includes all of the vehicle's surrounding areas. The conventional sensors for BSD are radar, Lidar, camera, and ultrasonic sensors. Each of these sensors has particular merits and demerits, which makes it challenging to discriminate between them. Therefore, a comprehensive study of these sensors is warranted.

The current BSD systems in the vehicles of renowned companies such as Tesla, Toyota, Ford, Mercedes Benz, and Lexus consist of only radar sensors, which are frequently installed on both corners of the rear bumper. Radar sensors cannot function well in rainy or snowy weather, and they have low accuracy due to clutter. Consequently, the current BSD systems cannot deliver the desired performance and have deficiencies.

BSD has many challenges, including the choice of the number and type of sensors that are employed, coupled with object detection using the sensors' information. The constraints of sensors and their intrinsic properties make sensor selection difficult. For example, radar can directly measure the speed of objects while the other sensors cannot; however, radar performance degrades significantly in harsh weather. On the other hand, the ultrasonic sensor can detect the distance of objects in any weather condition, but it has a short detection range. Consequently, we have to overcome these difficulties that are posed by sensor constraints in BSD. Furthermore, detection of vehicles at night and in adverse weather conditions is a challenge for camera-based BSD systems. We can extract edges, shadows, and the bright lamps of vehicles to detect them in the daytime and nighttime [1]. The challenging task in object detection with Lidar sensors is their enormous amount of data, which makes it arduous to maintain a low processing time. The BirdNet framework can be applied to Lidar data for vehicle detection with low processing time [4]. Sensor data fusion helps to handle these challenges by taking advantage of each sensor's exclusive properties, which results in fewer misdetections and the elimination of possible defects [6].

In this paper, we propose a structure of sensors for the BSD system by examining the constraints and features of each sensor.


There are many factors to be considered when designing a BSD system, including the field of view (FOV) of the sensors, various driving scenarios, the overall system response time, and environmental effects. It should be noted that it is imperative to cover the whole blind spot with overlapping sensor FOVs in order to increase the detection ratio and accuracy. Also, the BSD system must function properly in any driving scenario, such as exiting a parking lot or turning through a road curvature. The overall system response time will increase the possibility of an accident if it exceeds a limit, which should be less than 300 ms according to ISO 17387. Consequently, the processing time of the sensors is one of the most important factors; it is around 100 ms for Lidar sensors in high-accuracy settings. It can be concluded that single-sensor-type BSD systems cannot meet all the requirements for detecting vehicles in various conditions. Therefore, it is necessary to use at least two different types of sensors, which means that sensor data fusion methods should be included in BSD.

The rest of this paper is organized as follows. Section II reviews previous studies and the performance of the proposed methods. Section III covers the constraints, requirements, specifications, and features of the sensors, and a comparison between them. Section IV describes our proposed sensor structure for the BSD system and object detection methods using the sensor data. Finally, conclusions and future work are presented in Section V.

II. REVIEW

Object detection is one of the essential tasks in BSD. In order to achieve a good detection ratio with low computational cost, we have to overcome the existing challenges in object detection using each sensor's data. The constraints of the sensors, environmental effects such as light intensity, and maintaining a low processing time in spite of heavy computations are the challenges that we should expect in object detection.

Cameras are susceptible to varying light and adverse weather conditions, which makes object detection a challenging task. Examples of these challenges are vehicle detection at nighttime or in rainy weather. During the daytime, the edges and shadows of vehicles can be used as features to detect them. Nevertheless, on rainy days the shadows of vehicles can be elongated, which contributes to misdetections and lower accuracy. At nighttime, the brightness of vehicle lamps is the main feature for vehicle detection. This approach has a reasonably good detection ratio with a low false alarm rate in urban areas and highways [1], although its performance deteriorates significantly as the weather and light conditions worsen. This degradation is due to the cameras' poor vision in bad weather and low light intensity. Furthermore, due to the limited FOV of cameras, it is not possible to detect approaching vehicles when turning through a road curvature, or to cover all of the vehicle's blind spots.

As stated in the introduction, we cannot accomplish the defined goals in BSD with only one sensor type. In [9], Lidar and radar sensors are used for object detection, covering the entire surrounding area of the vehicle. This structure is better than a camera-based BSD system. An occupancy map and an object module are used for localization and mapping of the obstacles around the vehicle, with a processing time of about 5 ms [9]. The problem with this structure is that the radar sensors are only activated above a speed of 10 kph, which simply means that vehicles cannot be detected in scenarios such as exiting a parking lot. Moreover, radar and Lidar sensors cannot work well in adverse weather conditions.

Data fusion can compensate for sensor defects in order to obtain better performance and reduce uncertainties. The fusion approach used in [6] and its promising results have drawn our attention to sensor data fusion in our proposed sensor structure for the BSD system. Fusion at the detection level can lead to better accuracy and lower false alarm rates [6], as illustrated by the sketch after Table 1. In the sensor configuration of [6], radars provide data about a vehicle's radial velocity and range, Lidar sensors provide a region of interest (ROI) for the cameras and give information about the shape of objects, and cameras detect the objects in the ROI provided by the Lidar sensors and then classify them. The data obtained from these three sensors can be fused to achieve the desired aim of the BSD system.

Lidar sensors are costly, and using them in a BSD system would not be cost-effective. Furthermore, Lidar sensors are suitable for mapping and localization, which are not the main goals of BSD systems. In our proposed structure, we have focused on detecting the range of objects from the ego vehicle and their velocity, assuming that the vehicle is autonomous. After detecting objects and gathering the features provided by each sensor, we fuse the sensor data to achieve better accuracy and fewer misdetections.

Table 1. Comparison of the performance of methods used for object detection.

Method            %DR (urban area)   %DR (highway)   %FAR   Processing time
B. Wu [1]         95.45              100             0      17 ms
O. Chavez [6]     93.6               97.8            2.2    40 ms
K. Schueler [9]   -                  -               -      5 ms
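As a toy illustration of why detection-level fusion lowers false alarm rates, the following Python sketch (ours, not the pipeline of [6]) combines independent per-sensor detection probabilities with a naive-Bayes odds product; the sensor names and probability values are illustrative assumptions.

# Toy sketch of detection-level fusion (ours, not the method of [6]):
# independent per-sensor detection probabilities are combined with a
# naive-Bayes odds product, raising confidence when sensors agree.

def fuse_detections(probs, prior=0.5):
    """probs: P(vehicle present | each sensor), assumed independent."""
    prior_odds = prior / (1.0 - prior)
    odds = prior_odds
    for p in probs:
        p = min(max(p, 1e-6), 1.0 - 1e-6)       # avoid 0/1 saturation
        odds *= (p / (1.0 - p)) / prior_odds    # likelihood-ratio update
    return odds / (1.0 + odds)

# Radar is fairly confident, the camera is unsure, ultrasonic agrees:
sensor_probs = {"radar": 0.90, "camera": 0.55, "ultrasonic": 0.80}
print(f"fused: {fuse_detections(sensor_probs.values()):.3f}")  # ~0.978

Under these assumed readings the fused probability exceeds that of any single sensor, while a lone dissenting sensor would pull the posterior down instead of triggering a false alarm on its own.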
III. SENSORS CONSTRAINTS AND SPECIFICATIONS

In this section, we discuss the constraints, specifications, and features of the sensors introduced above, together with their corresponding formulas. The key features of each sensor include detection range, FOV, processing time, and accuracy. Fig. 2 shows the detection range and FOV of the sensors.


Figure 2. Sensors FOV and detection range.

A. Radar

Radar sensors can measure the distance and speed of an object directly, but they do not provide any information about the shape of objects. Table 2 shows typical values of radar sensor parameters.

Table 2. Typical values of radar sensor parameters.

Parameter                    Short-Range Radar   Mid-Range Radar   Long-Range Radar
Model example                TI AWR1642          Bosch MRR         Bosch LRR
Azimuth                      60°                 90°               20°
Elevation                    10°                 10°               5°
Horizontal resolution        15°                 7°                5°
Maximum detectable speed     90 kph              80 kph            270 kph
Speed measurement accuracy   -                   0.5 kph           0.432 kph
Detection range              80 m                160 m             250 m
Refresh rate                 -                   60 ms             80 ms

In the BSD application, radar sensors are activated at speeds above 10 kph. The significant advantage of radar sensors is their ability to measure the radial speed of objects. Besides, radar waves can penetrate through objects, which can be a useful attribute for pedestrian detection or when turning through a road curvature. Radar waves attenuate in snowy and rainy weather; hence, their performance degrades in such weather conditions. Moreover, radars have lower accuracy due to clutter and noise. Single-input multiple-output (SIMO) radars require more receive antennas to increase the angular resolution, while in multiple-input multiple-output (MIMO) radars the same angular resolution can be achieved by adding fewer transmit and receive antennas.

Radars are divided into three categories: pulse Doppler (PD), unmodulated continuous wave (UCW), and frequency modulated continuous wave (FMCW). FMCW radars are commonly used in vehicular applications. By examining the chirp signal used in FMCW radars, we can calculate the speed and range of objects. We can formulate a transmitted up-chirp signal as

s(t) = \cos(2\pi f_c t + \pi \alpha t^2)    (1)

where \alpha is the chirpiness of the chirp signal and f_c is the carrier frequency. If we assume that the chirp is linear, then the chirpiness is

\alpha = \frac{df}{dt} = \frac{B}{T}    (2)

where B is the sweep bandwidth and T is the chirp duration. We should expect the received signal to be attenuated and delayed during propagation; therefore, it can be formulated as

r(t) = A \cos\big(2\pi f_c (t - \tau) + \pi \alpha (t - \tau)^2\big)    (3)

where \tau is the propagation delay and A is the attenuated amplitude. If we assume that the distance of the object from the radar sensor is R, then we can calculate the propagation delay:

c\tau = 2R \;\rightarrow\; \tau = \frac{2R}{c}    (4)

where c is the speed of light. The beat frequency is defined as the difference between the frequencies of the transmitted and received signals at a particular time, as shown in fig. 3. The beat frequency can be expressed as

f_b = \alpha \tau = \frac{2\alpha R}{c}    (5)

Figure 3. The beat frequency [2].

According to fig. 3, we can write the following proportionality:

\frac{f_b}{B} = \frac{\tau}{T} \;\rightarrow\; R = \frac{cT f_b}{2B}    (6)


So we can calculate the distance of the object from the sensor. If we take the Doppler effect into account, the Doppler frequency shift can be expressed as

f_d = \frac{2 v_r}{\lambda}    (7)

where v_r is the radial velocity of the detected object and \lambda is the carrier wavelength. The Doppler frequency shift divides the beat frequency into two components, which can be expressed as

f_{b1} = f_b - f_d, \qquad f_{b2} = f_b + f_d    (8)

By replacing f_d using equation (7), we can calculate the radial speed and the range of the object:

v_r = \frac{\lambda}{4}(f_{b2} - f_{b1}), \qquad R = \frac{cT}{4B}(f_{b1} + f_{b2})    (9)
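As a worked example of equations (2)-(9), the following Python snippet recovers the range and radial speed of a synthetic target from its up- and down-sweep beat frequencies; the chirp parameters are illustrative assumptions, not values from the paper.

# Worked example of equations (2)-(9): recovering range and radial speed
# from the up- and down-sweep beat frequencies of an FMCW radar.
# The chirp parameters below are illustrative, not taken from the paper.

C = 3.0e8            # speed of light (m/s)
F_C = 77e9           # carrier frequency (Hz), a typical automotive band
B = 300e6            # sweep bandwidth (Hz)
T = 50e-6            # chirp duration (s)
ALPHA = B / T        # chirpiness, equation (2)
LAM = C / F_C        # carrier wavelength (m)

def range_and_speed(f_b1, f_b2):
    """Equation (9): f_b1 = f_b - f_d and f_b2 = f_b + f_d."""
    v_r = (LAM / 4.0) * (f_b2 - f_b1)        # radial speed (m/s)
    r = (C * T / (4.0 * B)) * (f_b1 + f_b2)  # range (m), via equation (6)
    return r, v_r

# Forward model for a target 40 m away, approaching at 10 m/s:
tau = 2 * 40.0 / C            # propagation delay, equation (4)
f_b = ALPHA * tau             # beat frequency, equation (5)
f_d = 2 * 10.0 / LAM          # Doppler shift, equation (7)
r_est, v_est = range_and_speed(f_b - f_d, f_b + f_d)
print(f"range = {r_est:.1f} m, radial speed = {v_est:.1f} m/s")  # 40.0 m, 10.0 m/s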
B. Ultrasonic

Ultrasonic sensors are cheap and have good accuracy in distance measurement. The most important property of these sensors is their ability to work in any weather condition. However, ultrasonic sensors suffer from poor angular resolution, a low sampling rate due to the low wave speed, and a short detection range. Table 3 shows typical values of ultrasonic sensor parameters.

Table 3. Typical values of ultrasonic sensor parameters.

Parameter             Value
Detection range       0.3…6 m
Resolution            0.01 m
Dissemination angle   90° (azimuth)
Detection interval    typ. < 100 ms

The dissemination angle of ultrasonic sensors is inversely affected by the wave frequency. Ultrasonic sensors can detect vehicles at speeds of up to 40 kph. The strength of ultrasonic waves attenuates as they travel through the air due to diffusion, diffraction, and absorption losses. The distance of an object from the sensor can be calculated as

d = \frac{v_s \cdot t}{2}    (10)

where v_s is the ultrasonic wave speed, which depends on environmental factors such as temperature; this correlation can be formulated as

v_s = 331.5 + 0.61\,T_c    (11)

where T_c is the air temperature in degrees Celsius.
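The following short Python sketch applies equations (10) and (11) to a round-trip echo time; the echo time and temperature are illustrative values.

# Sketch of equations (10) and (11): ultrasonic time-of-flight ranging
# with a temperature-corrected speed of sound. The numbers are illustrative.

def speed_of_sound(temp_c):
    """Equation (11): approximate speed of sound in air (m/s) at temp_c degC."""
    return 331.5 + 0.61 * temp_c

def ultrasonic_distance(echo_time_s, temp_c=20.0):
    """Equation (10): the pulse travels to the object and back, hence the /2."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0

# A 20 ms round trip at 20 degC is roughly 3.44 m, well inside the
# 0.3...6 m detection range of Table 3.
print(f"{ultrasonic_distance(0.020):.2f} m")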
C. Lidar

Lidar sensors can provide information about the distance of objects with high accuracy, as well as their size and shape. The noticeable disadvantages of these sensors are their vast amount of data and their refresh rate, which can have a profound effect on the overall response time of the BSD system. Table 4 shows typical values of Lidar sensor parameters.

Table 4. Typical values of Lidar sensor parameters.

Parameter               2D solid-state   2D rotary     3D rotary
Model example           Leddartech Vu8   Sick LMS1xx   Velodyne VLP-16
Azimuth                 100°             270°          360°
Elevation               0.3°             0.15°         30°
Horizontal resolution   2.5°             0.25°         0.1°
Vertical resolution     N/A              N/A           2°
Detection range         215 m            50 m          100 m
Refresh rate            10 ms            50 ms         100 ms

The distance of an object from the Lidar sensor can be calculated as

d = \frac{c \cdot t}{2}    (12)

where c is the speed of light and t is the round-trip time of the laser pulse.

D. Camera

Cameras are useful sensors for classification and can measure an object's distance and angle. Nonetheless, cameras pose challenges in the detection of vehicles at nighttime or in adverse weather conditions. Furthermore, due to their limited FOV and intrinsic properties, we cannot detect objects in some cases, such as exiting a parking lot. There are two types of vision systems:

- Monovision: monovision can indicate the presence of vehicles. Since it has less accuracy in distance measurement, it is more suitable for lane detection than for BSD [7].

- Stereo vision: the depth of objects can be determined from the images provided by stereo vision [7]. We can also extract features such as edges, corners, shadows, and bright objects from stereo vision images [7].

The features can be extracted using the histogram of oriented gradients (HOG), color histograms, and deep learning methods; a minimal HOG sketch is given below. Cameras can detect vehicles at speeds of up to 10 kph when passing and 70 kph when being passed [3]. Table 5 shows typical values of some camera parameters.

Table 5. Typical values of camera parameters.

Parameter                 Value
Maximum detection range   251 m
Maximum FOV               120°
Detection interval        typ. < 60 ms
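The snippet below is a minimal sketch of HOG feature extraction, one of the feature types named above, using scikit-image; the random array merely stands in for a grayscale camera patch of the blind-spot region.

# Minimal sketch of HOG feature extraction using scikit-image; the random
# array stands in for a grayscale camera patch of the blind-spot region.

import numpy as np
from skimage.feature import hog

patch = np.random.rand(64, 128)   # placeholder 64x128 grayscale patch

features = hog(
    patch,
    orientations=9,               # gradient-direction bins
    pixels_per_cell=(8, 8),
    cells_per_block=(2, 2),
)
# The resulting descriptor can feed a classifier (e.g. a linear SVM) or be
# fused with radar and ultrasonic evidence at the detection level.
print(features.shape)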


Figure 4. General comparison between the sensors: maximum detectable speed (kph), refresh rate (ms), maximum detection range (m), azimuth (degrees), elevation (degrees), and azimuth resolution (degrees).

IV. PROPOSED SENSING STRUCTURE FOR BSD

In the BSD system, the main goal is to detect objects in the vehicle's blind spot and determine their corresponding speed. The BSD system is required to work properly in any weather condition and driving scenario with good accuracy and low processing time. The most important factors in BSD are the following:

- System response time, which should be less than 300 ms according to ISO 17387.

- Detection range, which must extend at least 3 meters from the sides of the vehicle according to ISO 17387.

- Maximum detectable speed of the vehicles, which should be at least 36 kph according to ISO 17387.

- Proper performance in all driving conditions.

- 360-degree coverage of the vehicle's surroundings.

Considering the factors above, the best types of sensors for BSD can be determined by comparing their properties. Lidar sensors are too expensive and provide a massive amount of data which is not necessary for the BSD application. Therefore, we narrow down our choices to camera, radar, and ultrasonic sensors. According to fig. 4, the combination of these three sensors will satisfy the factors of the BSD system. The number of sensors should be chosen so that the sensing structure covers 360 degrees of the vehicle's surroundings. Considering the sensors' FOV, as shown in fig. 2 and fig. 4, the optimal number of sensors can be determined. The proposed sensing structure for the BSD system is shown in fig. 5.

Figure 5. Proposed sensing structure. Radar sensors, cameras, and ultrasonic sensors are shown in purple, green, and orange, respectively.

Ultrasonic sensors will detect approaching vehicles at low speed in any weather condition. The method of vehicle detection with distance data from ultrasonic sensors is described in [10]. Short-range and long-range radars will provide information about the velocity and distance of objects with high resolution. Clustering is an important step in processing radar detections because of the radar sensors' high resolution [5]. Methods such as k-means and hierarchical clustering are sensitive to noise and outliers; density-based spatial clustering of applications with noise (DBSCAN), described in [11], could be a suitable clustering method for radar detections, as sketched below. A general algorithm for vehicle detection using radar information is presented in [5].
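The following Python sketch groups radar detections with DBSCAN in the spirit of [11]; the point coordinates and the eps/min_samples values are illustrative assumptions.

# Sketch of DBSCAN clustering of radar detections, in the spirit of [11];
# the point coordinates and the eps/min_samples values are illustrative.

import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic detections (x, y) in metres: two vehicles plus one clutter point.
detections = np.array([
    [2.1, -3.0], [2.3, -3.2], [2.2, -2.8],   # vehicle 1
    [8.0,  1.0], [8.3,  1.2], [8.1,  0.9],   # vehicle 2
    [15.0, -7.0],                            # isolated clutter
])

labels = DBSCAN(eps=0.8, min_samples=2).fit_predict(detections)
print(labels)  # e.g. [ 0  0  0  1  1  1 -1]; label -1 marks rejected noise

Because DBSCAN labels isolated points as noise rather than forcing them into a cluster, spurious clutter returns are rejected instead of producing false targets.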
Cameras can extract features of the objects, which can be fused with the ultrasonic and radar sensor data to classify the detected objects and increase the detection accuracy. Since processing time is one of the crucial factors in BSD, a combination of depthwise separable convolution, residual learning, and squeeze-and-excitation could be used for vehicle detection with cameras [12].
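As an illustration of two of those ingredients, the following PyTorch sketch builds a depthwise separable convolution block with a residual connection; the channel and input sizes are our assumptions, and this is not the exact architecture of [12].

# Illustrative PyTorch sketch of a depthwise separable convolution with a
# residual connection, two of the ingredients combined in [12] for a
# lightweight camera-based detector. Channel and input sizes are assumptions.

import torch
import torch.nn as nn

class DSConvBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Depthwise: one 3x3 filter per channel (groups=channels).
        self.depthwise = nn.Conv2d(channels, channels, kernel_size=3,
                                   padding=1, groups=channels, bias=False)
        # Pointwise: 1x1 convolution mixes the channels.
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.act(self.bn(self.pointwise(self.depthwise(x))))
        return out + x   # residual connection

block = DSConvBlock(32)
feature_map = torch.randn(1, 32, 64, 128)   # a batch of camera feature maps
print(block(feature_map).shape)             # torch.Size([1, 32, 64, 128])

Splitting the 3x3 convolution into depthwise and pointwise stages cuts the multiply-accumulate count roughly by the kernel area, which is what keeps the per-frame processing time low.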
The sensors of the proposed structure provide redundant information, which prevents system failure when one or more of the sensors fail. Moreover, since the proposed sensing structure consists of multiple types of sensors, it provides complementary information which cannot be obtained in single-sensor BSD systems; a simple way to exploit the redundant readings is sketched below.
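The following sketch fuses redundant range readings from sensors with overlapping FOVs using inverse-variance weighting; this is one simple option of ours, not a method prescribed by the paper, and the variances are illustrative.

# Sketch of fusing redundant range readings from sensors with overlapping
# FOVs; inverse-variance weighting is one simple option, not a method
# prescribed by the paper. The variances below are illustrative.

def fuse_ranges(measurements):
    """measurements: iterable of (range_m, variance_m2), one per sensor."""
    weighted = [(1.0 / var, r) for r, var in measurements]
    total_w = sum(w for w, _ in weighted)
    fused = sum(w * r for w, r in weighted) / total_w
    return fused, 1.0 / total_w   # fused estimate and its variance

# Radar, ultrasonic, and camera estimates of the same target's range:
readings = [(4.2, 0.10), (4.0, 0.05), (4.6, 0.40)]
r, var = fuse_ranges(readings)
print(f"fused range: {r:.2f} m (variance {var:.3f} m^2)")  # ~4.11 m

The fused variance is smaller than that of the best single sensor, and the estimate degrades gracefully if one sensor drops out: the remaining readings are simply re-weighted.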


The proposed sensing structure is suitable for ADAS applications rather than AVs. The performance of the proposed BSD system is yet to be analyzed on highways and in urban environments. In our future work, we will implement the proposed sensing structure and analyze its performance using sensor measurements and data fusion methods.

Figure 6. Mercedes Benz BSD system [13].

One of the Mercedes Benz BSD systems is shown in fig. 6. Since this sensing structure includes only radar sensors, it lacks redundant and complementary information, which reduces the system's reliability under severe conditions. Moreover, data fusion techniques are not used in single-sensor BSD systems, which is why they suffer from sensor constraints and inaccurate measurements. Every single-sensor BSD system has these problems, the Mercedes Benz BSD system being one example.

V. CONCLUSION AND FUTURE WORK

We reviewed previous studies regarding BSD and discussed their problems and challenges. In this paper, a sensing structure is proposed for the BSD system considering the important factors, various driving conditions, and ISO standards. Multiple types of sensors have been used in this structure, which can help to increase detection accuracy, reduce the system's response time, and provide more comprehensive data. The proposed sensing structure includes multi-sensor fusion, which makes it distinct from single-sensor BSD systems that suffer from sensor imperfections.

REFERENCES

[1] B.-F. Wu, C.-C. Kao, Y.-F. Li, and M.-Y. Tsai, "A Real-Time Embedded Blind Spot Safety Assistance System," International Journal of Vehicular Technology, vol. 2012, January 2012.
[2] R. Okorn et al., "Upward-looking L-band FMCW radar for snow cover monitoring," Cold Regions Science and Technology, vol. 103, pp. 31-40, July 2014.
[3] G. Forkenbrock, R. L. Hoover, E. Gerdus, T. R. Van Buskirk, and M. Heitz, "Blind spot monitoring in light vehicles: System performance," Report No. DOT HS 812 045, National Highway Traffic Safety Administration, Washington, DC, July 2014.
[4] J. Beltran, C. Guindel, F. M. Moreno, D. Cruzado, F. Garcia, and A. de la Escalera, "BirdNet: a 3D Object Detection Framework from Lidar Information," arXiv, May 2018.
[5] E. Schubert, F. Meinl, M. Kunert, and W. Menzel, "Clustering of high-resolution automotive radar detections and subsequent feature extraction for classification of road users," in 16th International Radar Symposium (IRS), Dresden, pp. 174-179, 2015.
[6] R. O. Chavez-Garcia and O. Aycard, "Multiple Sensor Fusion and Classification for Moving Object Detection and Tracking," IEEE Transactions on Intelligent Transportation Systems, vol. PP, no. 99, pp. 1-10, 2015.
[7] A. Amditis, N. Floudas, A. Polychronopoulos, D. Bank, B. Broek, and F. Oechsle, "Sensor data fusion for lateral safe applications," in Proceedings of the 13th World Congress and Exhibition on Intelligent Transport Systems and Services, London, UK, October 2006.
[8] P. Pickering, "Radar and Ultrasonic Sensors Strengthen ADAS Object Detection," 2017. [Online]. Available: [Link]sensors-strengthen-adas-object-detection.
[9] K. Schueler, T. Weiherer, E. Bouzouraa, and U. Hofmann, "360 degree multi sensor fusion for static and dynamic obstacles," in IEEE Intelligent Vehicles Symposium (IV), June 2012, pp. 692-697.
[10] Y. Jo and I. Jung, "Analysis of vehicle detection with WSN-based ultrasonic sensors," Sensors, vol. 14, no. 8, pp. 14050-14069, August 2014, doi:10.3390/s140814050.
[11] M. Ester, H.-P. Kriegel, J. Sander, and X. Xu, "A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise," in Proc. 2nd Int. Conf. on Knowledge Discovery and Data Mining, Portland, OR, AAAI Press, 1996, pp. 226-231.
[12] Y. Zhao, L. Bai, Y. Lyu, and X. Huang, "Camera-Based Blind Spot Detection with a General Purpose Lightweight Neural Network," Electronics, vol. 8, 233, 2019.
[13] "Blind Spot Assist Vehicle Safety Technology: Mercedes-Benz 2013 ML-Class," YouTube: Mercedes-Benz USA, 2012.
