Jianpingwu 2009
Jianping Wu, Zhaobin Liu, Jinxiang Li, Caidong Gu, Maoxin Si, Fangyong Tan
JiangSu Province Support Software Engineering R&D Center for Modern Information Technology Application in Enterprise, Suzhou, China, 215104
Suzhou Vocational University, Suzhou, China, 215104
Abstract—This paper presents a new method based on digital image processing to realize real-time automatic vehicle speed monitoring using a video camera. Based on geometric optics, it first presents a simplified method to accurately map coordinates in the image domain into the real-world domain. The second part focuses on vehicle detection in the digital image frames of the video stream. Experiments show that the method requires only a single digital video camera and an onboard computer and can simultaneously monitor vehicle speeds in multiple lanes. The detected vehicle speed's average error is below 4%.

Index Terms—Digital Image Processing, vehicle speed detection, computer vision

I. INTRODUCTION

Vehicle speed monitoring is very important for enforcing speed limit laws. It also reflects the traffic conditions of the monitored section of the road of interest.

Traditionally, vehicle speed monitoring or detection is realized using radar technology, specifically the radar gun and radar detector. Radar is an electromagnetic pulse generated by a transmitter, which sends a radio-frequency pulse down the highway to reflect off a moving vehicle. The reflection induces a very slight frequency shift, called the Doppler shift, which can be analyzed to determine the true speed of a moving vehicle. Radar has been used to monitor vehicle speed since the end of World War II and is now the only tool widely used by police to detect vehicle speed. But the use of radar in speed detection has its limits. The cosine error arises when the radar gun is not aimed along the direct path of the oncoming traffic: when the radar gun is located at the side of the road or above the road, the cosine error becomes a significant factor affecting its accuracy. For example, a 15° deviation from the direct path causes the speed reading to be 3% lower than the real speed, and a 30° deviation causes a 13% error in the speed reading. In addition, shadowing (radar wave reflection from two different vehicles with different heights) and radio-frequency interference (error caused by similar RF bands in the environment) are two other important factors that cause errors in radar speed detection. Because of these errors, local police in the United States usually do not ticket a vehicle detected at 8 km/h or less over the speed limit. Another disadvantage is that a radar sensor can only track one vehicle at a time.

In this paper, we present a new algorithm that takes advantage of digital image processing and camera optics to automatically and accurately detect vehicle speed in real time. The algorithm requires only a single video camera and an onboard processing computer to operate, and it can simultaneously detect vehicle speeds in multiple lanes with high accuracy (less than 4% error) in real time. The algorithm only requires that the camera is set up directly above the target road section (at least 5 meters above the road to assure satisfactory accuracy) with its optical axis tilted a certain angle downward from the highway forward direction. The calibration is very simple and is done directly on the video frames, based on the position of an easy-to-obtain vanishing point and a vehicle of known length and width together with its upper-edge and lower-edge positions in a sample image. The calibration does not require any other information about the camera, such as the focal length. The only camera specification that is needed is its frame rate (frames per second).

II. THE MAPPING OF COORDINATES FROM IMAGE DOMAIN TO REAL-WORLD DOMAIN

A. The Formula for Coordinates Mapping between Image Domain and Real-world Domain

The vehicles' positions in video images must be transformed into their real-world coordinates. In general this is a 2D-to-3D mapping and is impossible without additional constraints. However, because vehicles cannot leave the road surface plane, their motion is essentially two-dimensional, which makes the transformation of the image coordinates of vehicle positions into real-world coordinates a 2D-to-2D mapping that can be formulated very accurately. The following sections focus on the calculation of the mapping function between a vehicle's image coordinates and its real-world coordinates.

First, let us look at how the video camera is usually set up when video images of road traffic are taken. As shown in Figure 1, the camera is placed at a height H above the road with its optical axis tilted at an angle θ from the road's forward direction. We can deduce the mapping function between the image domain and the real-world domain based on geometrical optics.
[Figure 1: video camera mounted at height H above the road with its optical axis tilted by θ; edge marks 1 and 2 lie on the road surface]

We assume that the distance between an object and the lens plane is z and that the distance between the object and the optical axis of the lens is y. We also assume that the object's image has height v and that its distance to the lens plane is w. The lens imaging law and magnification then give:

$$\frac{1}{f} = \frac{1}{z} + \frac{1}{w}, \qquad \frac{v}{y} = \frac{w}{z} \qquad (1)$$

A point on the road surface with real-world coordinates (x, 0, z) has camera-frame coordinates (x', y', z') given by:

$$x' = x, \qquad z' = z\cos\theta + H\sin\theta, \qquad y' = z\sin\theta - H\cos\theta \qquad (2)$$

where H is the height of the camera above the road surface and θ is the angle between the z and z' axes. Combined with Equation (1), we have:
$$u = \frac{w\,x'}{z'} = \frac{w\,x}{z\cos\theta + H\sin\theta}, \qquad v = \frac{w\,y'}{z'} = \frac{w\,(z\sin\theta - H\cos\theta)}{z\cos\theta + H\sin\theta} \qquad (3)$$

where w is the distance between the CCD plane and the lens plane, which is a constant after the camera is set up.

It is noted that when z → ∞, (u, v) converges to a single point, called the vanishing point, which corresponds to the infinitely distant point on the road surface. We use (Uv, Vv) to represent the vanishing point:

$$U_v = \lim_{z\to\infty} \frac{w\,x}{z\cos\theta + H\sin\theta} = 0, \qquad V_v = \lim_{z\to\infty} \frac{w\,(z\sin\theta - H\cos\theta)}{z\cos\theta + H\sin\theta} = w\tan\theta \qquad (4)$$
Equation (4) shows that after the camera is set up, the vanishing point is fixed, as shown in Fig. 3. Its vertical coordinate is proportional to tan θ. We can use this property of the vanishing point to calibrate the mapping between the image coordinate system and the real-world coordinate system.
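To make Equations (2) through (4) concrete, the short sketch below (our illustration, not part of the paper; the values of H, θ and w are arbitrary assumptions) projects road-surface points into the image and shows (u, v) approaching the vanishing point (0, w tan θ) as z grows.

```python
import math

# Assumed camera setup (arbitrary illustrative values, not from the paper)
H = 6.0                    # camera height above the road surface, metres
theta = math.radians(12)   # tilt angle between the optical axis and the road forward direction
w = 1000.0                 # CCD-plane-to-lens distance in pixel units (constant after setup)

def project(x, z):
    """Map a road-surface point (x, 0, z) to image coordinates (u, v), per Eqs. (2)-(3)."""
    z_cam = z * math.cos(theta) + H * math.sin(theta)   # z'
    y_cam = z * math.sin(theta) - H * math.cos(theta)   # y'
    return w * x / z_cam, w * y_cam / z_cam

V_v = w * math.tan(theta)   # vanishing-point vertical coordinate, Eq. (4)

for z in (20.0, 50.0, 200.0, 5000.0):
    u, v = project(2.0, z)  # a point 2 m to the side of the camera axis
    print(f"z = {z:7.1f} m  ->  u = {u:7.2f}, v = {v:7.2f}   (V_v = {V_v:.2f})")
```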
In an image the vanishing point is very easy to find by taking the intersection point of two edge mark lines, as shown in Fig. 3. We can use this information to calibrate the mapping function between image coordinates and real-world coordinates.
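For example, if each edge mark line is known through two image points (picked by hand or produced by a lane-marking detector), the vanishing point follows from a plain line intersection. The helper below is only an illustrative sketch with invented pixel coordinates, not the paper's implementation.

```python
def line_through(p, q):
    """Coefficients (a, b, c) of the line a*u + b*v + c = 0 through image points p and q."""
    (u1, v1), (u2, v2) = p, q
    return v1 - v2, u2 - u1, u1 * v2 - u2 * v1

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c); assumes they are not parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    d = a1 * b2 - a2 * b1
    return (b1 * c2 - b2 * c1) / d, (a2 * c1 - a1 * c2) / d

# Hypothetical image points on edge mark 1 and edge mark 2 (pixels)
edge1 = line_through((120, 680), (300, 260))
edge2 = line_through((860, 690), (560, 262))
U_v, V_v = intersect(edge1, edge2)
print(f"vanishing point: ({U_v:.1f}, {V_v:.1f})")
```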
We assume a point on the road surface has real-world coordinates (x, 0, z) and corresponding image-domain coordinates (u, v). Then, based on Equations (3) and (4), we have:

$$\frac{v\tan\theta}{V_v} = \frac{y'}{z'} = \frac{z\sin\theta - H\cos\theta}{z\cos\theta + H\sin\theta} \qquad (5)$$
Equation (5) can be rearranged into Equation (6):

$$z = \frac{2 H V_v}{\sin 2\theta\,(V_v - v)} - H\tan\theta \qquad (6)$$

where H is the height of the video camera above the road surface and θ is the angle between the optical axis of the camera and the road surface. Because both H and θ are constants after the camera is set up, Equation (6) can be simplified as:

$$z = \frac{C_1}{V_v - v} + C_2 \qquad (7)$$

where C_1 = \frac{2 H V_v}{\sin 2\theta} and C_2 = -H\tan\theta are two constants related to the camera setup.
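For completeness, the algebra that takes Equation (5) to Equations (6) and (7), which the text does not spell out, is:

$$v\tan\theta\,(z\cos\theta + H\sin\theta) = V_v\,(z\sin\theta - H\cos\theta)$$

$$z = \frac{H\,(V_v\cos\theta + v\tan\theta\sin\theta)}{V_v\sin\theta - v\tan\theta\cos\theta} = \frac{H\,(V_v\cos^2\theta + v\sin^2\theta)}{(V_v - v)\sin\theta\cos\theta} = \frac{2 H V_v}{\sin 2\theta\,(V_v - v)} - H\tan\theta$$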
Because C2 is only a translation constant in the road forward direction, we can eliminate it by moving the origin of our real-world coordinates by C2, and Equation (7) can be further simplified to:

$$z = \frac{C_1}{V_v - v} \qquad (8)$$
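Equation (8) is what ultimately enables speed estimation: if the image row v of a tracked point on a vehicle (for instance its lower edge) is read in two consecutive frames, the change in z multiplied by the camera's frame rate gives the speed, the frame rate being the only camera specification mentioned in the Introduction. The sketch below is a hypothetical illustration with invented values of C1, Vv and the frame rate, not the authors' implementation.

```python
def v_to_z(v, C1, V_v):
    """Image row v -> longitudinal road coordinate z, per Equation (8)."""
    return C1 / (V_v - v)

# Assumed calibration constants and frame rate (illustrative values only)
C1, V_v, fps = 12000.0, 420.0, 25.0

v_prev, v_curr = 100.0, 108.0            # tracked lower-edge row in two consecutive frames
dz = v_to_z(v_curr, C1, V_v) - v_to_z(v_prev, C1, V_v)
speed_kmh = abs(dz) * fps * 3.6          # metres per frame -> m/s -> km/h
print(f"estimated speed: {speed_kmh:.1f} km/h")   # about 86.5 km/h for these numbers
```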
Similarly, we can also derive x from Equations (3) and (4):

$$x = \frac{(z\cos\theta + H\sin\theta)\,u\tan\theta}{V_v} = H\sec\theta\,\frac{u}{V_v - v} = C_3\,\frac{u}{V_v - v} \qquad (9)$$

The mapping function between the real-world coordinates and the image coordinates can then be written as:

$$x = \frac{C_3\,u}{V_v - v}, \qquad z = \frac{C_1}{V_v - v} \qquad (10)$$

where (x, z) are the real-world coordinates of a point on the road surface plane, with z along the road's forward direction and x along the road's transverse direction, and (u, v) are the image-domain coordinates of the same point, with u along the horizontal dimension and v along the vertical dimension. Vv is the vertical coordinate of the road-surface vanishing point. As shown in Figure 3, we can use the vanishing point and a vehicle of known size (length and width) to calibrate Vv, C1 and C3.

Figure 3 The vanishing point and calibration
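As a worked illustration of Equation (10) and of the calibration idea just mentioned (the vanishing point plus a vehicle of known length and width), the sketch below estimates C1 from the length of a sample vehicle's road footprint, C3 from its width, and then maps an image point to road coordinates. This is a hedged reading consistent with the Introduction's description, not the authors' code; their actual procedure is the one given in Section II.B. All sample numbers are invented, and image coordinates are taken relative to the principal point so that the vanishing point has Uv = 0, as in Equation (4).

```python
def calibrate_C1(L, v_near, v_far, V_v):
    """Estimate C1 (Eq. 8) from a vehicle of known length L whose road footprint
    spans image rows v_near (closer end) to v_far (farther end, nearer to V_v)."""
    return L / (1.0 / (V_v - v_far) - 1.0 / (V_v - v_near))

def calibrate_C3(W, u_left, u_right, v, V_v):
    """Estimate C3 (Eq. 10) from the known width W seen between columns
    u_left and u_right at the same image row v."""
    return W * (V_v - v) / (u_right - u_left)

def image_to_road(u, v, C1, C3, V_v):
    """Map an image point (u, v) to road-plane coordinates (x, z), per Equation (10)."""
    return C3 * u / (V_v - v), C1 / (V_v - v)

# Invented sample measurements (pixels and metres), for illustration only
V_v = 420.0                                  # vanishing-point row from the edge-mark intersection
L, W = 4.5, 1.8                              # known sample-vehicle length and width, metres
C1 = calibrate_C1(L, v_near=95.0, v_far=110.0, V_v=V_v)
C3 = calibrate_C3(W, u_left=-24.0, u_right=25.0, v=95.0, V_v=V_v)

x, z = image_to_road(10.0, 102.0, C1, C3, V_v)
print(f"C1 = {C1:.1f}, C3 = {C3:.2f}, mapped point: x = {x:.2f} m, z = {z:.1f} m")
```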
B. The Calibration of the Coordinates Mapping

The calibration is carried out as follows. As shown in Figure 3, we first obtain the position of the vanishing point (Uv, Vv) of the road surface from the intersection of two edge mark lines. Then we use a car of known length L and width W in the image