Texture-Based Fruit Detection Via Images Using The Smooth Patterns On The Fruit
I. INTRODUCTION
Growers of fruit crops currently have access to very limited information about the state of the crop in their fields. Automated image analysis systems for fruit crops provide growers with high-resolution spatial data on their crop yield. This data enables farmers to switch from the old, inefficient, one-size-fits-all farming paradigm to a more cost-effective, site-specific field management system, boosting crop quality and conserving resources on the farm. Such automated imaging systems can also be integrated into fully robotic harvesting or thinning systems.
Currently there are no systems available for growers to measure crop yield at high resolution during the growing season. Crop yield is a desirable attribute to monitor and manage. The current practice is to estimate yield at harvest time and to record the data for each growing season. However, yield can vary substantially from year to year, so harvest records provide only an extremely coarse approximation of yield. To obtain accurate, dense measures of crop yield, the crop needs to be measured continuously during the growing season. The obvious solution would be to exhaustively monitor the fields by hand.
*This work was supported by the United States Department of Agriculture (Award number: 2012-67021-19958) and the National Grape and Wine Initiative ([email protected]).
1 Zania S Pothen, Carnegie Mellon University, 5000 Forbes Avenue,
Pittsburgh, PA 15213 [email protected]
2 Stephen Nuske, Carnegie Mellon University, 5000 Forbes Avenue,
Pittsburgh, PA 15213 [email protected]
Fig. 1: (b) grape, (c) apple.
While this approach might work well for small fields, it becomes economically intractable for larger fields owing to the labour-intensive nature of the work. Additionally, manual counting is performed only just before harvest.
Over the past few years, our research group has focused on developing a vision-based system for automatic fruit detection and high-resolution yield estimation. Our current system is deployed on a vehicle operating at high velocities (>1.5 m/s), and captures images using a custom hardware configuration and high-powered flash lighting. The images are then processed to produce crop-yield statistics.
The broad steps employed in predicting yield automatically and non-destructively are:
1) Collect images of the fruit-wall in each row of the field using custom hardware (Figure 1a)
2) Detect and count fruit in individual images
3) Associate fruit locations in the image with physical locations in the real world
4) Generate high-resolution yield estimates using the registered fruit information and the vehicle state information
The details of the system and approach are described in Nuske et al. [12], [14]; a rough sketch of the image-to-field association step (step 3) is given below.
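As a minimal sketch of that association step, assuming a simple pinhole camera model and a vehicle that travels along the row, the Python snippet below back-projects a detection from pixel coordinates to a position along the fruit wall. The intrinsics, depth and variable names are illustrative placeholders, not the calibration used in our system.

# Minimal sketch (illustrative assumptions, not the actual system calibration):
# map a fruit detection from pixel coordinates to a position along the row.
import numpy as np

def image_to_row_position(u, v, depth, fx, fy, cx, cy, vehicle_x):
    """Back-project pixel (u, v) at a known depth with a pinhole model,
    then offset by the vehicle's travelled distance along the row."""
    x_cam = (u - cx) * depth / fx   # lateral offset in the camera frame
    y_cam = (v - cy) * depth / fy   # vertical offset in the camera frame
    # assume the camera faces the fruit wall and the vehicle moves along x
    return np.array([vehicle_x + x_cam, y_cam, depth])

# example: a detection at pixel (640, 360), 1.5 m from the fruit wall,
# while the vehicle is 12.0 m along the row
p = image_to_row_position(640, 360, 1.5, fx=1400, fy=1400, cx=536, cy=356, vehicle_x=12.0)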
We are motivated to increase the accuracy of our system, and this paper focuses on novel image processing approaches that specifically leverage the unique smooth patterns on the surface of the fruit (Figure 1(b-d)). The smooth texture on the fruit's surface results in a distinct intensity profile and gradient orientation pattern. These patterns can be used to distinguish between fruit and background foliage. We present a novel keypoint detector, called the Angular Invariant Maximal Detector, for detecting smooth round fruit such as grapes and apples. It has the following novel attributes, which we see as the contributions of this work:
1) The Angular Invariant Maximal detector is scale-invariant
2) It is color-agnostic, operating in the most challenging situation of green immature fruit over a cluttered green leaf background
3) It is robust to partial occlusions
4) It detects a variety of round fruit, such as grapes and apples, with very high precision and with little need for manual parameter specification.
The rest of the paper is organized as follows: 1) a section on related work on fruit detection, 2) implementation details of the Maximal Orientation Detector, 3) a description of the datasets and the experimental setup, and 4) the results and conclusion.
II. RELATED WORK
Current approaches for detecting fruit in images are based on three different types of visual cues: color, shape, and texture.
Fruit detection methods based on color are useful only for segmenting fruit that are of a different color from the green leaf background. Examples have been shown for mangoes (Payne et al. [15]) and for apples and grapes (Dunn and Martin [5]).
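To illustrate this class of methods in the simplest terms (this is our own sketch, not any of the cited systems), a color-based segmenter can be reduced to a hue/saturation threshold; the HSV ranges below are arbitrary example values for red fruit.

# Illustrative color-based fruit segmenter (our sketch, not a cited method).
# Threshold values are arbitrary examples for red fruit in HSV space.
import cv2

def segment_red_fruit(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # red wraps around the hue axis, so combine two hue ranges
    low = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255))
    high = cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    mask = cv2.bitwise_or(low, high)
    # remove small speckle with a morphological opening
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

As the text notes, such a threshold fails precisely in the case targeted here: green, immature fruit against green foliage.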
Fig. 2: Synthetic images of round fruit: (a) the maximal intensity, $I_{max}$, formed at the center of the fruit surface, is seen as a bright white spot. From this maximal intensity point the intensity drops gradually away from the center towards the edges of the fruit. This results in (b) intensity bands of decreasing strength (each intensity band is depicted in a different color) and in (c) a distinct orientation pattern which varies over $[-180^{\circ}, 180^{\circ}]$, represented using a gradual variation in color tone across the surface of the fruit, and which can be segmented into angular segments $S_1$ to $S_8$.
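The idealized pattern of Figure 2 can be reproduced with a few lines of numpy; the sketch below is our illustration (not the authors' code) of a radially decreasing intensity profile whose gradient orientation sweeps the full $[-180^{\circ}, 180^{\circ}]$ range and can be binned into eight angular segments.

# Illustrative sketch (not the paper's code): synthesize the ideal round-fruit
# pattern of Figure 2 and recover its gradient-orientation segments.
import numpy as np

size, radius = 101, 45
y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
r = np.hypot(x, y)
# (a)/(b): intensity is maximal at the center and falls off smoothly to the edge
intensity = np.clip(1.0 - r / radius, 0.0, 1.0)

# (c): gradient orientation varies smoothly over [-180, 180] degrees around the center
gy, gx = np.gradient(intensity)
orientation = np.degrees(np.arctan2(gy, gx))              # values in [-180, 180]
# bin orientations into eight 45-degree angular segments S1..S8
segments = (np.floor((orientation + 180.0) / 45.0).astype(int).clip(0, 7)) + 1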
…maxima, where the intensity gradient $\nabla I(x_m, y_m, r, \theta)\big|_{r=0} = 0$. We then examine the region $I(x_m, y_m, r, \theta)$ around these seed points to determine whether they match the ideal patterns (Figure 2) found on the fruit surface. For this, the region around each seed point is divided into 8 sectors $\{S_1, \ldots, S_8\}$. Each sector is grown radially outward along three scan lines $\{l_0(r, \theta),\ l_1(r, \theta + \ldots$
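As the scan-line construction is only partially specified above, the following sketch captures just the general idea under our own assumptions: around each intensity-maximum seed point, the image is sampled radially in eight 45-degree sectors (three scan lines per sector), and the point is kept only if the intensity decreases away from it, within a tolerance, along every scan line. The scan-line spacing and tolerance are illustrative choices, not values from the paper.

# Sketch of the sector-based verification (our assumptions, not the exact
# published procedure): a seed point matches the round-fruit pattern only if
# intensity decreases radially in all eight angular sectors.
import numpy as np

def matches_round_fruit_pattern(image, xm, ym, radius, n_sectors=8,
                                n_scanlines=3, tolerance=2.0):
    """image: 2D grayscale array; (xm, ym): seed point at a local intensity maximum."""
    h, w = image.shape
    sector_width = 2.0 * np.pi / n_sectors
    for s in range(n_sectors):
        for k in range(n_scanlines):
            # spread the scan lines evenly across the sector (an assumption)
            theta = s * sector_width + (k + 0.5) * sector_width / n_scanlines
            profile = []
            for r in range(radius):
                x = int(round(xm + r * np.cos(theta)))
                y = int(round(ym + r * np.sin(theta)))
                if not (0 <= x < w and 0 <= y < h):
                    break
                profile.append(float(image[y, x]))
            # reject the seed if intensity rises noticeably along this scan line
            if any(b > a + tolerance for a, b in zip(profile, profile[1:])):
                return False
    return True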
C. Feature Descriptor
Once keypoints are found, we form feature descriptors of the visual appearance around each keypoint. Color-based …
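Purely as an illustration of one common way to summarize the appearance around a keypoint, and not necessarily the descriptor used in this work, the sketch below builds a gradient-orientation histogram over the keypoint's neighbourhood, in the spirit of SIFT [9]; the patch size and bin count are arbitrary example values.

# Illustrative only: a gradient-orientation histogram descriptor for the patch
# around a keypoint (in the spirit of SIFT [9]); not necessarily this paper's descriptor.
import numpy as np

def orientation_histogram_descriptor(image, x, y, patch_size=16, n_bins=8):
    half = patch_size // 2
    patch = image[y - half:y + half, x - half:x + half].astype(float)
    gy, gx = np.gradient(patch)
    magnitude = np.hypot(gx, gy)
    orientation = np.degrees(np.arctan2(gy, gx)) % 360.0
    # magnitude-weighted histogram of gradient orientations
    hist, _ = np.histogram(orientation, bins=n_bins, range=(0.0, 360.0),
                           weights=magnitude)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist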
(a) Apples
Dataset        | Location        | Fruit attributes | Camera      | Image resolution | Days to harvest
Granny Smith   | Rock Island, WA | Green            | Nikon D300s | 1072x712         | 14
Red Delicious  | Rock Island, WA | Red              | Nikon D300s | 1072x712         | 14
HoneyCrisp     | Bisleville, PA  | Green            | Nikon D300s | 1072x712         | 60

(b) Grapes
Dataset        | Location        | Fruit attributes | Camera                | Image resolution | Flash                      | Days to harvest
Scarlet Royal  | Delano, CA      | Green            | Pointgrey Grasshopper | 4288x2848        | AlienBees ABR800 ringflash | 90
Pinot-Noir     | Galt, CA        | Green            | Pointgrey Grasshopper | 4288x2848        | AlienBees ABR800 ringflash | 90
Merlot         | Paso Robles, CA | Green            | Pointgrey Grasshopper | 4288x2848        | AlienBees ABR800 ringflash | 90
Petite-Sirah   | Galt, CA        | Green            | Prosilica GE4000      | 2800x2200        | Xenon flashlamp (5-10 J)   | 90

TABLE I: Dataset details: fruit variety and fruit attributes, sensor details and field conditions.
Dataset        | Angular Invariant Maximal | Radial Symmetry    | Invariant Maximal
               | Recall | Precision        | Recall | Precision | Recall | Precision
Granny Smith   | 0.94   | 0.26             | 0.86   | 0.11      | 0.99   | 0.01
Honeycrisp     | 0.96   | 0.31             | 0.91   | 0.05      | 0.99   | 0.01
Red Delicious  | 0.84   | 0.32             | 0.91   | 0.06      | 0.99   | 0.01
Mean           | 0.91   | 0.30             | 0.89   | 0.07      | 0.99   | 0.01
Pinot-Noir     | 0.94   | 0.20             | 0.89   | 0.14      | 0.94   | 0.12
Petite-Sirah   | 0.94   | 0.10             | 0.92   | 0.06      | 0.98   | 0.08
Merlot         | 0.96   | 0.20             | 0.91   | 0.09      | 0.84   | 0.09
Scarlet-Royal  | 0.98   | 0.14             | 0.84   | 0.16      | 0.97   | 0.09
Mean           | 0.96   | 0.16             | 0.89   | 0.11      | 0.93   | 0.01

TABLE II: Comparison of keypoint detection performance of the Angular Invariant Maximal, Radial Symmetry and Maximal Detector.
Fig. 4: Comparison of the overall classification performance (F1 score) for each detector (Angular Invariant Maximal, Radial Symmetry and Maximal Detector) on the Honeycrisp, Scarlett-Royal, Merlot, Petite-Sirah and Pinot-Noir datasets.
[8] Hung, C., Underwood, J., Nieto, J., & Sukkarieh, S. (2013b). A feature learning based approach for automated fruit yield estimation. In 9th International Conference on Field and Service Robotics (FSR).
[9] Lowe, D. G. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60: pp. 91-110.
[10] Loy, G., & Zelinsky, A. (2003). Fast radial symmetry for detecting points of interest. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25: pp. 959-973.
[11] Rabatel, G., & Guizard, C. (2007). Grape berry calibration by computer vision using elliptical model fitting. European Conference on Precision Agriculture, 6: pp. 581-587.
[12] Nuske, S., Achar, S., Bates, T., Narasimhan, S., & Singh, S. (2011). Yield estimation in vineyards by visual grape detection. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2352-2358.
[13] Nuske, S., Gupta, K., Narasimhan, S., & Singh, S. (2012). Modeling and calibrating visual yield estimates in vineyards. In Proceedings of