Image Processing: Dealing with Texture
Maria Petrou
Imperial College, London, UK
Pedro García Sevilla
Jaume I University, Castellón, Spain
Copyright © 2006 John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester,
West Sussex PO19 8SQ, England
All Rights Reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in
any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except under
the terms of the Copyright, Designs and Patents Act 1988 or under the terms of a licence issued by the Copyright
Licensing Agency Ltd, 90 Tottenham Court Road, London W1T 4LP, UK, without the permission in writing of the
Publisher. Requests to the Publisher should be addressed to the Permissions Department, John Wiley & Sons Ltd,
The Atrium, Southern Gate, Chichester, West Sussex PO19 8SQ, England, or emailed to [email protected], or
faxed to (+44) 1243 770620.
Designations used by companies to distinguish their products are often claimed as trademarks. All brand names
and product names used in this book are trade names, service marks, trademarks or registered trademarks of their
respective owners. The Publisher is not associated with any product or vendor mentioned in this book.
This publication is designed to provide accurate and authoritative information in regard to the subject matter
covered. It is sold on the understanding that the Publisher is not engaged in rendering professional services. If
professional advice or other expert assistance is required, the services of a competent professional should be sought.
John Wiley & Sons Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Australia Ltd, 42 McDougall Street, Milton, Queensland 4064, Australia
John Wiley & Sons (Asia) Pte Ltd, 2 Clementi Loop #02-01, Jin Xing Distripark, Singapore 129809
John Wiley & Sons Canada Ltd, 22 Worcester Road, Etobicoke, Ontario, Canada M9W 1L1
Wiley also publishes its books in a variety of electronic formats. Some content that appears
in print may not be available in electronic books.
A catalogue record for this book is available from the British Library
ISBN-13: 978-0-470-02628-1
ISBN-10: 0-470-02628-6
Preface ix
1 Introduction 1
What is texture? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Why are we interested in texture? . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
How do we cope with texture when texture is a nuisance? . . . . . . . . . . . . . . 3
How does texture give us information about the material of the imaged object? . . 3
Are there non-optical images? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
What is the meaning of texture in non-optical images? . . . . . . . . . . . . . . . . 4
What is the albedo of a surface? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Can a surface with variable albedo appear non-textured? . . . . . . . . . . . . . . 4
Can a rough surface of uniform albedo appear non-textured? . . . . . . . . . . . . 4
What are the problems of texture which image processing is trying to solve? . . . . 4
What are the limitations of image processing in trying to solve the above problems? 5
How may the limitations of image processing be overcome for recognising textures? 6
What is this book about? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Box 1.1. An algorithm for the isolation of textured regions . . . . . . . . . . . . . 6
2 Binary textures 11
Why are we interested in binary textures? . . . . . . . . . . . . . . . . . . . . . . . 11
What is this chapter about? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Are there any generic tools appropriate for all types of texture? . . . . . . . . . . . 12
Can we at least distinguish classes of texture? . . . . . . . . . . . . . . . . . . . . . 12
Which are the texture classes? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Which tools are appropriate for each type of texture? . . . . . . . . . . . . . . . . 12
2.1 Shape grammars . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
What is a shape grammar? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Box 2.1. Shape grammars . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
What happens if the placement of the primitive pattern is not regular? . . . . . . . 21
What happens if the primitive pattern itself is not always the same? . . . . . . . . 22
What happens if the primitive patterns vary in a continuous way? . . . . . . . . . 22
2.2 Boolean models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
What is a 2D Boolean model? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Box 2.2. How can we draw random numbers according to a given probability density function? . . . 23
How can we estimate the Markov parameters with the least square error estimation method? . . . 189
Box 3.10. Least square parameter estimation for the MRF parameters . . . . . . . 190
Is a Markov random field always realisable given that we define it arbitrarily? . . . 196
What conditions make an MRF self-consistent? . . . . . . . . . . . . . . . . . . . . 196
What is a clique in a neighbourhood structure? . . . . . . . . . . . . . . . . . . . . 196
3.5 Gibbs distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
What is a Gibbs distribution? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
What is a clique potential? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
Can we have a Markov random field with only singleton cliques? . . . . . . . . . . 201
What is the relationship between the clique potentials and the Markov parameters? 211
Box 3.11. Prove the equivalence of Markov random fields and Gibbs distributions (Hammersley–Clifford theorem) . . . 215
How can we use the Gibbs distribution to create textures? . . . . . . . . . . . . . . 220
How can we create an image compatible with a Gibbs model if we are not interested in fixing the histogram of the image? . . . 226
What is the temperature of a Gibbs distribution? . . . . . . . . . . . . . . . . . . . 230
How does the temperature parameter of the Gibbs distribution determine how distinguishable one configuration is from another? . . . 230
What is the critical temperature of a Markov random field? . . . . . . . . . . . . . 238
3.6 The autocorrelation function as a texture descriptor . . . . . . . . . . 246
How can we compute the autocorrelation function of an MRF? . . . . . . . . . . . 246
Can we use the autocorrelation function itself to characterise a texture? . . . . . . 246
How can we use the autocorrelation function directly for texture characterisation? 250
How can we infer the periodicity of a texture from the autocorrelation function? . 252
How can we extract parametric features from the autocorrelation function? . . . . 253
Box 3.12. Least square fitting in 2D and 1D . . . . . . . . . . . . . . . . . . . . . . 257
3.7 Texture features from the Fourier transform . . . . . . . . . . . . . . . 260
Can we infer the periodicity of a texture directly from its power spectrum? . . . . 260
Does the phase of the Fourier transform convey any useful information? . . . . . . 265
Since the phase conveys more information for a pattern than its power spectrum, why don’t we use the phase to describe textures? . . . 270
Is it possible to compute from the image phase a function the value of which changes only due to genuine image changes? . . . 270
How do we perform phase unwrapping? . . . . . . . . . . . . . . . . . . . . . . . . 271
What are the drawbacks of the simple phase unwrapping algorithm? . . . . . . . . 273
3.8 Co-occurrence matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
Can we use non-parametric descriptions of texture? . . . . . . . . . . . . . . . . . . 275
How is a co-occurrence matrix defined? . . . . . . . . . . . . . . . . . . . . . . . . 277
How do we compute the co-occurrence matrix in practice? . . . . . . . . . . . . . . 281
How can we recognise textures with the help of the co-occurrence matrix? . . . . . 281
How can we choose the parameters of the co-occurrence matrix? . . . . . . . . . . 283
What are the higher-order co-occurrence matrices? . . . . . . . . . . . . . . . . . . 294
What is the “take home” message of this chapter? . . . . . . . . . . . . . . . . . . 294
Why is the continuous wavelet transform invertible and the discrete wavelet transform non-invertible? . . . 474
How can we span the part of the “what happens when” space which contains the direct component of the signal? . . . 475
Can we span the whole “what is where” space by using only the scaling function? . 477
What is a Laplacian pyramid? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 477
Why is the creation of a Laplacian pyramid associated with the application of a Gaussian function at different scales, and the subtraction of the results? . . . 477
Why may the second derivative of a Gaussian function be used as a filter to estimate the second derivative of a signal? . . . 477
How can we extract the coarse resolution content of a signal from its content at a finer resolution? . . . 477
How can we choose the scaling function? . . . . . . . . . . . . . . . . . . . . . . . . 481
How do we perform the multiresolution analysis of a signal in practice? . . . . . . 485
Why in tree wavelet analysis do we always analyse the part of the signal which contains the low frequencies only? . . . 486
Box 4.7. How do we recover the original signal from its wavelet coefficients in practice? . . . 494
How many different wavelet filters exist? . . . . . . . . . . . . . . . . . . . . . . . . 500
How may we use wavelets to process images? . . . . . . . . . . . . . . . . . . . . . 500
How may we use wavelets to construct texture features? . . . . . . . . . . . . . . . 507
What is the maximum overlap algorithm? . . . . . . . . . . . . . . . . . . . . . . . 507
What is the relationship between Gabor functions and wavelets? . . . . . . . . . . 518
4.5 Where Image Processing and Pattern Recognition meet . . . . . . . . 521
Why in wavelet analysis do we always split the band with the maximum energy? . 521
What is feature selection? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 521
How can we visualise the histogram of more than one feature in order to decide whether they constitute a good feature set? . . . 523
What is the feature space? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 523
What is the histogram of distances in a feature space? . . . . . . . . . . . . . . . . 523
Is it possible that the histogram of distances does not pick up the presence of clusters, even though clusters are present? . . . 525
How do we segment the image once we have produced a set of features for each pixel? . . . 527
What is the K-means algorithm? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 527
What is deterministic annealing? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 528
Box 4.8. Maximum entropy clustering . . . . . . . . . . . . . . . . . . . . . . . . . 529
How may we assess the quality of a segmentation? . . . . . . . . . . . . . . . . . . 535
How is the Bhattacharyya distance defined? . . . . . . . . . . . . . . . . . . . . . . 535
How can we compute the Bhattacharyya distance in practice? . . . . . . . . . . . . 535
How may we assess the quality of a segmentation using a manual segmentation as reference? . . . 536
What is a confusion matrix? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 537
What are the over- and under-detection errors? . . . . . . . . . . . . . . . . . . . . 538
4.6 Laws’ masks and the “what looks like where” space . . . 539
Is it possible to extract image features without referring to the frequency domain? 539
How are Laws’ masks defined? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 539
Is there a systematic way to construct features that span the “what looks like where” space completely? . . . 548
How can we expand a local image neighbourhood in terms of the Walsh elementary images? . . . 556
Can we use convolution to compute the coefficients of the expansion of a sub-image in terms of a set of elementary images? . . . 562
Is there any other way to express the local structure of the image? . . . . . . . . . 573
4.7 Local binary patterns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 574
What is the local binary pattern approach to texture representation? . . . . . . . . 574
How can we make this representation rotationally invariant? . . . . . . . . . . . . . 574
How can we make this representation appropriate for macro-textures? . . . . . . . 575
How can we use the local binary patterns to characterise textures? . . . . . . . . . 576
What is a metric? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 576
What is a pseudo-metric? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 576
Why should one wish to use a pseudo-metric and not a metric? . . . . . . . . . . . 577
How can we measure the difference between two histograms? . . . . . . . . . . . . 577
How can we use the local binary patterns to segment textures? . . . . . . . . . . . 579
How can we overcome the shortcomings of the LBP segmentation? . . . . . . . . . 580
4.8 The Wigner distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . 583
What is the Wigner distribution? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 583
How is the Wigner distribution used for the analysis of digital signals? . . . . . . . 591
What is the pseudo-Wigner distribution? . . . . . . . . . . . . . . . . . . . . . . . 591
What is the Kaiser window? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 592
What is the Nyquist frequency? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 594
Why does the use of the pseudo-Wigner distribution require signals which have been sampled at twice their Nyquist frequency? . . . 595
Should we worry about aliasing when we use the pseudo-Wigner distribution for texture analysis? . . . 595
How is the pseudo-Wigner distribution defined for the analysis of images? . . . . . 597
How can the pseudo-Wigner distribution be used for texture segmentation? . . . . 597
What is the “take-home” message of this chapter? . . . . . . . . . . . . . . . . . . 605
References 609
Index 613
Preface
It is often said that everybody knows what texture is but nobody can define it. Here is
an unconventional definition: texture is what makes life beautiful; texture is what makes
life interesting and texture is what makes life possible. Texture is what makes Mozart’s
music beautiful, the masterpieces of the art of the Renaissance classical and the facades of
Barcelona’s buildings attractive. Variety in detail is what keeps us going from one day to the
next and the roughness of the world is what allows us to walk, communicate and exist. If
surfaces were smooth, friction would not exist, the Earth would be bombarded by asteroids
and life would not have developed. If surfaces were smooth, pencils would not write, cars
would not run, and feet would not keep us upright.
Texture is all around us, and texture is also on the images we create. Just as variation
in what we do allows us to distinguish one day in our life from another, texture allows us
to identify what we see. And if texture allows us to distinguish the objects around us, it
cannot be ignored by any automatic system for vision. Thus, texture becomes a major part
of Image Processing, around which we can build the main core of Image Processing research
achievements.
This book is exactly trying to do this: it uses texture as the motivation to present some of
the most important topics of Image Processing that have preoccupied the Image Processing
research community in the recent years. The book covers the topics which have already been
well established in Image Processing research and it has an important ambition: it tries to
cover them in depth and be self-contained so that the reader does not need to open other
books to understand them.
The book is written on two levels. The top, easy level, is for the reader who is simply inter-
ested in learning the basics. This level is appropriate for an undergraduate or Master’s level
course. The second level goes in depth, demonstrating and proving theorems and concepts.
This level is appropriate for research students. Examples that refer to the advanced
level are marked with a B and the theory of this level is presented in boxes with
a grey background. In a sense, the book is an interlacing of mainstream material and
appendices that cover advanced or even peripheral issues.
The book aspires to be a classical textbook on Image Processing and not an account of
the state of the art. So, the reader who hopes to find here the latest algorithms proposed,
will be disappointed.
A large part of this book was written when the first co-author was on sabbatical at the
Informatics and Telematics Institute in Thessaloniki, Greece. The support of the Institute
as well as the support of our home institutions (the University of Surrey, where the first
co-author was based while writing this book, and the University Jaume I), throughout this
endeavour is gratefully acknowledged.
We would also like to thank the Media Lab of the Massachusetts Institute of Technology
for allowing us to use five images from the VisTex database, the Signal and Image Processing
Institute of the University of Southern California for allowing us to use three images from
their USC-SIPI Image database, and Dr Xavier Llado who supplied the images shown in
figures 1.5 and 1.6.
For the accompanying website please visit www.wiley.com/go/texture.
Chapter 1
Introduction
What is texture?
Texture is the variation of data at scales smaller than the scales of interest. For example, in
figure 1.1 we show the image of a person wearing a Hawaiian shirt. If we are interested in
identifying the person, the pattern on the shirt is considered as texture. If we are interested in
identifying a flower or a bird on the shirt, each flower or bird of the pattern is a non-textured
object at the scale of this image, as we can hardly see any details inside it.
Why are we interested in texture?
• Texture may be a nuisance: if, for example, we wish to recognise an object from its
shape, texture on its surface would produce many spurious edges inside the contour of
the object and the shape recognition algorithm would be confused. This is demonstrated
in figure 1.2.
• Texture may be an important cue in object recognition as it tells us something about
the material from which the object is made. For example, in the image of figure 1.3 we
may discriminate the city from the woods and the fields, from the type of variation the
image shows at scales smaller than the objects we are talking about.
Figure 1.2: (a) An original image. (b) Manually extracted edge map. (c) The automatic edge
extraction algorithm is confused by the presence of texture on the box.
Figure 1.3: (a) Bluebury from an aeroplane. (b) Edge map where the urban area has been
annotated. Other texture patches correspond to woods.
How do we cope with texture when texture is a nuisance?
There are algorithms which can isolate the textured regions irrespective of the type of texture.
The texture regions then may be identified and treated separately from the rest of the image.
Since texture manifests itself as image intensity variation, it gives rise to many edges in the
edge map of the image. A trivial algorithm that can isolate textured regions is to slide a
scanning window across the image and count the number of edgels inside it. Any window in
which the number of edgels exceeds a certain threshold is considered to belong to a textured
region. Figure 1.4 demonstrates the result of this algorithm applied to the image of figure
1.3. In Box 1.1 a more sophisticated algorithm for the same purpose is presented.
Figure 1.4: (a) Pixels that exceed a threshold in the number of edgels that fall inside a
scanning window centred on them. (b) Boundaries of the textured areas.
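The trivial edgel-counting scheme just described can be sketched as follows. The window size and edgel-count threshold here are illustrative choices, not values prescribed by the text; an integral image is used so that each window count costs constant time.

```python
import numpy as np

def texture_mask(edge_map, window=15, threshold=40):
    """Mark pixels whose surrounding window contains many edgels.

    edge_map: 2D binary array (1 = edgel); window: odd side length of the
    scanning window; threshold: minimum edgel count for the window to be
    considered textured. The default values are illustrative only.
    """
    h = window // 2
    padded = np.pad(edge_map, h)
    # Integral image: ii[r, c] = number of edgels in padded[:r, :c].
    ii = padded.cumsum(0).cumsum(1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    rows, cols = edge_map.shape
    counts = (ii[window:window + rows, window:window + cols]
              - ii[:rows, window:window + cols]
              - ii[window:window + rows, :cols]
              + ii[:rows, :cols])
    return counts > threshold
```

Thresholding the edge density this way reproduces the behaviour of figure 1.4a: pixels of uniform regions fall below the threshold while pixels inside textured regions exceed it.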
How does texture give us information about the material of the imaged object?
There are two ways by which texture may arise in optical images:
• Texture may be the result of variation of the albedo of the imaged surface. For example,
the image in figure 1.2a shows the surface of a box on which a coloured pattern is printed.
The change of colours creates variation in the brightness of the image at scales smaller
than the size of the box, which in this occasion is the object of interest.
• Texture may be the result of variation of the shape of the imaged surface. If a surface is
rough, even if it is uniformly coloured, texture in its image will arise from the interplay
of shadows and better illuminated parts. The textures in the image of figure 1.3 are
the result of the roughness of the surfaces of the town and the woods when seen from
above.
Are there non-optical images?
Yes, there are. They are the images created by devices that measure something other than the
intensity of reflected light. For example, the grey value in Magnetic Resonance Images
(MRI) used in medicine may indicate the proton density of the imaged tissue. Another
example concerns images called seismic sections which are created by plotting data collected
by seismographs.
What is the albedo of a surface?
The albedo of a surface is a function that characterises the material from which the surface is
made. It gives the fraction of incident light this material reflects at each wavelength. When
we say that a surface is multicoloured, effectively we say that it has an albedo that varies
from location to location.
Can a surface with variable albedo appear non-textured?
Yes, if it is imaged by a single-band sensor and the intensities of the light reflected by the
differently coloured patches, integrated over the sensor's sensitivity band, yield the same
recorded value.
Can a rough surface of uniform albedo appear non-textured?
Yes, if the roughness of the surface and the direction of the incident light are such that no
part of the surface is in shadow and the surface is matt (ie the material it is made from
reflects equally in all directions, irrespective of the direction from which it is illuminated).
What are the problems of texture which image processing is trying to solve?
Image processing is trying to solve three major problems concerned with texture:
• texture classification: given a texture sample, identify the class of texture it belongs to;
• texture segmentation: given an image containing several types of texture, delineate the
regions occupied by each;
• texture synthesis: given a sample of a texture, create more texture with the same
characteristics.
What are the limitations of image processing in trying to solve the above problems?
The most important limitation is that image processing, if applied in the strict sense of the
word, implies that it is dealing only with the images of the surfaces. However, the same surface
may appear totally differently textured when imaged under different conditions. Figure 1.5
shows the same surface imaged from different distances. Its appearance changes dramatically
and pure image processing without any extra information would never recognise these two
images as depicting the same object. Another example is shown in figure 1.6 where the same
surface is imaged from the same distance, but with the illuminating source being at two
different positions. Again, pure image processing would never recognise these two images as
depicting the same object. These two examples illustrate two important properties of texture:
• Texture is scale-dependent.
• Image texture depends on the imaging geometry.
Figure 1.5: A surface imaged from two different distances may create two very different image
textures.
Figure 1.6: A surface imaged under different illumination directions may give rise to two very
different image textures.
How may the limitations of image processing be overcome for recognising textures?
Image processing which makes use of additional information, eg information concerning the
imaging geometry, may be able to deal with both the scale-dependence of texture as well as its
dependence on the imaging geometry. An alternative way is to use computer vision techniques
which allow us to recover the missing information suppressed by the imaging process. For
example, a rough surface exists in 3D. However, when it is imaged on a 2D medium, the
information concerning the third dimension is lost. This information is independent of the
imaging process. If one could recover it from the available image, then one would have a
scale-invariant description of the roughness of the surface. The recovery of such information
is possible if one uses extra information, or more than one image.
What is this book about?
This book is about the methods used to deal with the first two problems related to texture,
namely texture classification and texture segmentation. The various methods will be pre-
sented from the point of view of the data rather than the approach used: we shall start from
the simplest image textures, ie 2D binary images, and proceed to composite grey texture
images.
Box 1.1. An algorithm for the isolation of textured regions
Texture in an image implies intensity variation at scales smaller than the image size. It
is well known that intensity variation gives rise to edgels in the edge map of the image.
High density of edgels, therefore, characterises textured regions.
Let us imagine that each edgel is a person standing in a flat landscape. The person sees
the world around her through the other individuals who might surround her and obscure
her view. Suppose that we define the field of view of each individual as the maximum
angle through which she can see the world through her neighbours who stand within
a certain distance from her. Individuals who are standing in the middle of a textured
region will have their views severely restricted and therefore their fields of view will not
have large values. On the other hand, individuals standing at the borders of textured
regions will be able to see well towards the outside of the textured region and they will
enjoy high values of field of view.
The trouble is that even edge pixels that simply separate uniformly coloured regions
will enjoy equally high values of the field of view. So, we need to devise an algorithm
that will distinguish these two types of edgels: those that delineate textured regions
and those that simply form edges between different uniformly shaded regions.
These are the steps of such an algorithm demonstrated with the help of the image of
figure 1.7a:
Figure 1.7: (a) Original image of size 256 × 256 showing a town. (b) The edge map of
the image obtained by using a Sobel edge detector. (c) The edgels that have field of
view larger than 90° within a local neighbourhood of size 9 × 9 (ie n = 9). (d) The edge
map of panel (b) without the edgels of panel (c) and any edgel in a neighbourhood of
size 7 × 7 around them (ie m = 7). (e) The edgels of panel (d) plus all edgels from panel
(b) that are in a neighbourhood of size 7 × 7 around them (ie δm = 0). (f) The final
result keeping only the edgels of panel (e) that have field of view larger than 90° within
a neighbourhood of 9 × 9.
• Identify the edgels in the image using an edge filter which does very little or no
smoothing (eg Sobel filters). The result is shown in figure 1.7b.
• In the edge map keep only those edgels that have a field of view larger than d
degrees. This result is shown in figure 1.7c.
• Around each edgel you have kept in figure 1.7c, consider a window of size m × m.
Delete all edgels of figure 1.7b that are inside this window. The result you will
get is shown in figure 1.7d. Note that in this way you have eliminated all those
edgels with high viewing angles which do not delineate texture regions. However,
at the same time the texture regions have shrunk.
Figure 1.8: Each tile in this figure represents a pixel. The black tiles represent edgels.
The windowed region with the thick black line represents the neighbourhood of size
n × n we consider around the edgel in the middle. In this case n = 9. The neighbours
inside this window are sorted in an anti-clockwise direction. The numbers next to them
indicate their rank. We measure the angle between any two successive rays from the
central edgel. The largest of these angles, marked with φ, is the field of view of the
central edgel.
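The field-of-view measure of figure 1.8 can be computed per edgel as sketched below. This is a straightforward, unoptimised rendering of the description above: the rays to the neighbouring edgels inside the n × n window are sorted by angle and the widest angular gap between successive rays (including the gap that wraps around ±180°) is returned.

```python
import numpy as np

def field_of_view(edge_map, r, c, n=9):
    """Field of view (in degrees) of the edgel at (r, c): the largest
    angular gap between successive rays joining it to the other edgels
    inside an n x n window centred on it."""
    h = n // 2
    rows, cols = edge_map.shape
    angles = []
    for dr in range(-h, h + 1):
        for dc in range(-h, h + 1):
            if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols \
                    and edge_map[r + dr, c + dc]:
                angles.append(np.arctan2(dr, dc))
    if not angles:
        return 360.0  # no neighbours obstruct the view
    angles = np.sort(np.array(angles))
    gaps = np.diff(angles)
    # Gap across the +/- 180 degree branch cut of arctan2.
    wrap = 2 * np.pi - (angles[-1] - angles[0])
    widest = max(gaps.max(), wrap) if len(gaps) else wrap
    return float(np.degrees(widest))
```

Sorting by angle is equivalent to the anti-clockwise ordering of the neighbours described in the caption of figure 1.8.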
Figure 1.9: (a) An original image of size 382 × 287, showing the cross-section of the
aorta of a rabbit which ate too much cholesterol. The edgels in the texture region are
sparse. (b) Edge map produced using a Sobel edge detector. (c) The edge map made
denser by replacing every 4 pixels by 1 and marking the new pixel as an edgel even if
only a single pixel in the original 2 × 2 configuration was an edgel. (d) The edgels which
have a field of view larger than 90° within a local neighbourhood of size 5 × 5 in panel
(c). (e) The edgels of panel (c) after the edgels in (d) and their neighbouring edgels
inside a window of size 4 × 4 are removed. (f) The edgels of panel (e) plus all their
neighbouring edgels inside a window of size 6 × 6 in panel (c) restored. (g) The edgels
in (f) which have a field of view larger than 90°. (h) The final result in the original size.
• Around each edgel you have kept in figure 1.7d, consider a window of size (m +
δm) × (m + δm). Restore inside the window all edgels that are present in figure
1.7b. The result is shown in figure 1.7e. Note that now you have only the edgels
that constitute the textured region. The constant δm ensures that the window used
at this step is slightly larger than the window used at the previous step, because
we are dealing with discrete points which lie at a certain average distance from
each other.
• For each edgel in figure 1.7e calculate the field of view as you did before. Keep
only the edgels that have a field of view larger than a threshold d. This is the
final result and it is shown in figure 1.7f.
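The field-of-view computation described in figure 1.8 can be sketched as follows (a minimal sketch, assuming the edge map is a 2D list of booleans; the function name and data layout are our own choices, not the book's):

```python
import math

def field_of_view(edge_map, r, c, n):
    """Largest angular gap between rays from the edgel at (r, c) to the
    other edgels inside an n x n window centred on it (figure 1.8)."""
    half = n // 2
    angles = []
    for i in range(max(0, r - half), min(len(edge_map), r + half + 1)):
        for j in range(max(0, c - half), min(len(edge_map[0]), c + half + 1)):
            if (i, j) != (r, c) and edge_map[i][j]:
                # angle of the ray from the central edgel to this neighbour
                angles.append(math.atan2(i - r, j - c) % (2 * math.pi))
    if len(angles) < 2:
        return 2 * math.pi            # isolated edgel: unrestricted view
    angles.sort()                     # equivalent to the anti-clockwise ordering
    gaps = [b - a for a, b in zip(angles, angles[1:])]
    gaps.append(2 * math.pi - angles[-1] + angles[0])   # wrap-around gap
    return max(gaps)
```

Sorting the ray angles and taking the largest gap (including the wrap-around gap) is equivalent to the anti-clockwise ranking of figure 1.8.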
This algorithm works well for texture regions with dense edgels. If the edgels are not
very dense, but you still want to identify the textured regions, you may preprocess the
edge map as follows.
Before you start working with the edge map, make it denser by replacing every l × l tile
of pixels by a single pixel. Even if only one of the pixels in the tile is an edgel, make
the replacement pixel an edgel too. This way the edge map will shrink but the number
of edgels will be preserved and they will be made denser. You can apply the algorithm
described above to the shrunk edge map. The result may be brought to the full size by
replacing every pixel by a tile of size l × l. All pixels in the tile will be marked either as
edgels or background according to how the original pixel was marked. This will yield
a thick boundary for the textured region. If this is a problem the boundary may be
thinned while preserving its continuity. This process is demonstrated in figure 1.9.
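The shrink-and-expand preprocessing can be sketched as follows (a minimal sketch; the function names and the boolean 2D-list representation are our own):

```python
def shrink_edge_map(edge_map, l):
    """Replace every l x l tile by one pixel that is an edgel if ANY
    pixel of the tile was an edgel (OR-pooling over the tile)."""
    rows, cols = len(edge_map), len(edge_map[0])
    return [[any(edge_map[i * l + di][j * l + dj]
                 for di in range(l) for dj in range(l))
             for j in range(cols // l)]
            for i in range(rows // l)]

def expand_edge_map(small, l):
    """Bring the processed map back to full size: every pixel becomes
    an l x l tile with the same label."""
    return [[small[i // l][j // l]
             for j in range(len(small[0]) * l)]
            for i in range(len(small) * l)]
```

The expansion step is what produces the thick boundary mentioned above, since every surviving pixel becomes a full l × l tile.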
Chapter 2
Binary textures
A grey or colour image contains a lot of information, not all of which is necessary in order
to convey its contents. For example, in the image of figure 2.1 we can easily recognise the
trees and the birds although only two brightness levels are used. Texture indicates variation
of brightness. The minimum number of brightness levels we can have is two, so the simplest
way of representing an image is to binarise it.
Figure 2.1: We can easily recognise the depicted objects, even from a binary image.
This chapter is about developing some basic tools that will allow us to quantify binary tex-
tures.
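Binarisation itself is a simple thresholding operation; a minimal sketch (the threshold is left as a free parameter here — choosing it well is a separate problem):

```python
def binarise(image, threshold):
    """Map a grey image (2D list of intensities) to two brightness levels."""
    return [[1 if pixel >= threshold else 0 for pixel in row]
            for row in image]
```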
Image Processing: Dealing with Texture M. Petrou and P. Garcı́a-Sevilla
c 2006 John Wiley & Sons, Ltd. ISBN: 0-470-02628-6
Are there any generic tools appropriate for all types of texture?
No, there are not. This is because there is a vast variety of textures one may encounter. In
fact every surface we see may appear textured at some resolution. So, the variety of textures
is as great as the variety of the world around us.
Yes, there are some broad texture classes. Look at figure 2.2 and try to spot the odd one out.
You will find the answer in the next paragraph.
Figure 2.2: Four binary textures: which one is different from the other three?
There are two broad categories of texture: regular and irregular ones. Most textures of man-
made objects are regular, like those in figures 2.2a and 2.2b, while most natural textures
are irregular. Texture 2.2d is semi-regular: it consists of regularly shaped objects placed at
random positions. The odd one out in figure 2.2 is texture 2.2c, because it shows a random
pattern, while all other images exhibit some regularity.
We describe regular textures with shape grammars, and irregular textures with statistical
descriptors based on Boolean models and mathematical morphology.
Shape grammars 13
A shape grammar is a formal way of describing a regular pattern. For example, describing
with words the pattern of figure 2.3a would be quite tedious. However, to describe with
words the pattern of figure 2.3b, we could simply say: it is a rectangular arrangement
of the pattern shown in figure 2.3c, with the patterns tightly packed from all directions.
So, in order to describe a pattern we basically need two things: some elementary patterns
and a set of placement rules. Formally, we need something more: a symbol to mark the
position where we “stick” the elementary patterns together, and a starting pattern.
Figure 2.3: (a) A random texture. (b) A texture with a regular primitive pattern and regular
placement rules. (c) The primitive pattern of texture (b) scaled up.
Formally, a shape grammar is defined by the following four components.
1. A set V of shapes, called terminals. These are in effect the primitive patterns
from which we can build up the texture.
2. A set U of shapes called markers. These are small markers which indicate the
places where the “new bricks” should be placed when building the texture pattern.
3. A mapping between two sets W1 and W2, whose elements are elements of sets V
and U and combinations of them. The mapping is many-to-one, allowing us to
replace an element of set W1 by an element of set W2. These are the placement
rules.
4. A starting pattern S. This must be an element of W1 so that we can start building
the texture from it.
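These four components can be mirrored in code. As a toy illustration (our own construction, not one of the book's grammars), let the pattern be a set of occupied grid cells, the marker a single cell, and each rule glue one terminal tile at the marker and then move or delete the marker:

```python
# Toy shape grammar: terminals are unit tiles, the marker is the cell where
# the next terminal will be glued, and each rule rewrites the marker.

def apply_rule(tiles, marker, rule):
    tiles = tiles | {marker}          # glue a terminal at the marker position
    if rule == "right":
        return tiles, (marker[0], marker[1] + 1)
    if rule == "down":
        return tiles, (marker[0] + 1, marker[1])
    if rule == "stop":
        return tiles, None            # no marker left: the derivation ends
    raise ValueError(f"unknown rule: {rule}")

def derive(start, rules):
    """Starting pattern S is a bare marker; apply the rules in sequence."""
    tiles, marker = set(), start
    for rule in rules:
        tiles, marker = apply_rule(tiles, marker, rule)
    return tiles
```

The sequence of rules applied is exactly what characterises a particular texture, as the examples below illustrate.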
Example 2.1
Consider the texture of figure 2.4. Write a shape grammar to describe it.
The first problem we have to solve is the choice of the primitive pattern. We may
choose any one of those shown in figure 2.5.
Figure 2.5: There may be more than one primitive pattern that may be used to
characterise the same texture.
Let us choose the first one of these patterns. Let us also choose a small circle to be
used as a marker. Figure 2.6 is the set of rules we need in order to produce the texture
of figure 2.4. Finally, we need to choose a starting element. Let us choose as the
starting element the left-hand-side element of rule 1. Figure 2.7 is an example of how
texture 2.4 may be produced by the successive application of the proper rules. We can
see that the texture we had may be fully described and reproduced on the basis of the
five rules of figure 2.6.
Figure 2.8 shows the set of rules we would need to reproduce the pattern if we had
chosen the third primitive pattern of figure 2.5. Figure 2.9 shows how we may apply
these rules to reproduce a section of the texture of figure 2.4. Notice that now we need
the application of more rules to construct a section of the same size as before. This is
because our primitive element is now simpler. This becomes more obvious if we choose
as our primitive pattern the last pattern of figure 2.5. The rules needed in this case
are shown in figure 2.10, and their application to produce part of the texture of figure
2.4 is shown in figure 2.11.
Rule 1
Rule 2
Rule 3
Rule 4
Rule 5
Figure 2.6: The rules of a grammar that may be used to produce the texture of figure
2.4, using as primitive element the first pattern in figure 2.5.
Rule 1 Rule 1
Rule 2 Rule 3
Rule 3 Rule 2
Rule 1 Rule 1
Rule 5
Figure 2.7: Successive application of the rules of figure 2.6 allows us to reproduce the
texture of figure 2.4. The sequence of rules used characterises the texture.
Rule 1
Rule 2
Rule 3
Rule 4
Rule 5
Rule 6
Rule 7
Rule 8
Rule 9
Rule 10
Figure 2.8: The rules of a grammar that may be used to characterise the texture of
figure 2.4, using as primitive element the third pattern in figure 2.5. The simpler the
primitive elements we use, the more rules we need.
Rule 1 Rule 1
Rule 2 Rule 5
Rule 5 Rule 6
Rule 1 Rule 1
Rule 2 Rule 5
Rule 5 Rule 10
Figure 2.9: Successive application of the rules of figure 2.8 allows us to reproduce a
section of the texture of figure 2.4.
Rule 1
Rule 2
Rule 3
Rule 4
Rule 5
Rule 6
Rule 7
Rule 8
Rule 9
Rule 10
Rule 11
Rule 12
Figure 2.10: The rules of a grammar that may be used to characterise the texture of
figure 2.4, using as primitive element the fifth pattern in figure 2.5.
Rule 3 Rule 10
Rule 5
Rule 9 Rule 5
Rule 11
Figure 2.11: Successive application of the rules of figure 2.10 allows us to reproduce a
section of the texture of figure 2.4.
Example 2.2
Use the rules of figure 2.10 with starting element the left-hand side of rule
1, to produce a texture pattern different from that of figure 2.11.
Figure 2.12: An alternative pattern which may be produced by the successive appli-
cation of the rules of figure 2.10.
The texture shown in figure 2.2d is an example of a semi-stochastic pattern: the primitives
are regular, but the rules according to which they are placed appear to be random. Such
patterns may be described by stochastic grammars. In a stochastic grammar the rules are
applied in a random order, or they depend on a random variable. They may even be given
different weights, ie they may be drawn from a random distribution that favours some of the
rules more than others. Examples of semi-stochastic textures are those of figure 2.13.
Example 2.3
Use the rules of figure 2.10 with starting element the left-hand side of rule
1, to produce a semi-stochastic pattern.
Figure 2.14 shows a pattern which was produced by the rules of figure 2.10 applied
in a totally random order, but avoiding overlaps. In other words, every time a rule
indicated continuation towards a direction where there was already a tile, this rule was
abandoned and another one was chosen at random.
Figure 2.14: A pattern which was produced by the application of the rules of figure
2.10 in a random order. Starting with the left-hand side of the first rule of figure 2.10,
we proceed by applying rules 2, 10, 3, 7,..., to produce this pattern, growing it from
its top left corner.
What happens if the primitive pattern itself is not always the same?
Then we must use rules which choose primitives from a set of possible primitive patterns
every time they are applied.
If, moreover, the shape of the primitive pattern itself varies randomly, we may use a
probability density function to describe the distribution of the possible values of the
parameters which describe that shape. This leads to the 2D Boolean models used for
texture description.
Boolean models 23
The Boolean model consists of two independent probabilistic processes: a point process cre-
ating the germs and a shape process creating the grains. The outcome of the point process
is a set of locations in the 2D space. The outcome of the shape process is a set of shapes that
are placed at the random positions chosen by the point process.
Box 2.2. How can we draw random numbers according to a given probability
density function?
Most computers have programs that can produce uniformly distributed random num-
bers. Let us say that we wish to draw random numbers x according to a given probability
density function px (x). Let us also say that we know how to draw random numbers
y with a uniform probability density function defined in the range [A, B]. We may
formulate the problem as follows.
Define a transformation y = g(x) which is one-to-one and which is such that if y is
drawn from a uniform probability density function in the range [A, B], samples x are
distributed according to the given probability density function px (x).
Since we assume that relationship y = g(x) is one-to-one, we may schematically depict
it as shown in figure 2.15.
It is obvious from figure 2.15 that the distributions Py(y1) and Px(x1) of the two variables
are identical, since whenever y is less than y1 = g(x1), x is less than x1. For y uniformly
distributed in the range [0, 1], Py(y1) = y1, and therefore:

y1 = Px(x1)    (2.7)
So, to produce random numbers x, distributed according to a given probability density
function px (x), we follow these steps: we compute the distribution of x, Px (x1 ) using
equation (2.2); we tabulate pairs of numbers (x1 , Px (x1 )); we draw uniformly distributed
numbers y1 in the range [0, 1]; we use our tabulated numbers as a look-up table where
for each y1 = Px (x1 ) we look up the corresponding x1 . These x1 numbers are our
random samples distributed according to the way we wanted.
Example B2.4
In this case:

px(x) = 1/(√(2π) σ) e^(−(x−µ)²/(2σ²))    (2.8)

Note that ∫_{−∞}^{+∞} px(x) dx = 1 and that ∫_{−∞}^{µ} px(x) dx = ∫_{µ}^{+∞} px(x) dx = 1/2.
The distribution of such a probability density function is computed as follows:

Px(x1) = 1/(√(2π) σ) ∫_{−∞}^{x1} e^(−(x−µ)²/(2σ²)) dx    (2.9)
We distinguish two cases: x1 ≤ µ and x1 > µ. We shall deal with the second case first:

Px(x1) = 1/(√(2π) σ) ∫_{−∞}^{µ} e^(−(x−µ)²/(2σ²)) dx + 1/(√(2π) σ) ∫_{µ}^{x1} e^(−(x−µ)²/(2σ²)) dx
       = 1/2 + 1/(√(2π) σ) ∫_{µ}^{x1} e^(−(x−µ)²/(2σ²)) dx    (2.10)

If we set

z ≡ (x−µ)/(√2 σ)  ⇒  dx = √2 σ dz    (2.11)

we obtain

Px(x1) = 1/2 + (1/√π) ∫_{0}^{z1} e^(−z²) dz = 1/2 + (1/2) erf(z1)    (2.12)

where z1 ≡ (x1−µ)/(√2 σ) and the error function is defined as:

erf(z) ≡ (2/√π) ∫_{0}^{z} e^(−t²) dt    (2.13)
When x1 ≤ µ, we write:

Px(x1) = 1/(√(2π) σ) ∫_{−∞}^{µ} e^(−(x−µ)²/(2σ²)) dx − 1/(√(2π) σ) ∫_{x1}^{µ} e^(−(x−µ)²/(2σ²)) dx
       = 1/2 − 1/(√(2π) σ) ∫_{x1}^{µ} e^(−(x−µ)²/(2σ²)) dx    (2.14)

If we first change variable to

y ≡ −x    (2.15)

so that (x−µ)² = (y+µ)², and then set

z ≡ (y+µ)/(√2 σ)  ⇒  dy = √2 σ dz    (2.16)

we obtain

Px(x1) = 1/2 − (1/(√(2π) σ)) √2 σ ∫_{0}^{z1} e^(−z²) dz = 1/2 − (1/2) erf(z1)    (2.17)

where here z1 ≡ (−x1+µ)/(√2 σ). In summary, we deduced that:

Px(x1) = 1/2 − (1/2) erf((−x1+µ)/(√2 σ))    for x1 ≤ µ
Px(x1) = 1/2 + (1/2) erf((x1−µ)/(√2 σ))    for x1 > µ    (2.18)
The error function may be easily computed by using various rational approximations,
one of which is

erf(x) ≈ 1 − (a1 t + a2 t² + a3 t³) e^(−x²),  with t ≡ 1/(1 + px)    (2.19)

for suitable tabulated constants a1, a2, a3 and p.
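As a sanity check, one commonly tabulated set of constants for an approximation of this family (an assumption on our part — the excerpt does not list the constants it uses) is p = 0.47047, a1 = 0.3480242, a2 = −0.0958798, a3 = 0.7478556, giving roughly four decimal places of accuracy for x ≥ 0:

```python
import math

def erf_approx(x):
    """Rational approximation of erf for x >= 0.
    Constants are one standard tabulated set (assumed here);
    the maximum error of this set is about 2.5e-5."""
    p, a1, a2, a3 = 0.47047, 0.3480242, -0.0958798, 0.7478556
    t = 1.0 / (1.0 + p * x)
    return 1.0 - (a1 * t + a2 * t ** 2 + a3 * t ** 3) * math.exp(-x * x)
```

For x < 0 one uses the antisymmetry erf(−x) = −erf(x).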
Figure 2.16: A Gaussian probability density function with µ = 2 and σ = 3 and the
histogram of 50 random numbers drawn according to it.
Table 2.1: Pairs (x1 , Px (x1 )) when Px (x1 ) is the distribution function of a Gaussian
probability density function with mean µ = 2 and standard deviation σ = 3.
We then draw 50 numbers uniformly distributed in the range [0, 1] and look them up
under column Px (x1 ) to read the numbers we are really interested in, namely the
corresponding x1 numbers. Wherever there is no exact entry in our look-up table, we
perform linear interpolation. Some of the drawn numbers and the corresponding x1
numbers worked out from the look-up table are given in table 2.2. We usually construct
the look-up table so that it is very unlikely that we shall draw a number smaller than
the smallest one in the table or larger than the largest one in the table. The example we
give here uses a rather sparse look-up table. Figure 2.16 shows the theoretical Gaussian
probability density function superimposed on the histogram of the 50 numbers drawn.
Table 2.2: The numbers on the left were drawn from a uniform distribution with range
[0, 1]. Each one of them was treated as a value of Px(x1) and table 2.1 was used as a
look-up table to find the corresponding value of x1. The x1 values deduced in this way
are given in the corresponding position on the right of this table. They are samples from a
Gaussian probability density function with mean µ = 2 and standard deviation σ = 3.
Imagine that we draw N random numbers uniformly distributed in the range [0, T].
Every time we do this, we obtain a different combination of N such numbers. So, if we
concentrate our attention on a sub-range of this range, say [t1, t2], every time we repeat
the experiment we obtain a different count of numbers falling in this range. Let us say that
in the ith realisation we have ki numbers in the range [t1, t2]. The probability of ki
taking a particular value k is given by the Poisson probability density function

p(ki = k) = e^(−λ(t2−t1)) (λ(t2−t1))^k / k!    (2.20)

where λ = N/T.
If the range is chosen to start from 0, ie if t1 = 0, and say t2 = t, then these numbers
ki constitute a Poisson process which is a function of t: k(t).
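Equation (2.20) can be evaluated directly; a small sketch (the function name is our own):

```python
import math

def poisson_prob(k, lam, t1, t2):
    """Probability that exactly k of the N uniformly placed numbers
    fall in [t1, t2], equation (2.20), with lam = N / T."""
    mu = lam * (t2 - t1)
    return math.exp(-mu) * mu ** k / math.factorial(k)
```

For example, with λ = 0.005 and a sub-range of length 2000, the expected count is λ(t2 − t1) = 10, which is also the mean of the resulting Poisson distribution.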
Example B2.5
Create a pattern of 100 × 100 pixels, using a Poisson process with parameter
λ = 0.005. Use a shape process which creates squares with side l. The values
of l are Gaussianly distributed with mean l0 = 5 and standard deviation
σl = 2. Repeat the experiment for λ = 0.01, 0.015, 0.025, 0.05, and 0.1.
Using the definition of λ immediately after equation (2.20), and for a total number
of pixels T = 100 × 100 = 10,000, we work out that we must draw N = 50 germ locations
when λ = 0.005. Therefore, we must identify 50 locations uniformly distributed in the
100 × 100 grid where we shall place black squares of randomly chosen sizes. We draw
100 random numbers uniformly distributed in the range [1, 100] and combine
them two by two to form the coordinates of the 50 pixels we need. We use the method
of Box 2.3 to draw random numbers Gaussianly distributed with mean 5 and standard
deviation 2, in order to decide the size of each square. Each drawn number is rounded
to the nearest integer. The result is shown in figure 2.17a. The results for the other
values of λ are shown in figures 2.17b--2.17f.
Figure 2.17: 2D Boolean patterns created for different values of parameter λ of the
Poisson process. The shape process creates squares with side l Gaussianly distributed
with mean l0 = 5 and standard deviation σl = 2. Parameter λ determines how many
such shapes are created and placed uniformly inside the grid.
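This construction can be sketched as follows (a minimal sketch with our own function names; we clip grains at the image border, which the text does not specify):

```python
import random

def boolean_squares(size, lam, l0, sigma_l, seed=0):
    """2D Boolean model: a Poisson germ process (N = lam * size^2 germs
    placed uniformly on the grid) and a shape process creating squares
    whose side is Gaussianly distributed with mean l0 and std sigma_l."""
    random.seed(seed)
    n_germs = round(lam * size * size)
    img = [[0] * size for _ in range(size)]
    for _ in range(n_germs):
        r, c = random.randrange(size), random.randrange(size)
        side = max(1, round(random.gauss(l0, sigma_l)))   # rounded to integer
        for i in range(r, min(size, r + side)):
            for j in range(c, min(size, c + side)):
                img[i][j] = 1
    return img
```

Increasing λ increases the number of grains and hence the coverage, reproducing the progression of figure 2.17.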
Example B2.6
Create a texture pattern of 100 × 100 pixels in size, using the following 2D
Boolean model. A point process defined as a Poisson process with parame-
ter λ = 0.005, and a shape process that creates circles with radius r given by
a Gaussian probability density function with mean r0 = 5 and standard de-
viation σr = 2. Approximate each circle by its nearest digital circle. Repeat
the experiment for λ = 0.01, 0.015, 0.025, 0.05, and 0.1.
We work again as in example B2.5. The results for the various values of λ are shown
in figure 2.18.
Figure 2.18: 2D Boolean patterns created for different values of parameter λ of the
underlying Poisson process. The shape process created circles with radius r. The
values of r were drawn according to a Gaussian probability density function with
mean r0 = 5 and standard deviation σr = 2, using the method described in Box 2.3.
For λ = 0.1 we had far too many circles and obtained complete coverage of the grid.
Parameter λ of the Poisson process determines how many such shapes are created and
placed uniformly inside the grid.
One can distinguish two types of parameters in a binary texture: aggregate and individual
parameters. The aggregate parameters are those that we can measure directly from the image.
An example of an aggregate parameter is the average number of black pixels inside a window
of fixed size. The individual parameters are those which govern the Boolean model which
we assume describes the texture. An example of an individual parameter is the value of λ
of the underlying Poisson germ process. The individual parameters are estimated from the
values of the aggregate parameters. Both types of parameter may be used to characterise the
texture; however, if one needs to reproduce the texture, one must estimate as completely as
possible the individual model parameters. In that case, one must also check the validity of
the adopted Boolean model, since it is possible that it may not be applicable.
Some useful aggregate parameters of a Boolean texture are the area fraction f , the specific
boundary length L, and the specific convexity N + . There are many more aggregate
parameters that may be defined, but we chose these particular ones because they are useful
in estimating some of the basic individual parameters of the underlying Boolean model, in
the case when the grains are assumed to be convex shapes and the germ process is assumed
to be Poisson.
The area fraction f is defined as the ratio of the black (covered) pixels over all pixels in
the image. For example, in the image of figure 2.19a this parameter is f = 0.38 because there
are 38 black pixels and the image consists of 100 pixels.
The specific boundary length L is defined as the average boundary length per unit area
in the image. It may be estimated by counting the number of border pixels in the image.
These are all pixels that are black but have at least one white neighbour. When using the
word “neighbour” one has to specify whether a neighbour is a pixel that shares a common
side with the pixel in question, or just a common vertex. If we accept as neighbours only
pixels which share a common side with the pixel being examined, then we are using
4-connectivity. If in addition we count as neighbours pixels which share a common vertex
with the pixel in question, then we are using 8-connectivity. Very different results may be
obtained according
to which connectivity we adopt. As the whole theory of 2D Boolean models is based on the
assumption of convex grains, it is advisable to use 8-connectivity to decide whether a pixel
is a boundary pixel or not. Another issue arises as to whether we must also count as border
pixels those that touch the edges of the image. Let us assume that we use 8-connectivity in
order to decide whether a pixel is a border pixel or not. The boundary pixels of image 2.19a,
which are interior to the image and are boundary pixels because they touch a white pixel,
are shown in figure 2.19b. We can count 35 such pixels, and therefore, from this count we
estimate that the specific boundary length L is 0.35. If in addition we consider as boundary
pixels those that touch the border of the image itself, then we must also include the two pixels
marked in figure 2.19c, and in this case L = 0.37. A compromise solution is to count only
as boundary pixels the pixels that touch either the left or the bottom border of the image.
It can be shown that the estimator of the specific boundary length in this case is unbiased.
Intuitively, it is as if the image wraps around in both directions, so each stretch of
boundary along the image border is counted only once. If we use this unbiased estimator
in our example, we must only consider
Figure 2.19: (a) A 10 × 10 binary image. (b) The boundary pixels of the image. These are the
interior boundary pixels identified as having at least one white neighbour using 8-connectivity
to define which are the neighbours of a pixel. (c) The border pixels that may be counted
as boundary pixels as well. (d) A convex grain and the point from which its tangent is
orthogonal to vector u. (e) Patterns used to identify boundary pixels from which a tangent
can be drawn to the black grain that is perpendicular to the upward vertical direction. The
tangent points are marked with a cross. The third pattern is intentionally left incomplete with
an apparently missing pixel at its top right corner, in order to indicate that it is indifferent
whether that pixel is black or white. (f) The tangent points of tangents perpendicular to the
vertical direction identified in the image.
the border black pixel on the left in figure 2.19c, and this will yield L = 0.36.
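The two estimates can be sketched as follows (a minimal sketch, assuming the image is a 2D list of 0/1 values; it counts only interior boundary pixels, as in figure 2.19b):

```python
def area_fraction(img):
    """Fraction of covered (1) pixels over all pixels in the image."""
    total = sum(len(row) for row in img)
    return sum(sum(row) for row in img) / total

def specific_boundary_length(img):
    """Interior boundary pixels per unit area, using 8-connectivity:
    a covered pixel with at least one uncovered 8-neighbour."""
    rows, cols = len(img), len(img[0])

    def is_boundary(i, j):
        if not img[i][j]:
            return False
        return any(img[i + di][j + dj] == 0
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di or dj)
                   and 0 <= i + di < rows and 0 <= j + dj < cols)

    return sum(is_boundary(i, j)
               for i in range(rows) for j in range(cols)) / (rows * cols)
```

For a 3 × 3 block of covered pixels inside a 5 × 5 image, f = 9/25 = 0.36 and every block pixel except the central one is a boundary pixel, giving L = 8/25 = 0.32.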
To appreciate the problem of choosing between 4-connectivity and 8-connectivity in de-
ciding the pixels that constitute the boundary pixels of the grains, we present in table 2.3 the
“perimeter” of a digital circle of a certain radius when 4- or 8-connectivity is used and the
corresponding value of the perimeter of a circle in the continuous domain. In the same table
we also include the areas of the digital and continuous circles. Figure 2.20 shows some digital
circles and their boundary pixels defined using 4- and 8-connectivity with the background
pixels. A method which may be used to estimate the specific boundary length L and which
bypasses the problem of choosing a connectivity is the following. We choose n digital lines
passing through a white point somewhere in the middle of the image. Along each line we
count how many times it crosses a boundary, ie how many transitions we have from white
to black and the opposite. We divide this number by the total number of pixels per digital
line. The quantity we compute this way is the specific boundary length per unit length,
P(βi), where βi is the angle line i forms with the chosen reference direction, say the
horizontal axis. Then the specific boundary length L per unit area is given by (see
example 2.7):
L = (π / (2n)) Σ_{i=1}^{n} P(βi)    (2.21)
Table 2.3: Perimeter and area of digital and continuous circles. The perimeter of the digital
circle was measured using 4-connectivity (second column) and 8-connectivity (third column)
in order to decide whether a pixel of the digital circle had a background pixel as a neighbour
or not. The digital circle in each case was defined as the set of pixel positions (i, j) which
satisfied the inequality i² + j² ≤ r², with the centre of the circle assumed to be at (0, 0).
The specific convexity N+ is defined as the average number of boundary points per unit
area from which one can draw a tangent to the black grain orthogonal to a fixed direction.
For example, if the fixed direction is indicated by vector u in figure 2.19d, the tangent to the
grain is drawn as shown in the same figure. These points are the points of lower positive
tangents. Such points are identified with the existence of local patterns like those shown in
figure 2.19e and they are marked with a + in figure 2.19f. It is very easy to identify them in
this example. We find that we have 6 such points, and so N+ = 6/100 = 0.06. However, in
practice they have to be identified using the hit-or-miss algorithm described later in example
2.20, using the patterns shown in figure 2.19e. In addition, we use more than one orientation
for vector u, typically four with u pointing up, down, left and right, and estimate N + as the
average of its values over these orientations.
Another random document with
no related content on Scribd:
OLD HOUSES WITH CARVED DOORPOSTS, NORWAY.
CHAPTER XXVI.
THE SIGURD SAGA.
Volsung did not long remain childless, for ten stalwart sons and
one lovely daughter, Signy, came to brighten his home. As soon as
this maiden reached marriageable years, many suitors asked for her
hand, which was finally pledged to Siggeir, King of the Goths, whom,
however, she had never seen.
The wedding day came, and when the bride first beheld her
The wedding
destined groom she shrank back in dismay, for his
of Signy. puny form and lowering glances contrasted oddly with
her brothers’ strong frames and frank faces. But it was
too late to withdraw,—the family honor was at stake,—and Signy so
successfully concealed her dislike that none except her twin brother
Sigmund suspected how reluctantly she became Siggeir’s wife.
The wedding feast was held as usual, and when the
The sword in merrymakings had reached their height the guests
the were startled by the sudden entrance of a tall, one-
Branstock.
eyed man, closely enveloped in a mantle of cloudy
blue. Without vouchsafing word or glance to any in the assembly,
the stranger strode up to the Branstock and thrust a glittering sword
up to the hilt in its great bole. Then, turning slowly around, he faced
the awe-struck assembly, and in the midst of the general silence
declared that the weapon would belong to the warrior who could pull
it out, and that it would assure him victory in every battle. These
words ended, he passed out and disappeared, leaving an intimate
conviction in the minds of all the guests that Odin, king of the gods,
had been in their midst.
“So sweet his speaking sounded, so wise his words did seem,
That moveless all men sat there, as in a happy dream
We stir not lest we waken; but there his speech had end,
And slowly down the hall-floor and outward did he wend;
And none would cast him a question or follow on his ways,
For they knew that the gift was Odin’s, a sword for the world to
praise.”
Volsung was the first to recover the power of speech, and,
waiving his own right to try to secure the divine weapon, he invited
Siggeir to make the first attempt to draw it out of the tree-trunk.
The bridegroom anxiously tugged and strained, but the sword
remained firmly embedded in the oak. He resumed his seat, with an
air of chagrin, and then Volsung also tried and failed. But the
weapon was evidently not intended for either of them, and the
young Volsung princes were next invited to try their strength.
“Sons I have gotten and cherished, now stand ye forth and try;
Lest Odin tell in God-home how from the way he strayed,
And how to the man he would not he gave away his blade.”
“At last by the side of the Branstock Sigmund the Volsung stood,
And with right hand wise in battle the precious sword-hilt caught,
Yet in a careless fashion, as he deemed it all for naught;
When, lo, from floor to rafter went up a shattering shout,
For aloft in the hand of Sigmund the naked blade showed out
As high o’er his head he shook it: for the sword had come away
From the grip of the heart of the Branstock, as though all loose it
lay.”
Marching towards the palace, the brave little troop soon fell into
Siggeir’s ambuscade, and, although they fought with heroic courage,
they were so overpowered by the superior number of their foes that
Volsung was soon slain and all his sons made captive. Led bound
into the presence of Siggeir, who had taken no part in the fight (for
he was an arrant coward), Sigmund was forced to relinquish his
precious sword, and he and his brothers were all condemned to die.
Signy, hearing this cruel sentence, vainly interceded for them,
but all she could obtain by her prayers and entreaties was that her
kinsmen should be chained to a fallen oak in the forest, there to
perish of hunger and thirst if the wild beasts spared them. Then,
fearing lest his wife should visit and succor her brothers, Siggeir
confined her in the palace, where she was closely guarded night and
day.
Early every morning Siggeir himself sent a messenger into the
forest to see whether the Volsungs were still living, and every
morning the man returned saying a monster had come during the
night and had devoured one of the princes, leaving nothing but his
bones. When none but Sigmund remained alive, Signy finally
prevailed upon one of her servants to carry some honey into the
forest and smear it over her brother’s face and mouth.
That very night the wild beast, attracted by the smell of the
honey, licked Sigmund’s face, and even thrust its tongue into his
mouth. Clinching his teeth upon it, Sigmund, weak and wounded as
he was, struggled until his bonds broke and he could slay the nightly
visitor who had caused the death of all his brothers. Then he
vanished into the forest, where he remained concealed until the
daily messenger had come and gone, and until Signy, released from
captivity, came speeding to the forest to weep over her kinsmen’s
remains.
Seeing her evident grief, and knowing she had no part in
Siggeir’s cruelty, Sigmund stole out of his place of concealment,
comforted her as best he could, helped her to bury the whitening
bones, and registered a solemn oath in her presence to avenge his
family’s wrongs. This vow was fully approved by Signy, who,
however, bade her brother abide a favorable time, promising to send
him a helper. Then the brother and sister sadly parted, she to return
to her distasteful palace home, and he to seek the most remote part
of the forest, where he built a tiny hut and plied the trade of a
smith.
“And once in the dark she murmured: ‘Where then was the ancient
song
That the Gods were but twin-born once, and deemed it nothing
wrong
To mingle for the world’s sake, whence had the Æsir birth,
And the Vanir, and the Dwarf-kind, and all the folk of earth?’”
“For here the tale of the elders doth men a marvel to wit,
That such was the shaping of Sigmund among all earthly kings,
That unhurt he handled adders and other deadly things,
And might drink unscathed of venom: but Sinfiotli was so wrought
That no sting of creeping creatures would harm his body aught.”
“And then King Siggeir’s roof-tree upheaved for its utmost fall,
And its huge walls clashed together, and its mean and lowly things
The fire of death confounded with the tokens of the kings.”
The long-planned vengeance had finally been carried out, Volsung’s death had been avenged, and
Sigmund, feeling that nothing now detained him in Gothland, set sail
with Sinfiotli and returned to Hunaland, where he was warmly
welcomed and again sat under the shade of his ancestral tree, the
mighty Branstock. His authority fully established, Sigmund married
Borghild, a beautiful princess, who bore him two sons, Hamond and
Helgi, the latter of whom was visited by the Norns when he lay in his
cradle, and promised sumptuous entertainment in Valhalla when his
earthly career should be ended.
“And the woman was fair and lovely, and bore him sons of fame;
Men called them Hamond and Helgi, and when Helgi first saw light
There came the Norns to his cradle and gave him life full bright,
And called him Sunlit Hill, Sharp Sword, and Land of Rings,
And bade him be lovely and great, and a joy in the tale of kings.”
“He drank as he spake the words, and forthwith the venom ran
In a chill flood over his heart, and down fell the mighty man
With never an uttered death-word and never a death-changed look,
And the floor of the hall of the Volsungs beneath his falling shook.
Then up rose the elder of days with a great and bitter cry,
And lifted the head of the fallen; and none durst come anigh
To hearken the words of his sorrow, if any words he said
But such as the Father of all men might speak over Balder dead.
And again, as before the death-stroke, waxed the hall of the
Volsungs dim,
And once more he seemed in the forest, where he spake with
naught but him.”
“But, lo! through the hedge of the war-shafts, a mighty man there
came,
One-eyed and seeming ancient, but his visage shone like flame:
Gleaming gray was his kirtle, and his hood was cloudy blue;
And he bore a mighty twi-bill, as he waded the fight-sheaves
through,
And stood face to face with Sigmund, and upheaved the bill to
smite.
Once more round the head of the Volsung fierce glittered the
Branstock’s light,
The sword that came from Odin: and Sigmund’s cry once more
Rang out to the very heavens above the din of war.
Then clashed the meeting edges with Sigmund’s latest stroke,
And in shivering shards fell earthward that fear of worldly folk.
But changed were the eyes of Sigmund, the war-wrath left his face;
For that gray-clad, mighty Helper was gone, and in his place
Drave on the unbroken spear-wood ’gainst the Volsung’s empty
hands:
And there they smote down Sigmund, the wonder of all lands,
On the foemen, on the death-heap his deeds had piled that day.”
All the Volsung race and army had already succumbed, so Lygni
immediately left the battlefield to hasten on and take possession of
the kingdom and palace, where he fully expected to find the fair
Hiordis and force her to become his wife. As soon as he had gone,
however, the beautiful young queen crept out of her hiding place in
the thicket, ran to the dying Sigmund, caught him to her breast in a
last passionate embrace, and tearfully listened to his dying words.
He then bade her gather up the fragments of his sword, carefully
treasure them, and give them to the son whom he foretold would
soon be born, and who was destined to avenge his death and be far
greater than he.
“‘I have wrought for the Volsungs truly, and yet have I known full well
That a better one than I am shall bear the tale to tell:
And for him shall these shards be smithied; and he shall be my son,
To remember what I have forgotten and to do what I left undone.’”
These gods had not wandered very far before Loki perceived an
otter basking in the sun. Animated by his usual spirit of destruction,
he slew the unoffending beast—which, as it happened, was the
dwarf king’s second son, Otter—and flung its lifeless body over his
shoulders, thinking it would furnish a good dish when meal time
came.
Following his companions, Loki came at last to Hreidmar’s house,
entered with them, and flung his burden down upon the floor. The
moment the dwarf king’s glance fell upon it he flew into a towering
rage, and before the gods could help themselves they were bound
by his order, and heard him declare that they should never recover
their liberty unless they could satisfy his thirst for gold by giving him
enough of that precious substance to cover the otterskin inside and
out.
“And he spake: ‘Hast thou hearkened, Sigurd? Wilt thou help a man
that is old
To avenge him for his father? Wilt thou win that treasure of gold
And be more than the kings of the earth? Wilt thou rid the earth of a
wrong
And heal the woe and the sorrow my heart hath endured o’er long?’”
On his way to the Volsung land Sigurd saw a man walking on the
waters, and took him on board, little suspecting that this individual,
who said his name was Feng or Fiöllnir, was Odin or Hnikar, the wave
stiller. He therefore conversed freely with the stranger, who promised
him favorable winds, and learned from him how to distinguish
auspicious from unauspicious omens.
After slaying Lygni and cutting the bloody eagle on his foes, Sigurd left his reconquered kingdom and went with Regin to slay Fafnir. A long ride through the mountains, which rose higher and higher before him,
brought him at last to his goal, where a one-eyed stranger bade him
dig trenches in the middle of the track along which the dragon daily
rolled his slimy length to go down to the river and quench his thirst.
He then bade Sigurd cower in one of those holes, and there wait
until the monster passed over him, when he could drive his trusty
weapon straight into its heart.
SIGURD AND THE DRAGON.—K. Dielitz.
“Then all sank into silence, and the son of Sigmund stood
On the torn and furrowed desert by the pool of Fafnir’s blood,
And the serpent lay before him, dead, chilly, dull, and gray;
And over the Glittering Heath fair shone the sun and the day,
And a light wind followed the sun and breathed o’er the fateful
place,
As fresh as it furrows the sea plain, or bows the acres’ face.”
“Then Regin spake to Sigurd: ‘Of this slaying wilt thou be free?
Then gather thou fire together and roast the heart for me,
That I may eat it and live, and be thy master and more;
For therein was might and wisdom, and the grudged and hoarded
lore:—
Or else depart on thy ways afraid from the Glittering Heath.’”
Sigurd, knowing that a true warrior never refused satisfaction of
some kind to the kindred of the slain, immediately prepared to act as
cook, while Regin dozed until the meat was ready. Feeling of the
heart to ascertain whether it were tender, Sigurd burned his fingers
so severely that he instinctively thrust them into his mouth to allay
the smart. No sooner had Fafnir’s blood touched his lips than he
discovered, to his utter surprise, that he could understand the songs
of the birds, which were already gathering around the carrion.
Listening to them attentively, he found they were advising him to
slay Regin, appropriate the gold, eat the heart and drink the blood of
the dragon; and as this advice entirely coincided with his own
wishes, he lost no time in executing it. A small portion of Fafnir’s
heart was reserved for future consumption, ere he wandered off in
search of the mighty hoard. Then, after donning the Helmet of
Dread, the hauberk of gold, and the ring Andvaranaut, and loading
Greyfell with as much ruddy gold as he could carry, Sigurd sprang on
his horse, listening eagerly to the birds’ songs to know what he had
best undertake next.
Soon he heard them sing of a warrior maiden fast asleep on a mountain and all surrounded by a glittering barrier of flames, through which only the bravest of men could pass in order to arouse her.
No sooner had Sigurd thus fearlessly sprung into the very heart
of the flames than the fire flickered and died out, leaving nothing
but a broad circle of white ashes, through which he rode until he
came to a great castle, with shield-hung walls, which he penetrated unchallenged, for the gates were wide open and no
warders or men at arms were to be seen. Proceeding cautiously, for
he feared some snare, Sigurd at last came to the center of the
inclosure, where he saw a recumbent form all cased in armor. To
remove the helmet was but a moment’s work, but Sigurd started
back in surprise when he beheld, instead of a warrior, the sleeping
face of a most beautiful woman.
All his efforts to awaken her were quite vain, however, until he
had cut the armor off her body, and she lay before him in pure-white
linen garments, her long golden hair rippling and waving around her.
As the last fastening of her armor gave way, she opened wide her
beautiful eyes, gazed in rapture upon the rising sun, and after