Founding Editor
Brian J. Thompson
University of Rochester
Rochester, New York
29. Electron and Ion Microscopy and Microanalysis: Principles and Applications,
Second Edition, Revised and Expanded, Lawrence E. Murr
30. Handbook of Infrared Optical Materials, edited by Paul Klocek
31. Optical Scanning, edited by Gerald F. Marshall
32. Polymers for Lightwave and Integrated Optics: Technology and Applications,
edited by Lawrence A. Hornak
33. Electro-Optical Displays, edited by Mohammad A. Karim
34. Mathematical Morphology in Image Processing, edited by
Edward R. Dougherty
35. Opto-Mechanical Systems Design: Second Edition, Revised and Expanded,
Paul R. Yoder, Jr.
36. Polarized Light: Fundamentals and Applications, Edward Collett
37. Rare Earth Doped Fiber Lasers and Amplifiers, edited by Michel J. F. Digonnet
38. Speckle Metrology, edited by Rajpal S. Sirohi
39. Organic Photoreceptors for Imaging Systems, Paul M. Borsenberger
and David S. Weiss
40. Photonic Switching and Interconnects, edited by Abdellatif Marrakchi
41. Design and Fabrication of Acousto-Optic Devices, edited by Akis P. Goutzoulis
and Dennis R. Pape
42. Digital Image Processing Methods, edited by Edward R. Dougherty
43. Visual Science and Engineering: Models and Applications, edited by
D. H. Kelly
44. Handbook of Lens Design, Daniel Malacara and Zacarias Malacara
45. Photonic Devices and Systems, edited by Robert G. Hunsberger
46. Infrared Technology Fundamentals: Second Edition, Revised and Expanded,
edited by Monroe Schlessinger
47. Spatial Light Modulator Technology: Materials, Devices, and Applications,
edited by Uzi Efron
48. Lens Design: Second Edition, Revised and Expanded, Milton Laikin
49. Thin Films for Optical Systems, edited by Francoise R. Flory
50. Tunable Laser Applications, edited by F. J. Duarte
51. Acousto-Optic Signal Processing: Theory and Implementation,
Second Edition, edited by Norman J. Berg and John M. Pellegrino
52. Handbook of Nonlinear Optics, Richard L. Sutherland
53. Handbook of Optical Fibers and Cables: Second Edition, Hiroshi Murata
54. Optical Storage and Retrieval: Memory, Neural Networks, and Fractals,
edited by Francis T. S. Yu and Suganda Jutamulia
55. Devices for Optoelectronics, Wallace B. Leigh
56. Practical Design and Production of Optical Thin Films, Ronald R. Willey
57. Acousto-Optics: Second Edition, Adrian Korpel
58. Diffraction Gratings and Applications, Erwin G. Loewen and Evgeny Popov
59. Organic Photoreceptors for Xerography, Paul M. Borsenberger
and David S. Weiss
60. Characterization Techniques and Tabulations for Organic Nonlinear Optical
Materials, edited by Mark G. Kuzyk and Carl W. Dirk
61. Interferogram Analysis for Optical Testing, Daniel Malacara, Manuel Servin,
and Zacarias Malacara
62. Computational Modeling of Vision: The Role of Combination,
William R. Uttal, Ramakrishna Kakarala, Spiram Dayanand, Thomas
Shepherd, Jagadeesh Kalki, Charles F. Lunskis, Jr., and Ning Liu
63. Microoptics Technology: Fabrication and Applications of Lens Arrays
and Devices, Nicholas Borrelli
64. Visual Information Representation, Communication, and Image Processing,
edited by Chang Wen Chen and Ya-Qin Zhang
65. Optical Methods of Measurement, Rajpal S. Sirohi and F. S. Chau
66. Integrated Optical Circuits and Components: Design and Applications,
edited by Edmond J. Murphy
edited by Wolfgang Osten
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742
This book contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with
permission, and sources are indicated. A wide variety of references are listed. Reasonable efforts have been made to publish
reliable data and information, but the author and the publisher cannot assume responsibility for the validity of all materials
or for the consequences of their use.
No part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or
other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information
storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.copyright.com
(http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC) 222 Rosewood Drive, Danvers, MA
01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For
organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for
identification and explanation without intent to infringe.
TS156.2O652 2006
670.42’5--dc22 2005046670
for
Angelika, Luise, and Stefan
Preface
The miniaturization of complex devices such as sensors and actuators is one of the biggest challenges
in modern technology. Different manufacturing technologies — for instance, the so-called LIGA
technique and UV lithography — allow the realization of nonsilicon and silicon microparts with a
high aspect ratio and structural dimensions in the range from nanometers to millimeters. LIGA is
an acronym standing for the main steps of the process, i.e., deep x-ray lithography, electroforming,
and plastic molding. These three steps make it possible to mass-produce high-quality microcom-
ponents and microstructured parts, in particular from plastics, but also from ceramics and metals
at low cost. Techniques based on UV lithography or advanced silicon etching processes (ASE)
allow for direct integration of electronics with respect to the realization of advanced microelectro-
mechanical systems (MEMS) devices. Further technologies such as laser micromachining, electro-
chemical milling (ECF), electrodischarge machining, and nanoimprint lithography (NIL) meanwhile
offer an economical, high-resolution alternative to UV, VUV, and next-generation optical
lithography.
Increased production output, high system performance, and product reliability and lifetime are
important conditions for the trust in a new technology and deciding factors for its commercial
success. Consequently, high quality standards are a must for all manufacturers. However, with
increasing miniaturization, the importance of measurement and testing is rapidly growing, and
therefore the need in microsystems technology for suitable measurement and testing procedures is
evident. Both reliability and lifetime are strongly dependent on material properties and thermome-
chanical design. In comparison to conventional technologies, the situation in microsystems tech-
nology is extremely complicated. Modern microsystems (MEMS and MOEMS) and their compo-
nents are characterized by high-volume integration of a variety of materials and materials combinations.
This variety is needed to realize very different and variable functions such as sensor and actuator
performance, signal processing, etc. Still, it is well known that the materials’ behavior in combi-
nation with new structural design cannot be easily predicted by theoretical simulations. A possible
reason for wrong predictions made by FEM calculations with respect to the operational behavior
of microdevices is, for instance, the lack of reliable materials data and boundary conditions in the
microscale. Therefore, measurement and testing procedures are confronted with a complex set of
demands. In general, the potential for the following is challenged:
Measurement and inspection techniques are required that are very fast, robust, and relatively
low cost compared to the products being investigated. The reason for this demand is obvious:
properties determined on much larger specimens cannot be scaled down from bulk material without
any experimental verification. Furthermore, at the microscale, materials’ behavior is noticeably affected
by production technology. Therefore, simple and robust methods to analyze the shape and defor-
mation of the microcomponents are needed. Together with the knowledge of the applied load and
appropriate physical models, these data can be used for the derivation of material parameters and
various system properties. It is obvious that neither a single method nor a class of measurement
techniques can fulfill these requirements completely. Conventional tensile test techniques (e.g.,
strain gauges) are unable to test specimens from submillimeter-sized regions because of their limited
local resolution and partly unwanted tactile character. Other approaches, such as, for instance,
microhardness measurements, do not reveal directional variations.
However, full-field optical methods provide a promising alternative to the conventional methods.
The main advantages of these methods are their noncontact, nondestructive, and fieldwise working
principle; fast response potential; high sensitivity and accuracy (typical displacement resolution of
a few nanometers, strain values of 100 microstrain); high resolution of data points (e.g., 1000 ×
1000 points for submillimeter field of view); advanced performance of the system, i.e., automatic
analysis of the results; and data preprocessing in order to meet requirements of the underlying
numerical or analytical model. Thus, this book offers a timely review of the research into applying
optical measurement techniques for microsystems inspection. The authors give a general survey of
the most important and challenging optical methods such as light scattering, scanning probe micros-
copy, confocal microscopy, fringe projection, grid and Moiré techniques, interference microscopy,
laser Doppler vibrometry, holography, speckle metrology, and spectroscopy. Moreover, modern
approaches for data acquisition and processing (for instance, digital image processing and correlation)
are presented.
The editor hopes that this book will significantly push the application of optical principles for
the investigation of microsystems. Thanks are due to all authors for their contributions, which give
a comprehensive overview of the state of the art in the fascinating and challenging field of optical
microsystems metrology. Finally, the editor is grateful for the cooperation shown by CRC Press
represented by Taisuke Soda, Preethi Cholmondeley, Gerry Jaffe, and Jessica Vakili.
Wolfgang Osten
Stuttgart
Contributors
Anand Asundi
School of Mechanical and Aerospace Engineering
Nanyang Technological University
Singapore, Singapore

Petra Aswendt
Fraunhofer Institute IWU
Chemnitz, Germany

Cosme Furlong
Center for Holographic Studies and Laser MicroMechatronics
Mechanical Engineering Department
Worcester Polytechnic Institute
Worcester, Massachusetts (U.S.A.)

Christophe Gorecki
Département LOPMD, FEMTO-ST
Université de Franche-Comté
Besançon, France

Roland Höfling
Vialux GmbH
Chemnitz, Germany

Markus Hüttel
Information Processing
Fraunhofer Institute for Manufacturing Engineering and Automation (IPA)
Stuttgart, Germany

Christian Rembe
Polytec GmbH
Waldbronn, Germany

Aiko Ruprecht
Institut für Technische Optik
Universität Stuttgart
Stuttgart, Germany

Leszek Salbut
Institute of Micromechanics and Photonics
Warsaw University of Technology
Warsaw, Poland
Contents
Chapter 1
Image Processing and Computer Vision for MEMS Testing............................................................1
Markus Hüttel
Chapter 2
Image Correlation Techniques for Microsystems Inspection..........................................................55
Dietmar Vogel and Bernd Michel
Chapter 3
Light Scattering Techniques for the Inspection of Microcomponents
and Microstructures........................................................................................................................103
Angela Duparré
Chapter 4
Characterization and Measurement of Microcomponents with the Atomic
Force Microscope (AFM) ..............................................................................................................121
F. Michael Serry and Joanna Schmit
Chapter 5
Optical Profiling Techniques for MEMS Measurement................................................................145
Klaus Körner, Aiko Ruprecht, and Tobias Wiesendanger
Chapter 6
Grid and Moiré Methods for Micromeasurements .......................................................................163
Anand Asundi, Bing Zhao, and Huimin Xie
Chapter 7
Grating Interferometry for In-Plane Displacement and Strain Measurement
of Microcomponents ......................................................................................................................201
Leszek Salbut
Chapter 8
Interference Microscopy Techniques for Microsystem Characterization .....................................217
Alain Bosseboeuf and Sylvain Petitgrand
Chapter 9
Measuring MEMS in Motion by Laser Doppler Vibrometry .......................................................245
Christian Rembe, Georg Siegmund, Heinrich Steger, and Michael Wörtge
Chapter 10
An Interferometric Platform for Static, Quasi-Static, and Dynamic Evaluation
of Out-of-Plane Deformations of MEMS and MOEMS...............................................................293
Christophe Gorecki, Michal Jozwik, and Patrick Delobelle
Chapter 11
Optoelectronic Holography for Testing Electronic Packaging and MEMS .................................325
Cosme Furlong
Chapter 12
Digital Holography and Its Application in MEMS/MOEMS Inspection .....................................351
Wolfgang Osten and Pietro Ferraro
Chapter 13
Speckle Metrology for Microsystem Inspection ...........................................................................427
Roland Höfling and Petra Aswendt
Chapter 14
Spectroscopic Techniques for MEMS Inspection .........................................................................459
Ingrid De Wolf
Index ..............................................................................................................................................483
CONTENTS
1.1 Introduction...............................................................................................................................2
1.2 Classification of Tasks..............................................................................................................2
1.3 Image Processing and Computer Vision Components.............................................................4
1.3.1 Behavior of Light, Colors, and Filters.........................................................................5
1.3.2 Illumination...................................................................................................................8
1.3.3 Lens Systems ..............................................................................................................12
1.3.4 Sensors........................................................................................................................15
1.3.4.1 CCD Sensors ...............................................................................................16
1.3.4.2 CMOS Sensors ............................................................................................18
1.3.4.3 Color Sensors and Cameras........................................................................19
1.3.4.4 Camera Types and Interfaces......................................................................20
1.3.4.5 Frame Grabbers...........................................................................................21
1.4 Processing and Analysis of Image Data ................................................................................22
1.4.1 Computer Vision Process ...........................................................................................22
1.4.2 Image Data Preprocessing and Processing Methods .................................................24
1.4.2.1 Histograms...................................................................................................24
1.4.2.2 Point Transformations .................................................................................25
1.4.2.3 Spatial Filtering...........................................................................................27
1.4.3 Image Data Analysis Methods ...................................................................................31
1.4.3.1 Spectral Operations .....................................................................................32
1.4.4 Solving Measurement and Testing Tasks...................................................................36
1.4.4.1 Finding a Test Object or Region of Interest...............................................36
1.4.4.2 Position Recognition ...................................................................................40
1.4.4.3 Measuring Geometric Features ...................................................................42
1.4.4.4 Presence Verification ...................................................................................45
1.4.4.5 Defect and Fault Detection .........................................................................47
1.5 Commercial and Noncommercial Image Processing
and Computer Vision Software ..............................................................................................49
1.6 Image Processing Techniques for the Processing of Fringe Patterns
in Optical Metrology ..............................................................................................................50
1.7 Conclusion ..............................................................................................................................52
References ........................................................................................................................................52
1.1 INTRODUCTION
Not only is there a requirement for testing electrical and dynamic behavior of MEMS, but there is
also considerable demand for methods to test these systems during both the development phase
and the entire manufacturing phase. With the aid of these test methods, it is possible to assess such
static properties as the dimension, shape, presence, orientation, and surface characteristics of
microsystems and their components. Using an optical measurement and testing technique based
on image processing and computer vision, a wide range of procedures that enable such properties
to be recorded rapidly and in a robust and noncontact way can be applied.
If measurement and testing means are not based on special optical procedures but rather on
illumination with normal light and imaging with normal and microscopic optics, their resolution
capabilities extend to only just below the micrometer range. This is due to the diffraction of light
and the dimensions of imaging sensor elements in the lateral direction. Such a degree of resolution
is inadequate as it is unable to cover the entire range of microsystem structure sizes — from just
a few nanometers (e.g., surface roughness) to a few millimeters (e.g., external contours). In order
to measure sizes in the nanometer range, special imaging measurement and testing means are
required. These include interferometers, spectrometers, near-field/scanning electron/atomic-force
microscopes, and specialized reconstruction and analysis processes such as fringe processing or
scanning techniques, which are described in Chapters 4 and 8 through 14.
The main advantage of implementing optical testing equipment using simple light sources and
normal and microscopic optics is the speed with which images can be recorded and analyzed and
the fact that they can be easily integrated into the manufacturing process, thus making the error-
prone removal of components from and reintroduction into the clean environment for test purposes
superfluous. For this reason, despite its limited resolution capabilities, such equipment is ideally
suited for testing large numbers of pieces in the manufacturing process, i.e., in the areas of assembly,
function and integrity testing, and packaging.
Furthermore, the algorithms developed for image processing and computer vision are not only
suitable for analyzing images recorded using a video camera but can also be applied to the fields
of signal analysis, data analysis and reconstruction, etc.
This chapter deals with the technical aspects of illumination and image recording techniques as
well as image processing and computer vision processes relevant to optical measurement and testing
techniques and their implementation in typical measurement and testing tasks. Several software
products that are available commercially for image processing and computer vision will also be
described. However, a classification of typical measurement and testing tasks in the field of
microsystem development and production is given first.
Electronic Stability Program, Bosch, Germany (ESP)). However, MEMS and MOEMS are being
used increasingly in many other fields such as the medical industry or biological and chemical
diagnostics. “Lab on a chip” is capable of carrying out complete investigation processes for chemical,
biological, or medical analyses. Microspectrometers, just a few millimeters in size, enable the con-
struction of extremely small and low-cost analysis devices that can be used as in-flight equipment for
monitoring terrestrial, biological, and climatic processes and in a wide range of technical applications
in the form of handheld devices. Using micromotors, microdrives, micropumps, and microcameras,
instruments can be constructed for keyhole diagnoses and surgical interventions.
Although, in comparison with microelectronic components, microsystems possess a clear three-
dimensional structure (e.g., microspectrometers, electrical micromotors or gears for drives made
using microinjection molding techniques), classic MEMS structures, especially sensor and mirror
systems, are essentially two-dimensional. This feature is the result of constructing MEMS based
on semiconductor materials and on the corresponding manufacturing processes. Another conspic-
uous characteristic of MEMS-based systems is the common use of hybrid constructions, where the
drive and analysis electronics are located on a semiconductor chip and the actual MEMS (e.g., the
sensor element) on a separate chip.
Both of these properties influence the tests realizable for MEMS using image processing and
computer vision. These are essentially performed using an incident light arrangement in which
image recording and illumination take place at the same angle. The transmissive light arrangement
(which can be much better controlled), where the object to be tested is situated between the
illumination source and the image recording system, could be used to advantage if MEMS were more
three-dimensional in shape. This is increasingly becoming the case.
From the point of view of image processing and computer vision, the solutions listed here are
possible for the following examples of testing tasks:
• Locate test object or regions of interest: Test objects need to be located if their positions
vary in relation to one another and/or in relation to the reference system of the
observation area. This is often the case in systems of hybrid constructions, systems where
individual components are located relatively inaccurately next to one another. Regions
of interest need to be located, for example, in cases in which MEMS components are
individually processed or adjusted (e.g., when measuring the cross sections of laser-
trimmed resistors and capacitors) or in the case of a high-resolution measurement or
test system for a measuring area that is to be determined using a small measurement
range (multiscaled measuring or testing).
• Position recognition: The position of components needs to be recognized if they are not
aligned for a test or if their positions, when installed, show degrees of freedom (e.g.,
resistors, diodes, or gears). Another example is when tests need to be carried out on the
components themselves, such as the recognition of component coding or the measurement
of geometric features.
• Measuring geometric features: Investigating a component from a metrological point of
view shows whether production tolerances have been adhered to or not; these affect both
the mechanical and electrical behavior of the object. The location of geometric features
using contour or edge recognition and the calculations based on them to obtain straight-
ness, circularity, length, enclosed surfaces, angle, distance, diameter, etc. form the prin-
ciples of optical metrology.
• Presence verification: The monitoring of production and assembly processes often
requires a purely qualitative statement regarding certain features without specific knowl-
edge of their geometrical characteristics. For example, in production processes, tool
breakage can be monitored by checking a work piece for the presence of bore holes,
grooves, etc. With assembly processes, the focus of interest is usually on the final
assembled item, i.e., the presence of all components requiring assembly.
• Fault detection: In contrast to presence verification, in the case of fault detection, features
are checked for deviations from required standards. To detect faults, for example, text
recognition is used for reading identification markings on components, color checking
for verifying color-coded components, and texture analysis for investigating structured
surfaces for flaws (e.g., scratches).
In order to solve these examples of measuring and verification tasks based on two-dimensional
image data, image processing and computer vision have a whole range of proven processing and
interpretation methods available. Shading correction, the averaging of image series, spatial and
morphological filtering, edge detection, pattern and geometric feature matching, segmentation,
connectivity analysis, and the metrology of geometric features denote some of these methods. Other
techniques, required, for example, in interferometry, spectroscopy, or holography and which also
permit imaging metrology, are described in Chapters 5, 6, and 12.
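To make the interplay of these methods more concrete, the following minimal sketch chains spatial filtering, segmentation, connectivity analysis, and a geometric fit to measure the diameter of a circular feature such as a bore hole. It is an illustration only, not a recipe from this chapter; it assumes the OpenCV and NumPy libraries, and the file name and the pixel scale are hypothetical.

import cv2
import numpy as np

# Hypothetical input: a grayscale incident-light image of a microcomponent.
image = cv2.imread("component.png", cv2.IMREAD_GRAYSCALE)

# Spatial filtering: suppress sensor noise before segmentation.
smoothed = cv2.GaussianBlur(image, (5, 5), 0)

# Segmentation: separate the dark feature from the bright background (Otsu).
_, binary = cv2.threshold(smoothed, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Connectivity analysis: keep the largest connected region as the feature.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_NONE)
feature = max(contours, key=cv2.contourArea)

# Metrology of geometric features: fit a minimal enclosing circle.
(cx, cy), radius_px = cv2.minEnclosingCircle(feature)

UM_PER_PIXEL = 0.8  # assumed lateral calibration of the lens/sensor setup
print(f"center = ({cx:.1f}, {cy:.1f}) px, "
      f"diameter = {2.0 * radius_px * UM_PER_PIXEL:.2f} um")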
(Figure: principal components of an image processing and computer vision system — light, test object, camera/lens, computer with frame grabber, image processing and computer vision software, and the supporting theory from the literature.)
An unsuitable choice of any of these components can lead to false conclusions. As far as metrological tasks are concerned, it is essential to understand
the imaging properties of lenses. For these reasons, the following sections are concerned with the
most important aspects of illumination, imaging sensors, camera technology, and the imaging
properties of lenses.
(Figure: decrease in the intensity of light from a point source at distances r, 2r, and 3r.)
(Figure: white light entering a prism and emerging as rainbow-colored light, from red to violet.)
FIGURE 1.5 Additive (left) and subtractive (right) colors.
(Figure 1.6: reflection of light by a red colored object and by a red and green colored object.)
If colored lights are mixed, the colors get added together and form white. If, however, colored substances (e.g., pigments)
illuminated by white light are mixed, the colors are subtracted from one another and form black,
as shown in Figure 1.5.
From this arise the questions why objects appear colored if they are illuminated by white light
and also what are the effects of colored lights in conjunction with colored objects.
If an object appearing red to the human eye is illuminated by white light, the electromagnetic
waves of the spectrum corresponding to the color red are reflected and reach the eye. The light
from all the other wavelengths is absorbed by the object’s pigments and is transformed into heat.
Naturally, the same applies for objects made up of several colors, as shown in Figure 1.6.
As monochrome cameras are often used in image processing, colored light can be used
advantageously to highlight colored objects. If, as shown in Figure 1.7, a red-and-green-colored
object is illuminated by red light, essentially only the red-colored areas of the object reflect the
light, leading to pale gray values in an image taken by a monochrome camera.
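The effect can be illustrated with a very coarse numerical model. The sketch below (Python with NumPy; the three-band reflectance and illuminant values are invented, and a real response would also depend on the sensor's spectral sensitivity) shows that under red illumination the red areas of an object yield high gray values whereas the green areas stay dark:

import numpy as np

def monochrome_response(reflectance_rgb, illuminant_rgb):
    # Coarse model: gray value = reflectance multiplied by the illuminant,
    # summed over three broad color bands (sensor sensitivity ignored).
    return float(np.dot(reflectance_rgb, illuminant_rgb))

# Invented reflectances of a red area and a green area of the test object.
red_area = np.array([0.90, 0.05, 0.05])
green_area = np.array([0.05, 0.90, 0.05])

white_light = np.array([1.0, 1.0, 1.0]) / 3.0
red_light = np.array([1.0, 0.0, 0.0])

for name, area in (("red area", red_area), ("green area", green_area)):
    print(f"{name}: white light {monochrome_response(area, white_light):.2f}, "
          f"red light {monochrome_response(area, red_light):.2f}")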
LEDs are often used as colored light sources in image processing today if the area to be
illuminated is not too large. Thanks to the many available color types, LEDs cover the entire frequency
range of visible light and the adjacent near-IR and UV bands, generate very little waste heat,
and can be switched on and off very quickly. However, if intense light sources are required, sources
of white light such as halogen lamps are used and combined with optical filters. The filters
are generally made of colored glass, which selectively allows light of a certain wavelength to pass
through and either absorbs or reflects all other wavelengths.
If a scene is illuminated with colored light but recorded with a monochrome camera, it often
makes sense to place the filter directly in front of the camera lens rather than to filter the light from
the source of white light. In this way, any stray light from the environment that does not possess
the wavelength of the transmitted filter light is also filtered out and is therefore unable to reach the
camera’s sensors (see Figure 1.8).
In cases in which the colors of an object are irrelevant, near-infrared light sources are often
used in conjunction with a filter that only transmits this light. As a result, light conditions in which
the object is illuminated are almost completely independent of visible light. If the infrared light
source is also monochromatic, i.e., only light from a narrow range of the spectrum is transmitted,
or if the filter is constructed as a narrow band-pass filter (see the following text), the chromatic
aberrations of the lens cause fewer color errors in the recorded images, which is particularly
advantageous for metrological tasks.
(Figure 1.8: a red optical filter placed at the white light source compared with a red filter placed in front of the camera, so that stray light from the environment is also blocked; in both cases the red and green colored object and the resulting grayscale image are shown.)
FIGURE 1.9 Spectral response plot of: (a) edge filters, (b) band-pass filters, (c) light sources, (d) CCD sensors.
As far as the transmission of light is concerned, the behavior of optical filters is characterized
using spectral response plots. To do this, in general, the relative transmission of the filter material
is plotted as a function of the wavelength of the electromagnetic spectrum (see Figure 1.9).
In the same way, it is also possible to characterize the spectral fractions of irradiated light from
a light source and the spectral sensitivity of light-sensitive sensors.
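These curves become useful in combination: multiplying the source radiation, the filter transmission, and the sensor sensitivity wavelength by wavelength yields the effective spectral response of the complete imaging chain. The following sketch illustrates the idea in Python with NumPy; the bell-shaped curves are invented placeholders, not data taken from Figure 1.9.

import numpy as np

# Wavelength grid in nanometers, matching the 300-800 nm range of such plots.
wavelength = np.linspace(300.0, 800.0, 501)

def bell_curve(center_nm, width_nm):
    # Placeholder curve standing in for a measured spectral response.
    return np.exp(-0.5 * ((wavelength - center_nm) / width_nm) ** 2)

source_radiation = bell_curve(600.0, 150.0)     # e.g., a halogen-like source
filter_transmission = bell_curve(630.0, 15.0)   # narrow red band-pass filter
sensor_sensitivity = bell_curve(550.0, 120.0)   # CCD-like sensitivity

# Effective response of the chain and the relative signal reaching the sensor.
system_response = source_radiation * filter_transmission * sensor_sensitivity
relative_signal = float(np.sum(system_response) * (wavelength[1] - wavelength[0]))
peak_nm = float(wavelength[np.argmax(system_response)])
print(f"effective peak near {peak_nm:.0f} nm, relative signal {relative_signal:.1f}")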
1.3.2 ILLUMINATION
The type of illumination plays a crucial role in image processing. By using an illumination suited
to the special task in question, the features of a test object requiring measurement or testing can
often be better highlighted. Thus, the processing of images is simplified drastically in many cases.
Illumination can essentially be classified into two types in accordance with the arrangement of
the light source, test object, and image-recording system. If the test object is situated between the
light source and the image-recording system, it is known as a transmissive light arrangement; if
the light source and the image-recording system are situated on the same side in relation to the
test object, it is known as an incident light arrangement.
The simplest realization of a transmissive light arrangement is represented by a light panel
(see Figure 1.10) in which the light source (usually consisting of several lamps, tube lamps, or
LEDs) is positioned behind a ground-glass screen that scatters light diffusely. This arrangement is
especially used to determine and measure the contours of flat, nontransparent test objects because
the camera sees the shadowed image of the object.
FIGURE 1.10 Transmissive light arrangement: (a) principle of a light panel, (b) silhouette of an object.
In cases in which objects are translucent or transparent, this arrangement enables internal structures
to be recognized. However, the diffusely
scattered light from a ground-glass screen is disadvantageous if the dimensions of test objects are
particularly large in the direction of the optical axis of the camera. This is because surfaces that
are almost parallel to the axis could also be illuminated, thus falsifying the true silhouette of the
test object and the resulting measurement data obtained.
This disadvantage of diffusely scattered light can be reduced if an illumination system composed
of a converging lens and a spotlight source (see Figure 1.11), a so-called collimator, is used instead
of ground-glass illumination. When the spot light source is placed at the focal point of the lens,
the light emerges as almost parallel rays. LEDs (without lens optics) are especially useful as light
sources for this because light-emitting semiconductors are so small that they are almost punctiform.
Despite the advantage of parallel light rays, illuminating collimators have the disadvantage of
producing inhomogeneous illumination. In accordance with the aforementioned law regarding the
decrease in light intensity of a spotlight source, the light intensity of light rays passing through the
lens decreases from the optical axis towards the periphery; this is known as shading or vignetting.
This effect, which is disadvantageous as far as image processing is concerned, can be avoided if
a homogeneously illuminated diffuse ground-glass light source, placed at the image plane of a lens,
is projected instead of using a spotlight source at the focal point of a collimator. If a telecentric
lens (see Subsection 1.3.3) is used instead of a normal lens to project a ground-glass light source,
homogeneous illumination with parallel light rays results.
FIGURE 1.11 Transmissive light arrangement: (a) collimator, (b) projection of a homogeneous light source.
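Residual shading can also be compensated numerically once the illumination is set up. A common correction, sketched here under the assumption that a reference image of the empty, homogeneously illuminated field of view can be recorded (OpenCV and NumPy assumed; the file names are hypothetical), is to divide the raw image by the measured illumination profile:

import cv2
import numpy as np

# Reference image of the evenly lit scene without a test object ("flat field")
# and a raw image of the object recorded under the same illumination.
flat = cv2.imread("flat_field.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
raw = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

# Divide out the illumination profile; the small epsilon avoids division by zero.
eps = 1e-6
gain = float(np.mean(flat)) / (flat + eps)
corrected = np.clip(raw * gain, 0, 255).astype(np.uint8)

cv2.imwrite("object_shading_corrected.png", corrected)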
Because many MEMS or their components are based on silicon wafers, which are, according
to their nature, essentially two-dimensional, the main type of illumination used in image processing
to test such elements is that of incident light arrangement. In contrast to the transmissive light
arrangement, by using this type of illumination the test features of the object and all other areas
are equally illuminated. This results in the features of the images recorded often being more difficult
to differentiate in subsequent image processing. From this point of view, a type of illumination that
is suited for the task is of particular importance.
With incident light arrangements, an essential difference is made between the dark field and
bright field arrangement because they highlight very different aspects of a test object.
As far as the angle of observation is concerned, with dark field arrangement (see Figure 1.12)
the test objects are illuminated from an almost perpendicular angle. As a result, in an ideal situation,
only a small amount of light or no light falls on the object surfaces that are parallel to the angle of
incidence and thus perpendicular to the angle of observation. These fields are seen by the observer
as dark areas. In contrast, all other nontransparent object details are highlighted, provided they are
not positioned in the shadows of other object details. With nontransparent test objects, dark field
illumination is, therefore, advantageous if raised sections of an object have to be highlighted.
Another example of using dark field illumination to advantage is in detecting particles on smooth,
even surfaces. Transparent test objects such as glass and plastic objects can also be examined for
inclusions or edge defects because these features stand out well against a dark background,
especially when the latter is matte black in color. Generally, spot and linear, diffuse and directed
light sources are used for dark field illumination. In many cases, low-angle glass fibers or LED
ring lights reduce the problem of shadow formation.
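For the particle-detection case mentioned above, a dark field image lends itself to a very simple analysis: particles scatter light toward the camera and appear as bright blobs on a dark background, so a global threshold followed by a connected component analysis is often sufficient. The sketch below is an illustration only (OpenCV assumed; the threshold value, the minimum blob size, and the file name are invented):

import cv2

darkfield = cv2.imread("darkfield.png", cv2.IMREAD_GRAYSCALE)

# Particles scatter light towards the camera, so they appear bright on dark.
_, mask = cv2.threshold(darkfield, 60, 255, cv2.THRESH_BINARY)

# Connected component analysis; discard single-pixel noise.
count, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
particles = [i for i in range(1, count) if stats[i, cv2.CC_STAT_AREA] >= 3]

print(f"{len(particles)} particles detected")
for i in particles:
    x, y = centroids[i]
    print(f"  particle at ({x:.1f}, {y:.1f}), area {stats[i, cv2.CC_STAT_AREA]} px")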
In contrast to the dark field arrangement, in bright field arrangement test objects are illuminated
from nearly the angle of observation. In simple cases, the light source for bright field arrangement
FIGURE 1.12 Principles of dark field illumination: (a) single-sided, (b) circular light.