AI Unit 5

Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
77 views295 pages

AI Unit 5

Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 295

CHAPTER 1
INTRODUCTION TO PATTERN RECOGNITION
SYSTEM
1.1 Overview
One of the most important human capabilities is learning from experience, from our endeavors and from our mistakes. By the age of five most of us can recognize digits and characters, whether large or small, uppercase or lowercase, rotated or tilted. We can recognize a character even when it appears on mutilated paper, is partially occluded, or sits on a cluttered background. The history of the human search for knowledge shows that we are fascinated with recognizing patterns in nature, understanding them, and distilling them into sets of rules. The question is how this experience can be used to make machines learn: how do we generalize these experiences, how do we make decisions, and how can our experience be built into a machine? These questions have been a fundamental driving force behind the development of a vast range of theories and concepts based on the natural world.
Pattern recognition systems have come a long way. Early work was confined to theoretical research in statistics, deriving models from large amounts of data. With advances in computer technology the number of practical applications increased manifold, which in turn led to further theoretical developments. At present, pattern recognition is an integral part of any machine intelligence system that exhibits decision-making capabilities, and many different mathematical techniques are used for this purpose.
Pattern recognition is concerned with the design and development of systems that
recognize patterns in data. The purpose of a pattern recognition program is to analyze a scene
in the real world and to arrive at a description of the scene which is useful for the
accomplishment of some task. The real world observations are gathered through sensors and
pattern recognition system classifies or describes these observations. A feature extraction
mechanism computes numeric or symbolic information from these observations. These
extracted features are then classified or described using a classifier. The process used for
pattern recognition consists of many procedures that ensure efficient description of the
patterns.
1.2 Pattern Recognition
Pattern recognition can be defined as the categorization of input data into identifiable
classes via the extraction of significant features or attributes of the data from a background of
irrelevant detail. Duda and Hart defined it as a field concerned with machine recognition of
meaningful regularities in noisy or complex environments. A simpler definition is the search for structure in data. According to Jain et al., pattern recognition is a general term to
describe a wide range of problems like recognition, description, classification, and grouping
of patterns. Pattern recognition is about guessing or predicting the unknown nature of an
observation, a discrete quantity such as black or white, one or zero, sick or healthy, real or
fake. Watanabe defined a pattern as “opposite of a chaos; it is an entity, vaguely defined, that
could be given a name.” For example, a pattern could be a fingerprint image, a handwritten
word, a human face, or a speech signal. Pattern recognition problems arise in a
variety of engineering and scientific disciplines such as biology, psychology, medicine,
marketing, artificial intelligence, computer vision and remote sensing.
The field of pattern recognition is concerned mainly with the description and analysis
of measurements taken from physical or mental processes. It consists of acquiring raw data
and taking actions based on the “class” of the patterns recognized in the data. Earlier it was studied as a specialized subject because of the high cost of the hardware needed to acquire the data and compute the answers. Rapid developments in computer technology made many practical applications possible, which in turn contributed to the demand for further theoretical developments.
The design of a pattern recognition system essentially involves the following three
aspects: data representation, classification and, finally, prototyping. The problem domain
dictates the choice of sensors, pre-processing techniques, representation scheme, and
decision making model.

i. Representation - It describes the patterns to be recognized;
ii. Classification - It recognizes the “category” to which the patterns provided belong;
iii. Prototyping - It is the mechanism used for developing the prototypes or models.
Prototypes are used for representing the different classes to be
recognized.
A general pattern recognition system is shown in Figure 1.1. In the first step, data is acquired and preprocessed; this step is followed by feature extraction, feature reduction and grouping of features, and finally the features are classified. In the classification step, the trained classifier assigns the input pattern to one of the pattern classes based on the measured features. The training set used during construction of the classifier is kept separate from the test set used for evaluation, so that performance is measured on data the classifier has not seen.

Figure 1.1: A general pattern recognition system

1.3 Pattern Recognition approaches
Patterns generated from the raw data depend on the nature of the data. Patterns may
be generated based on the statistical feature of the data. In some situations, underlying
structure of the data decides the type of pattern generated. In other instances, neither of these two situations holds. In such scenarios a system is developed and trained for
desired responses. Thus, for a given problem one or more of these different approaches may
be used to obtain the solution. Hence, to obtain the desired attributes for a pattern recognition
system, there are many different mathematical techniques. The four best-known approaches
for the pattern recognition are:
1. Template matching
2. Statistical classification
3. Syntactic matching
4. Neural networks
In template matching, a stored prototype of the pattern class is compared against the pattern to be recognized. In the statistical approach, the patterns are described as
random variables, from which class densities can be inferred. Classification is done based on
the statistical modeling of data. In the syntactic approach, a pattern is seen as being
composed of simple sub-patterns which are themselves built from yet simpler sub-patterns,
the simplest being the primitives. Interrelationships between these primitive patterns are
used to represent a more complex pattern. The neural network approach to pattern
recognition is strongly related to the statistical methods, since they can be regarded as
parametric models with their own learning scheme.
The models proposed need not be independent and sometimes the same pattern
recognition method exists with different interpretations. A hybrid system may be built
involving multiple models. The comparison of different approaches is summarized in
Table 1.1.

Table 1.1: Pattern Recognition Models

Approach                 | Representation             | Recognition Function           | Typical Criterion
-------------------------|----------------------------|--------------------------------|----------------------
Template matching        | Samples, pixels, curves    | Correlation, distance measure  | Classification error
Statistical              | Features                   | Discriminant function          | Classification error
Syntactic or structural  | Primitives                 | Rules, grammar                 | Acceptance error
Neural networks          | Samples, pixels, features  | Network function               | Mean square error

1.3.1 Template matching


One of the simplest and earliest approaches to pattern recognition is based on
template matching. Matching is carried out to determine the similarity between two entities
such as points, curves, or shapes of the same type. In template matching, a template or a
prototype of the pattern to be recognized is available. The pattern to be recognized is
matched against the stored template while taking into account all allowable operations such
as translation, rotation and scale changes. The similarity measure, often a correlation, may be
optimized based on the available training set. Often, the template itself is learned from the
training set. Template matching is computationally demanding, but the higher computational power of present-day processors has made the approach more feasible. Rigid template matching, even though effective in some application domains, has
a number of disadvantages. For example, it would fail if the patterns are distorted due to the
imaging process, viewpoint change, or large intra-class variations among the patterns. When
the deformation cannot be easily explained or modeled directly, deformable template models
or rubber-sheet deformations can be used to match the patterns.
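As a small illustrative sketch (not from the original text; it assumes Python with NumPy, and all names are invented for the example), rigid template matching can be implemented by sliding the template over the image and scoring each position with the normalized cross-correlation:

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide `template` over `image` and return a map of correlation scores.

    Both inputs are 2-D float arrays; the map has one entry per valid
    template position (no padding).  A score near 1 indicates a strong match.
    """
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out_h = image.shape[0] - th + 1
    out_w = image.shape[1] - tw + 1
    scores = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[i, j] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

# Toy usage: locate a small cross-shaped template inside a larger image.
template = np.array([[0., 1., 0.],
                     [1., 1., 1.],
                     [0., 1., 0.]])
image = np.zeros((10, 10))
image[4:7, 5:8] = template
scores = normalized_cross_correlation(image, template)
print("best match at", np.unravel_index(np.argmax(scores), scores.shape))  # (4, 5)
```

In practice the double loop is replaced by FFT-based correlation for speed, but the scoring idea is the same.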

1.3.2 Statistical Pattern Recognition
The statistical pattern recognition approach assumes a statistical basis for the classification of data. The patterns are modeled as random variables whose distribution parameters represent the properties of the patterns to be recognized. The main goal of statistical pattern classification is to find to which category or class a given sample belongs. Statistical methodologies such as statistical hypothesis testing, correlation and Bayes classification are used for implementing this method. The effectiveness of the representation is determined by how well patterns from different classes are separated.
To measure how close a given sample is to a class, statistical pattern recognition uses the probability of error. The Bayesian classifier is a natural choice when applying statistical methods to pattern recognition. However, its implementation is often difficult because of the complexity of the problems, especially when the dimensionality of the system is high. One can also consider a simpler solution such as a parametric classifier based
on assumed mathematical forms such as linear, quadratic or piecewise. Initially a parametric
form of the decision boundary is specified; then the best decision boundary of the specified
form is found based on the classification of training samples. Another important issue
concerned with statistical pattern recognition is the estimation of the values of the parameters
since they are not given in practice. In these systems it is always important to understand
how the number of samples affects the classifier design and performance.
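To make the idea concrete, here is a minimal sketch (illustrative only, assuming Python with NumPy; the class name and toy data are invented for the example) of a parametric Bayes classifier that fits one Gaussian density per class by maximum likelihood and assigns a new sample to the class with the largest posterior:

```python
import numpy as np

class GaussianBayesClassifier:
    """Fits one multivariate Gaussian per class and classifies by maximum posterior."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mean = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # small regularization
            prior = len(Xc) / len(X)
            self.params_[c] = (mean, cov, prior)
        return self

    def _log_posterior(self, x, mean, cov, prior):
        d = len(mean)
        diff = x - mean
        log_like = -0.5 * (diff @ np.linalg.solve(cov, diff)
                           + np.log(np.linalg.det(cov))
                           + d * np.log(2 * np.pi))
        return log_like + np.log(prior)

    def predict(self, X):
        preds = []
        for x in X:
            scores = {c: self._log_posterior(x, *p) for c, p in self.params_.items()}
            preds.append(max(scores, key=scores.get))
        return np.array(preds)

# Toy usage with two well-separated 2-D classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
clf = GaussianBayesClassifier().fit(X, y)
print(clf.predict(np.array([[0.1, 0.2], [4.2, 3.9]])))  # expected: [0 1]
```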
1.3.3 Syntactic Pattern Recognition
In many situations there exist interrelationships or interconnections between the features associated with a pattern. In such circumstances it is appropriate to assume a hierarchical relationship in which a pattern is viewed as consisting of simple sub-patterns that are themselves built from yet simpler sub-patterns. This is the basis of syntactic pattern
recognition. In this method symbolic data structures such as arrays, strings, trees, or graphs
are used for pattern representation. These data structures define the relations between
fundamental pattern components and allow the representation of hierarchical models. Thus
complex patterns can be represented from simpler ones. The recognition of an unknown
pattern is accomplished by comparing its symbolic representation with a number of predefined objects. This comparison yields a similarity measure between the unknown input and the known patterns.

The simplest symbolic data structure used for representing patterns is the string, a word made of symbols. The individual symbols in a string usually represent components of the atomic pattern. Strings, however, are one-dimensional, whereas many patterns are inherently two- or higher-dimensional. One of the most used and powerful symbolic structures for higher-dimensional data representation is the graph. A graph is composed of a set of nodes and a set of edges, in which the nodes represent simpler sub-patterns and the edges the relations between those sub-patterns. These relations may be spatial, temporal or of any other type, depending on the problem. An important subclass of the graph is the tree. A tree has three different classes of nodes: root, interior and leaf.
Trees are intermediate between strings and graphs. They are interesting for pattern
recognition applications since they are more powerful than strings as a representation of the
object and computationally less expensive than graphs. Another form of symbolic
representation is the array which is a special type of graph which has the nodes and edges
arranged in a regular form. This type of data structure is very useful for low level pattern
representation.
Structural pattern recognition is attractive because, in addition to classification, it provides a description of how the given pattern is constructed from the primitives.
method is useful in situations where the patterns have a definite structure which can be
captured in terms of a set of rules. However, due to parsing difficulties the implementation of
a syntactic approach is limited. It is very difficult to use this method for segmentation of
noisy patterns and another problem is inference of the grammar from training data. Powerful
pattern recognition capabilities can be achieved by combining the syntactic and statistical
pattern recognition techniques [Fu 1986].
1.3.4 Neural Network
Neural computing is based on the way biological neural systems store and manipulate information. It can be viewed as a parallel computing environment consisting of a large number of interconnected simple processors. Neural networks have been successfully applied in many pattern recognition and machine learning tasks. The structure of the neural system is drawn from analogies with biological neural systems. Many learning algorithms have been developed for neural networks. In these
algorithms, a set of rules defines the evolution process undertaken by the synaptic

connections of the networks, thus allowing them to learn how to perform specified tasks.
Neural network models use weighted directed graphs in which the nodes are
artificial neurons and directed edges are connections between neuron outputs and neuron
inputs. The neural networks have the ability to learn complex nonlinear input-output
relationships, use sequential training procedures, and adapt themselves to the data.
Different types of neural networks are used for pattern classification; among them, feed-forward networks and the Kohonen network are commonly used. The learning process involves
updating network architecture and connection weights so that a network can efficiently
perform a specific classification/clustering task. The neural network models are gaining
popularity because of their ability to solve pattern recognition problems, seemingly low
dependence on domain-specific knowledge, and due to the availability of efficient learning
algorithms for practitioners to use. Neural networks are also useful for implementing
nonlinear algorithms for feature extraction and classification. In addition, existing feature
extraction and classification algorithms can also be mapped on neural network architectures
for efficient implementation. In spite of the seemingly different underlying principles, most
of the well-known neural network models are implicitly equivalent or similar to classical
statistical pattern recognition methods.
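As a brief illustration (assuming scikit-learn is available; the dataset and hyper-parameters are chosen arbitrarily for the example, not taken from the text), a feed-forward network can be trained as a pattern classifier in a few lines:

```python
# A minimal sketch of a feed-forward neural network used as a pattern classifier.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One hidden layer of 16 units; the connection weights are adapted to the training data.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print("test accuracy:", net.score(X_test, y_test))
```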

1.4 Feature Extraction and Reduction


Feature selection is the process of choosing the input to the pattern recognition system. Many methods can be used to extract features, and the features selected should be relevant to the task at hand. They can be obtained using mathematical tools or by applying a feature extraction algorithm or operator to the input data. The level at which these features are extracted determines the amount of necessary preprocessing and may influence the amount of error introduced into the extracted features. Features may be represented as continuous, discrete, or discrete binary variables. During the feature extraction phase of the recognition process, objects are measured. A measurement is the value of some quantifiable property of an object. A feature is a function of one or more measurements, computed so that it quantifies some significant characteristic of the object. This process produces a set of features that, taken together, form the feature vector.
A number of transformations can be used to generate features. The basic idea is to
transform a given set of measurements to a new set of features. Transformation of features

can lead to a strong reduction of information as compared with the original input data. In
most situations a relatively small number of features is sufficient for correct recognition.
Obviously feature reduction is a sensitive procedure since if the reduction is done incorrectly
the whole recognition system may fail or may not produce the expected results. Examples of
such transformations are the Fourier transform, Empirical mode decomposition, and the Haar
transform. Feature generation via linear transformation techniques is just one of the many
possibilities. Feature extraction also depends on the application at hand and may use different
techniques such as moment-based features, chain codes, and parametric models to obtain
required features.
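The sketch below (illustrative only, assuming Python with NumPy; the particular choice of moments is just an example) computes a few moment-based features from a binary image and collects them into a feature vector:

```python
import numpy as np

def moment_features(binary_image):
    """Return a small moment-based feature vector for a binary image.

    Features: area, centroid (row, col), and the three second-order central
    moments, which describe the spread and orientation of the shape.
    """
    rows, cols = np.nonzero(binary_image)
    area = rows.size
    r_bar, c_bar = rows.mean(), cols.mean()
    mu_rr = ((rows - r_bar) ** 2).mean()
    mu_cc = ((cols - c_bar) ** 2).mean()
    mu_rc = ((rows - r_bar) * (cols - c_bar)).mean()
    return np.array([area, r_bar, c_bar, mu_rr, mu_cc, mu_rc])

# Toy usage: an elongated horizontal bar.
img = np.zeros((10, 10), dtype=int)
img[4:6, 2:9] = 1
print(moment_features(img))
```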

1.5 Cluster Analysis


The main objective in clustering techniques is to partition a given data set into
homogeneous clusters. The term homogeneous is used in the sense that all points in the same
group are similar to each other and are not similar to points in other groups. The similarity of
these points is defined according to some established criteria.
While the use of clustering in pattern recognition and image processing is relatively
recent, cluster analysis is not a new field. It has been used in other disciplines, such as
biology, psychology, geology and information retrieval. The majority of clustering algorithms find clusters of a particular shape. Most real problems involve clustering in higher dimensions, and the difficulty of naturally interpreting data embedded in a high-dimensional space is evident. Clustering is a very active field in pattern recognition and data mining, and a large number of clustering algorithms continue to appear in the literature. Most of these algorithms are based on proximity measures, although different algorithms combine a proximity measure with a clustering scheme in different ways. Clustering is a major tool in a number of applications and is used in four basic ways: data reduction, hypothesis generation, hypothesis testing, and prediction based on groups.

1.6 Classifiers Design


Classifiers are designed to perform the classification stage of the pattern recognition
system. A classifier partitions the feature space into different regions, and the border of each decision region is a decision boundary. Determining the region to which a feature vector belongs is a challenging task. There are many approaches for the design of the
classifier in a pattern recognition system, and they can be grouped into three classes: classifiers
based on Bayes decision theory, linear and nonlinear classifiers.
The first approach builds upon probabilistic arguments stemming from the statistical
nature of the generated features. This is due to the statistical variation of the patterns as well
as to possible noise obtained in the signal acquisition phase. The objective of this type of
design is to classify an unknown pattern in the most probable class as deduced from the
estimated probability density functions. Even though linear classifiers are more restricted in their use, their major advantages are simplicity and low computational demand for problems which do not require a more sophisticated nonlinear model. Examples of linear
classifiers are the perceptron algorithm and least squares methods. For problems that are not
linearly separable and for which the design of a linear classifier, even in an optimal way,
does not lead to satisfactory performance, the use of nonlinear classifiers is mandatory.
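As an illustration of a simple linear classifier (a sketch only, assuming Python with NumPy; data and names are invented), the perceptron algorithm repeatedly corrects the weight vector whenever a training sample falls on the wrong side of the decision boundary:

```python
import numpy as np

def perceptron_train(X, y, n_epochs=50, lr=1.0):
    """Classic perceptron for labels in {-1, +1}; returns (weights, bias)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:       # misclassified -> move boundary toward xi
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:                      # converged on linearly separable data
            break
    return w, b

# Toy usage: two linearly separable classes.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = perceptron_train(X, y)
print("weights:", w, "bias:", b)
print("predictions:", np.sign(X @ w + b)[:5], "...")
```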

1.7 Importance and Applications


The progress of society from the industrial era to the knowledge-based era has created a need for faster and more reliable information handling and retrieval systems. Automation in industrial production and efficient management processes have gained much importance, and the advent of the Internet and information technology has allowed the manufacturing sector to reach any part of the globe. These tendencies have pushed pattern recognition to the leading edge of computer and engineering research and applications. Today pattern recognition is an integral part of most machine intelligence systems designed for decision-making tasks, which are used in a variety of applications such as artificial intelligence systems and image understanding and analysis.
Nowadays the interest in the area of pattern recognition comes from applications such
as data mining, document classification, biometrics, financial forecasting, and computer
vision. Table 1.2 gives some more examples of applications in different domains. A common
characteristic of a number of these applications is that the available features are usually not
suggested by domain experts, but must be extracted and optimized by data-driven
procedures. It is necessary to note that there is no simple approach for optimal solutions and
that multiple methods and approaches need to be used. Accordingly, several classifiers are
combined to obtain better results in pattern recognition systems.

Table 1.2: Examples of different pattern recognition applications
Problem domain                | Application                       | Input pattern                    | Pattern classes
------------------------------|-----------------------------------|----------------------------------|---------------------------------------------
Bio-informatics               | Sequence analysis                 | DNA/protein sequence             | Known types of genes/patterns
Data mining                   | Searching for required patterns   | Points in multidimensional space | Compact and well-separated clusters
Document classification       | Internet search                   | Text document                    | Semantic categories (e.g. sports)
Document image analysis       | Reading machine for the blind     | Document image                   | Alphanumeric characters, words
Industrial automation         | Printed circuit board inspection  | Intensity or range image         | Defective / non-defective nature of product
Multimedia database retrieval | Internet search                   | Video clip                       | Video genres (e.g. action, dialogue, etc.)
Biometric recognition         | Personal identification           | Face, iris, fingerprint          | Authorized users for access control
Remote sensing                | Forecasting weather, crop yield   | Multispectral image              | Land use categories, growth pattern of crops
Speech recognition            | Speaker identification            | Speech waveform                  | Spoken words
Medicine                      | Disease identification            | Scanned image                    | Diseased areas in the body

Machine vision, for example, is an area in which pattern recognition is of clear importance. A machine vision system acquires images through a camera, and these signals are analyzed to produce a description and categorization of objects in the image. A typical

application of this type is desirable in the manufacturing industry for automated visual
inspection or automation in the assembly line.
Character recognition is another important application in the area of pattern
recognition, with major implications in automation and information handling. Optical
character recognition (OCR) systems consist of a scanning device and pattern recognition software that translates the scanned image into computer-coded characters. The advantages of storing the recognized document are clear: it is more efficient to store ASCII characters than a document image, and it makes further electronic processing possible. Besides machine-printed character recognition systems, there is great interest in systems that recognize handwritten characters. A typical commercial application of such a system is machine reading of bank checks. Another application lies in automatic mail-sorting machines for postal code identification in post offices. On-line handwriting recognition systems are another area of great commercial interest. Such systems accompany pen computers and greatly improve the human-computer interface.
Recently, there has been a great amount of effort invested in speech recognition
systems. Speech is the most natural means by which we communicate and exchange
information. The potential applications of such systems are numerous. One goal of this kind of system is to enter data into a computer via a microphone, and a major effort has been made in this direction with considerable success.
Computer-aided diagnosis is also an important and possible application of pattern
recognition systems. The task of these systems would be assisting doctors in making
diagnostic decisions. The need for a computer-aided diagnosis came from the fact that
medical data are often not so easily interpretable. So an automatic pattern recognition system
can assist a doctor with a second opinion.
In addition to the applications described above, several other uses of pattern recognition systems are of importance, such as fingerprint identification, signature authentication, text retrieval, and face and gesture recognition. The field of pattern recognition still poses great challenges, not just in applied and implementation problems but also in the theoretical framework.

Parameter Estimation
Density estimation when the density is assumed to be in a specific parametric family.
Special cases include maximum likelihood, maximum a posteriori, unbiased estimation,
and predictive estimation. See the section on Parameter estimation techniques.

Parameter estimation techniques


Maximum likelihood
A parameter estimation heuristic that seeks parameter values that maximize the
likelihood function for the parameter. This ignores the prior distribution and so is
inconsistent with Bayesian probability theory, but it works reasonably well.
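A minimal illustration (assuming Python with NumPy; not part of the original glossary): for a univariate Gaussian, the maximum likelihood estimates are simply the sample mean and the 1/n sample variance.

```python
# Maximum-likelihood estimates of the mean and variance of a univariate Gaussian.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=1000)

mu_ml = data.mean()                     # argmax of the likelihood over mu
var_ml = ((data - mu_ml) ** 2).mean()   # note the 1/n factor, not 1/(n-1)

print(f"ML mean = {mu_ml:.3f}, ML variance = {var_ml:.3f}")
```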
Maximum A Posteriori
A parameter estimation heuristic that seeks parameter values that maximize the posterior
density of the parameter. This implies that you have a prior distribution over the
parameter, i.e. that you are Bayesian. MAP estimation has the highest chance of getting
the parameter exactly right. But for predicting future data, it can be worse than Maximum
Likelihood; predictive estimation is a better Bayesian method for that purpose. MAP is
also not invariant to reparameterization.
Unbiased estimation
A parameter estimation procedure based on finding an estimator function that minimizes
average error. When the average error is zero then the estimator is "unbiased." The error
of the function is averaged over possible data sets, including ones you never observed.
The best function is then used to get parameter values. See "Pathologies of Orthodox
Statistics".
Predictive estimation
Parameter estimation consistent with Bayesian probability theory. It seeks to minimize
the expected "divergence" between the estimated distribution and the true distribution.
The divergence is measured by Kullback and Leibler's formula. The distribution which
achieves minimum divergence corresponds to integrating out the unknown parameter.
Hence predictive estimation can be approximated by averaging over several different
parameter choices. See "Inferring a Gaussian distribution", "A Comparison of Scientific
and Engineering Criteria for Bayesian Model Selection", Geisser, and Bishop.
Minimum Message Length
A parameter estimation technique similar to predictive estimation but motivated by
information theory. Consider compressing the data via a two-part code: the first part is a
parameter setting, encoded with respect to the prior, and the second part is the data,
encoded with respect to the model with that parameter. Parameters are continuous, and so
cannot be encoded exactly---they must be quantized, which introduces error. So we can't
choose the parameters which simply compress the data most; we have to choose
parameters which compress the data well even if the parameters are slightly modified.
The parameter setting which balances this tradeoff between accuracy and robustness is
the MML estimate. See
 "Estimation and Inference by Compact Coding", Wallace and Freeman, Journal
of the Royal Statistical Society B 49(3):240--265, 1987
 The Computer Journal special issue: MDL vs. MML
 "The Maximum Local Mass estimate"
 "Keeping Neural Networks Simple by Minimizing the Description Length of the
Weights"
 Minimum Message Length model selection

Some related methods:

 "Flat Minima"
 "Bayesian backpropagation over I-O functions rather than weights"

Bootstrapping
A technique for simulating new data sets, to assess the robustness of a model or to
produce a set of likely models. The new data sets are created by re-sampling with
replacement from the original training set, so each datum may occur more than once. See
"What are cross-validation and bootstrapping?" and "The Bootstrap is Inconsistent with
Probability Theory".
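A minimal sketch (illustrative, assuming Python with NumPy) of bootstrapping the mean of a small sample to assess its variability:

```python
# Use bootstrap resampling to estimate the variability of a statistic (the mean).
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=50)   # the original training set

boot_means = []
for _ in range(2000):
    # Re-sample with replacement, so each datum may occur more than once.
    resample = rng.choice(data, size=len(data), replace=True)
    boot_means.append(resample.mean())

boot_means = np.array(boot_means)
print("sample mean:", data.mean())
print("bootstrap 95% interval:", np.percentile(boot_means, [2.5, 97.5]))
```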
Bagging
Bootstrap averaging. Generate a bunch of models via bootstrapping and then average
their predictions. See "Bagging Predictors", "Why does bagging work?", and "Bayesian
model averaging is not model combination".
Monte Carlo integration
A technique for approximating integrals in Bayesian inference. To approximate the
integral of a function over a domain D, generate samples from a uniform distribution over
D and average the value of the function at those samples. More generally, we can use a
non-uniform proposal distribution, as long as we weight samples accordingly. This is
known as importance sampling (which is an integration method, not a sampling
method). For Bayesian estimation, a popular approach is to sample from the posterior
distribution, even though it is usually not the most efficient proposal distribution. Gibbs
sampling is typically used to generate the samples. Gibbs sampling employs a
succession of univariate samples (a Markov Chain) to generate an approximate sample
from a multivariate density. See "Introduction to Monte Carlo methods", "Probabilistic
Inference using Markov Chain Monte Carlo Methods", and the Markov Chain Monte
Carlo home page. Software includes BUGS and FBM.
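The sketch below (illustrative, assuming Python with NumPy; the integrand and proposal are chosen only for the example) contrasts plain Monte Carlo integration with importance sampling for a simple one-dimensional integral:

```python
# Plain Monte Carlo and importance-sampling estimates of the integral of
# f(x) = exp(-x^2) over [0, 1] (true value is about 0.7468).
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.exp(-x ** 2)
n = 100_000

# Plain Monte Carlo: uniform samples on the domain, average f.
x_uniform = rng.uniform(0.0, 1.0, n)
plain_estimate = f(x_uniform).mean()

# Importance sampling: draw from a non-uniform proposal q(x) = (4 - 2x)/3,
# which roughly follows the decreasing shape of f, and weight by p(x)/q(x)
# (the target density p is uniform on [0, 1]).
u = rng.uniform(0.0, 1.0, n)
x_prop = 2.0 - np.sqrt(4.0 - 3.0 * u)      # inverse-CDF sampling from q
weights = 3.0 / (4.0 - 2.0 * x_prop)
is_estimate = (f(x_prop) * weights).mean()

print("plain MC:", plain_estimate, "importance sampling:", is_estimate)
```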
Regularization
Any estimation technique designed to impose a prior assumption of "smoothness" on the
fitted function. See "Regularization Theory and Neural Networks Architectures".
Expectation-Maximization (EM)
An optimization algorithm based on iteratively maximizing a lower bound. Commonly
used for maximum likelihood or maximum a posteriori estimation, especially fitting a
mixture of Gaussians. See

 "Expectation-Maximization as lower bound maximization"


 "A Gentle Tutorial on the EM Algorithm"
 "Convexity, Maximum Likelihood and All That"
 "Very Fast EM-based Mixture Model Clustering using Multiresolution kd-trees"
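As a concrete illustration of the algorithm described above (a sketch only, assuming Python with NumPy; the data and initial values are invented), EM for a two-component one-dimensional Gaussian mixture alternates between computing responsibilities (E-step) and re-estimating the parameters (M-step):

```python
import numpy as np

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 200)])

# Initial guesses for the mixing weight, means and variances.
pi, mu, var = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])

for _ in range(100):
    # E-step: responsibility of component 1 for each point.
    p1 = pi * gauss(data, mu[0], var[0])
    p2 = (1 - pi) * gauss(data, mu[1], var[1])
    r = p1 / (p1 + p2)

    # M-step: weighted maximum-likelihood updates.
    pi = r.mean()
    mu = np.array([(r * data).sum() / r.sum(),
                   ((1 - r) * data).sum() / (1 - r).sum()])
    var = np.array([(r * (data - mu[0]) ** 2).sum() / r.sum(),
                    ((1 - r) * (data - mu[1]) ** 2).sum() / (1 - r).sum()])

print("weights:", pi, 1 - pi)
print("means:", mu, "variances:", var)   # should be near (-2, 3) and (1, 2.25)
```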

Variational bound optimization


A catch-all term for variations on the EM algorithm which use alternative lower bounds
(usually simpler ones). The particular lower bound used by EM can lead to an intractable
E-step. With a looser bound, the iterative update is more tractable, at the cost of
increasing the number of iterations. Another approach, though less often used, is to use a
tighter bound, for faster convergence but a more expensive update. See "An introduction
to variational methods for graphical models", "Notes on variational learning", "Exploiting
tractable substructures in intractable networks".
Variational bound integration
To approximate the integral of a function, lower bound the function and then integrate the
lower bound. Not to be confused with Jensen bound integration. Variational Bayes
applies this technique to the likelihood function for integrating out parameters. The EM
bound can be used for this, or any of the simpler bounds used for variational bound
optimization. See

 "Using lower bounds to approximate integrals"


 "Variational Bayes for 1-dimensional mixture models"
 "Ensemble Learning for Hidden Markov Models"
 "Inferring parameters and structure of latent variable models by variational
Bayes"
 "Bayesian parameter estimation via variational methods"

Jensen bound integration


To approximate the integral of a function, apply Jensen's inequality to turn the integral
into a product which lower-bounds the integral. The bound has free parameters which are
chosen to make it as tight as possible. Unlike variational bound optimization, the
integrand itself does not need to be bounded, and very different answers can result from
the two methods. See

 "Ensemble Learning for Multi-Layer Networks"


 "Bayesian Model Selection for Support Vector Machines, Gaussian Processes and
Other Kernel Classifiers"
 "Improving the mean field approximation via the use of mixture distributions"

Expectation Propagation
To approximate the integral of a function, approximate each factor by sequential
moment-matching. For dynamic systems, it generalizes Iterative Extended Kalman
filtering. For Markov nets, it generalizes belief propagation. See A roadmap to research
on EP.
Newton-Raphson
A method for function optimization which iteratively maximizes a local quadratic
approximation to the objective function (not necessarily a lower bound as in Expectation-
Maximization). If the local approximation is not quadratic, we have a generalized
Newton method. See "Beyond Newton's method".
Iteratively Reweighted Least Squares
A method for maximum likelihood estimation of a generalized linear model. It is
equivalent to Newton-Raphson optimization. See McCullagh&Nelder.
Back-propagation
A method for maximum likelihood estimation of a feed-forward neural network. It is
equivalent to steepest-descent optimization. See Bishop.
Backfitting
A method for maximum likelihood estimation of a generalized additive regression. You
iteratively optimize each f_i while holding the others fixed. It is equivalent to the Gauss-
Seidel method in numerical linear algebra. See Hastie&Tibshirani and "Bayesian
backfitting".
Kalman filtering
An algorithm for inferring the next state or next observation of a Linear Dynamical
System. By making the state a constant, it can also be used for incrementally building up
a maximum-likelihood estimate of a parameter. See "An Introduction to the Kalman
Filter" (with links), "Dynamic Linear Models, Recursive Least Squares and Steepest
Descent Learning", "From Hidden Markov Models to Linear Dynamical Systems", and
Gelb (Ch.4).
Extended Kalman filtering
Kalman filtering applied to general dynamical systems with Gaussian noise. At each step,
the dynamical system is approximated with a linear dynamical system, to which the
Kalman filter is applied. The linear approximation can be iteratively refined to improve
the accuracy of the Kalman filter output. Despite the name, extended Kalman filtering is
not really different from Kalman filtering. See Gelb.
Relaxation labeling
An optimization algorithm for finding the most probable configuration of a Markov
random field. It generalizes the Viterbi algorithm for Markov chains. Other approaches to
this problem include Iterated Conditional Modes, simulated annealing, network flow, and
variational lower bounds. See "Foundations of Relaxation Labeling Processes" (Hummel
and Zucker; appears in Readings in Computer Vision), "Self Annealing: Unifying
deterministic annealing and relaxation labeling", "Probabilistic relaxation", and Li.
Deterministic annealing
An optimization technique where the true objective function is morphed into a convex
function by a continuous convexity parameter. Start by solving the convex problem and
gradually morph to the true objective while iteratively recomputing the optimum. It is also called "graduated nonconvexity"; in statistical physics, the convexity parameter often corresponds to temperature. See

 "Deterministic Annealing for Clustering, Compression, Classification,


Regression, and Related Optimization Problems" (Rose, Proc. IEEE Nov 1998)
 "Deterministic Annealing Variant of the EM Algorithm" (Ueda and Nakano, NIPS
7)
 "Statistical Physics, Mixtures of Distributions and the EM Algorithm"
 "Self Annealing: Unifying deterministic annealing and relaxation labeling"
 "Distributional Clustering of English Words"
 "On the Generalization of Deterministic Annealing as Constrained Optimisation"

Boosting
A technique for combining models based on adaptive resampling: different data is given
to different models. The idea is to successively omit the "easy" data points, which are
well modeled, so that the later models focus on the "hard" data. See Schapire's page,
"Additive Logistic Regression: a Statistical View of Boosting", "Prediction Games and
Arcing Algorithms", and "Half&Half Bagging and Hard Boundary Points".
Empirical Risk Minimization
A parameter estimation heuristic that seeks parameter values that minimize the "risk" or
"loss" that the model incurs on the training data. In classification, a "loss" usually means
an error, so it corresponds to choosing the model with lowest training error. In regression,
"loss" usually means squared error, so ERM corresponds to choosing the curve with
lowest squared error on the training data. It is thus the most basic (and naive) estimation
heuristic. This method only uses a loss function appropriate for the problem and does not
utilize a probabilistic model for the data. See "Empirical Risk Minimization is an
incomplete inductive principle".
Principal Component Analysis
Principal Component Analysis: a statistical technique used to examine the interrelations among a set of variables in order to identify the underlying structure of those variables. Also called factor analysis.

It is a non-parametric analysis: the answer is unique and independent of any hypothesis about the data distribution. These two properties can be regarded as weaknesses as well as strengths:

• Since the technique is non-parametric, no prior knowledge can be incorporated.
• PCA data reduction often incurs a loss of information.


The assumptions of PCA:

1. Linearity
• Assumes the data set to be linear combinations of the variables.

2. The importance of mean and covariance
• There is no guarantee that the directions of maximum variance will contain good features for discrimination.

3. That large variances have important dynamics
• Assumes that components with larger variance correspond to interesting dynamics and lower ones correspond to noise.

Where regression determines a line of best fit to a data set, factor analysis determines several orthogonal lines of best fit to the data set.

Orthogonal: axes that are perpendicular to each other in n-dimensional space.
n-Dimensional Space: the variable sample space. There are as
many dimensions as there are variables, so in a data set with 4
variables the sample space is 4-dimensional.
Components: a linear transformation that chooses a variable
system for the data set such that the greatest variance of the data
set comes to lie on the first axis (then called the principal
component), the second greatest variance on the second axis,
and so on ...

Note that components are uncorrelated, since in the


sample space they are orthogonal to each other.

[Figure: orthogonal vs. non-orthogonal axes]
Locations along each component (or eigenvector) are then
associated with values across all variables. This association
between the components and the original variables is called the
eigenvalue.

In multivariate (multiple-variable) space, the correlation between a component and the original variables is called the component loading.

Component loadings are analogous to correlation coefficients; squaring them gives the amount of explained variation. Therefore the component loadings tell us how much of the variation in a variable is explained by the component.

If we use this technique on a data set with a large number of variables, we can compress the amount of explained variation to just a few components.

What follows is an example of Principal Component Analysis using canal town commodity production figures (percentage of total production) for 1845.
Towns: Columbia, Middletown, Harrisburg, Newport, Lewistown, Hollidaysburg, Johnstown, Blairsville, Pittsburgh, Dunnsburg, Williamsport, Northumberland, Berwick, Easton, New Hope, Bristol, Philadelphia, Paoli, Parkesburg, Lancaster.

Variables: Corn, Wheat, Flour, Whiskey, Groceries, Dry Goods.
Total Variance Explained

            Initial Eigenvalues              Extraction Sums of Squared Loadings   Rotation Sums of Squared Loadings
Component   Total   % of Variance   Cum. %   Total   % of Variance   Cum. %        Total   % of Variance   Cum. %
1           2.533   42.211          42.211   2.533   42.211          42.211        1.887   31.452          31.452
2           1.565   26.084          68.295   1.565   26.084          68.295        1.880   31.328          62.780
3           1.504   25.073          93.368   1.504   25.073          93.368        1.835   30.587          93.368
4            .174    2.901          96.269
5            .119    1.988          98.257
6            .105    1.743         100.000

Extraction Method: Principal Component Analysis.

In this case, 3 components contain 93.368% of the variation of the 6 original variables. Note that there are as many components as original input variables.

Component 1 explains 42.211% of the variation, component 2 explains 26.084%, and component 3 explains 25.073%. The remaining 3 components explain only 6.632%.

A scree plot graphs the amount of variation explained by each component.

[Scree plot figure, showing the cut-off point]

Rotated Component Matrix (a)

             Component 1   Component 2   Component 3
Corn            -.065          .936          .214
Wheat           -.104          .952         -.057
Groceries        .962         -.092         -.086
Dry Goods        .963         -.074         -.092
Flour           -.126         -.097          .954
Whiskey         -.057          .275          .927

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
(a) Rotation converged in 4 iterations.

Highest Component Loading


Component 1: Groceries and dry goods.
Component 2: Corn and wheat.
Component 3: Flour and whiskey.
Note how the variables that make up each component
fall close to each other in the 3-dimensional sample space.
What do these components mean (how do we interpret them)?

• Component 1 (groceries and dry goods): these two items are highly processed and value-added.
• Component 2 (corn and wheat): these two items are not processed (raw) and have no value added.
• Component 3 (flour and whiskey): these two items are moderately processed and value-added.

It appears that the components are indicators of either the amount of processing or value adding (or both). The most challenging part of PCA is interpreting the components.

1. The higher the component loadings, the more important that variable is to the component.

2. Combinations of positive and negative loadings are interpreted together.

3. The specific sign of the loading is not important.

4. ALWAYS use the ROTATED component matrix!!

Component score: the new variable value based on the original variables, summed over all variables:

Score_ik = Σ_j D_ij L_jk

where D_ij is the standardized value for observation i on variable j, and L_jk is the loading of variable j on component k.

Examining the component scores for each town may give some
clues as to the interpretation of the components.
Component Score Box Plot
Easton, Philadelphia, and
Northumberland are the
only towns that load highly
on a single component.
Scoring highly on a single component simply means that the
original variable values for these locations are overwhelmingly
explained by a single component.

In this case, it means that the variation among ALL of the


variables for Philadelphia (for example) is more completely
explained by a single component composed of groceries and
dry goods.
Town Component Scores

Town             Component 1   Component 2   Component 3
Columbia             0.31989      -0.44216      -0.44369
Middletown          -0.37101      -0.24531      -0.47020
Harrisburg          -0.00974      -0.06105       0.32792
Newport             -0.38678       0.40935      -0.62996
Lewistown           -0.33132       1.27318      -0.52170
Hollidaysburg       -0.44018      -0.49770      -0.59722
Johnstown           -0.44188      -0.48447      -0.63736
Blairsville         -0.42552      -0.38759      -0.51107
Pittsburgh          -0.13834      -0.75021       1.05942
Dunnsburg           -0.42728       0.03072      -0.73622
Williamsport        -0.28812      -0.47716      -0.62453
Northumberland      -0.00398       3.82169       0.09538
Berwick             -0.36503      -0.46398      -0.60501
Easton              -0.02349      -0.00587       3.28970
New Hope            -0.40354      -0.42291      -0.25891
Bristol              0.60267      -0.32311      -0.50086
Philadelphia         4.08309      -0.14799      -0.24733
Paoli               -0.41174      -0.35103      -0.38109
Parkesburg          -0.25890       0.05125       0.92910
Lancaster           -0.27880      -0.52566       1.46363

(Slide callout, pointing near Harrisburg: "... town because it loads on all components equally.")
Component 1: Processed Goods
The green towns were producers of processed goods, while the red towns were consumers of those goods.

Component 2: Non-Processed Goods
The green towns were producers of non-processed goods, while the red towns were consumers of those goods.

Component 3: Partially Processed Goods
The green towns were producers of partially processed goods, while the red towns were consumers of those goods.
What information did PCA provide concerning the goods exported by the canal towns?

• The goods fell into recognizable categories (highly processed, moderately processed, not processed).
• A small number of towns were responsible for exporting most of these goods.
• The location of these towns relative to the goods they produced makes sense:
  - Industrial towns on the Columbia railroad exported finished goods.
  - Small farming towns on the canal exported produce.
  - Midsize towns exported moderately processed goods.

Without the use of Principal Component Analysis these associations would be difficult to determine.

Principal Component Analysis is also used to remove correlation among independent variables that are to be used in multivariate regression analysis.

Correlation Matrix

            Corn    Wheat   Groceries   Dry Goods   Flour   Whiskey
Corn        1.000    .812     -.163       -.160      .108     .450
Wheat        .812   1.000     -.183       -.157     -.096     .198
Groceries   -.163   -.183     1.000        .883     -.191    -.164
Dry Goods   -.160   -.157      .883       1.000     -.198    -.163
Flour        .108   -.096     -.191       -.198     1.000     .806
Whiskey      .450    .198     -.164       -.163      .806    1.000

Note that PCA 1 is highly correlated with dry goods and groceries, but uncorrelated with PCA 2 and PCA 3:

          Dry Goods   Groceries   PCA 2   PCA 3
PCA 1       0.963       0.962     0.000   0.000
Principal Components Analysis
Covariance
• Variance and Covariance are a measure of the “spread” of a set
of points around their center of mass (mean)

• Variance – measure of the deviation from the mean for points in


one dimension e.g. heights

• Covariance as a measure of how much each of the dimensions


vary from the mean with respect to each other.

• Covariance is measured between 2 dimensions to see if there is


a relationship between the 2 dimensions e.g. number of hours
studied & marks obtained.

• The covariance between one dimension and itself is the variance


Covariance

cov(X, Y) = Σ_{i=1..n} (X_i – X̄)(Y_i – Ȳ) / (n – 1)

• So, if you had a 3-dimensional data set (x,y,z), then you could
measure the covariance between the x and y dimensions, the y
and z dimensions, and the x and z dimensions. Measuring the
covariance between x and x , or y and y , or z and z would give
you the variance of the x , y and z dimensions respectively.
Covariance Matrix
• Representing covariance between dimensions as a matrix, e.g. for 3 dimensions:

        | cov(x,x)  cov(x,y)  cov(x,z) |
  C  =  | cov(y,x)  cov(y,y)  cov(y,z) |
        | cov(z,x)  cov(z,y)  cov(z,z) |

• The diagonal contains the variances of x, y and z
• cov(x,y) = cov(y,x), hence the matrix is symmetric about the diagonal
• N-dimensional data will result in an N x N covariance matrix
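A minimal sketch (assuming Python with NumPy; the toy data are invented) of building the covariance matrix for a small 3-dimensional data set:

```python
# Covariance matrix of a 3-dimensional data set: rows are observations,
# columns are the dimensions x, y, z.
import numpy as np

data = np.array([[2.0, 3.0, 1.0],
                 [4.0, 5.0, 0.5],
                 [6.0, 8.0, 0.2],
                 [8.0, 9.0, 0.1]])

C = np.cov(data, rowvar=False)   # 3 x 3 matrix, using the 1/(n-1) normalization
print(C)
print("variance of x:", C[0, 0], "cov(x, y):", C[0, 1])
```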
Covariance
• What is the interpretation of covariance
calculations?
e.g.: 2 dimensional data set
x: number of hours studied for a subject
y: marks obtained in that subject
covariance value is say: 104.53
what does this value mean?
[Figure: covariance examples]
Covariance
• The exact value is not as important as its sign.

• A positive value of covariance indicates that both dimensions increase or decrease together, e.g. as the number of hours studied increases, the marks in that subject increase.

• A negative value indicates that while one increases the other decreases, or vice versa, e.g. active social life at PSU vs. performance in the CS dept.

• If the covariance is zero, the two dimensions are independent of each other, e.g. heights of students vs. the marks obtained in a subject.
Covariance
• Why bother with calculating covariance
when we could just plot the 2 values to
see their relationship?
Covariance calculations are used to find
relationships between dimensions in high
dimensional data sets (usually greater
than 3) where visualization is difficult.
PCA
• principal components analysis (PCA) is a technique
that can be used to simplify a dataset
• It is a linear transformation that chooses a new
coordinate system for the data set such that
greatest variance by any projection of the data
set comes to lie on the first axis (then called the
first principal component),
the second greatest variance on the second axis,
and so on.
• PCA can be used for reducing dimensionality by
eliminating the later principal components.
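The following sketch (illustrative, assuming Python with NumPy) carries out PCA exactly as described above: centre the data, eigendecompose the covariance matrix, sort by decreasing eigenvalue, and project onto the leading components:

```python
import numpy as np

def pca(X, n_components):
    """Return (projected data, components, eigenvalues) for the top components."""
    X_centred = X - X.mean(axis=0)
    C = np.cov(X_centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)          # ascending order for a symmetric matrix
    order = np.argsort(eigvals)[::-1]             # sort by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    components = eigvecs[:, :n_components]
    return X_centred @ components, components, eigvals

# Toy usage: correlated 2-D data reduced to 1 dimension.
rng = np.random.default_rng(0)
t = rng.normal(size=200)
X = np.column_stack([t, 2 * t + rng.normal(scale=0.3, size=200)])
Z, components, eigvals = pca(X, n_components=1)
print("explained variance ratio:", eigvals[0] / eigvals.sum())
```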
PCA Toy Example

Consider the following six 3-D points (one point per column):

  1  2  4  3  5  6
  2  4  8  6 10 12
  3  6 12  9 15 18

If each component is stored in a byte, we need 3 x 6 = 18 bytes.
PCA Toy Example
Looking closer, we can see that all the points are related geometrically: each one is the same point (1, 2, 3), scaled by a factor:

  (1, 2, 3)  = 1 * (1, 2, 3)      (2, 4, 6)   = 2 * (1, 2, 3)      (4, 8, 12)  = 4 * (1, 2, 3)
  (3, 6, 9)  = 3 * (1, 2, 3)      (5, 10, 15) = 5 * (1, 2, 3)      (6, 12, 18) = 6 * (1, 2, 3)

They can be stored using only 9 bytes (50% savings!):


Store one point (3 bytes) + the multiplying constants (6 bytes)
Geometrical Interpretation:
View each point as a position in 3-D space, with coordinates (p1, p2, p3).

[Figure: the six points plotted in 3-D space]

But in this example, all the points happen to belong to a line: a 1-D subspace of the original 3-D space.

Geometrical Interpretation:
Consider a new coordinate system where one of the axes is along the direction of the line:

[Figure: the same points with a rotated axis aligned with the line]

In this coordinate system, every point has only one non-zero coordinate: we only need to store the direction of the line (3 bytes) and the non-zero coordinate of each point (6 bytes).
Principal Component Analysis
(PCA)
• Given a set of points, how do we know
if they can be compressed like in the
previous example?
– The answer is to look into the
correlation between the points
– The tool for doing this is called PCA
PCA
• By finding the eigenvalues and eigenvectors of the
covariance matrix, we find that the eigenvectors with
the largest eigenvalues correspond to the dimensions
that have the strongest correlation in the dataset.
• This is the principal component.
• PCA is a useful statistical technique that has found
application in:
– fields such as face recognition and image compression
– finding patterns in data of high dimension.
PCA Theorem
Let x_1, x_2, ..., x_n be a set of n N x 1 vectors and let x̄ be their average:

  x̄ = (1/n) Σ_{i=1..n} x_i

PCA Theorem
Let X be the N x n matrix with columns x_1 – x̄, x_2 – x̄, ..., x_n – x̄:

  X = [ x_1 – x̄   x_2 – x̄   ...   x_n – x̄ ]

Note: subtracting the mean is equivalent to translating the coordinate system to the location of the mean.

PCA Theorem
Let Q = X X^T be the N x N matrix.

Notes:
1. Q is square
2. Q is symmetric
3. Q is the covariance matrix [aka scatter matrix]
4. Q can be very large (in vision, N is often the number of pixels in an image!)

PCA Theorem
Theorem: Each x_j can be written as

  x_j = x̄ + Σ_{i=1..n} g_ji e_i

where the e_i are the n eigenvectors of Q with non-zero eigenvalues.

Notes:
1. The eigenvectors e_1, e_2, ..., e_n span an eigenspace
2. e_1, e_2, ..., e_n are N x 1 orthonormal vectors (directions in N-dimensional space)
3. The scalars g_ji are the coordinates of x_j in that space.
Using PCA to Compress Data
• Expressing x in terms of e_1 ... e_n has not changed the size of the data.
• However, if the points are highly correlated, many of the coordinates of x will be zero or close to zero.
  (Note: this means they lie in a lower-dimensional linear subspace.)

Using PCA to Compress Data
• Sort the eigenvectors e_i according to their eigenvalues:

  λ_1 >= λ_2 >= ... >= λ_n

• Assuming that the smallest eigenvalues λ_{k+1}, ..., λ_n are negligible,
• Then each point can be approximated using only the first k eigenvectors:

  x_j ≈ x̄ + Σ_{i=1..k} g_ji e_i
PCA Example –STEP 1
http://kybele.psych.cornell.edu/~edelman/Psych-465-Spring-2003/PCA-tutorial.pdf

• DATA (x, y):

    2.5  2.4
    0.5  0.7
    2.2  2.9
    1.9  2.2
    3.1  3.0
    2.3  2.7
    2.0  1.6
    1.0  1.1
    1.5  1.6
    1.1  0.9

• Subtract the mean from each dimension; the mean becomes the new origin of the data from now on.
PCA Example –STEP 2
• Calculate the covariance matrix
cov = .616555556 .615444444
.615444444 .716555556

• Since the non-diagonal elements in this covariance matrix are positive, we should expect that both the x and y variables increase together.
PCA Example –STEP 3
• Calculate the eigenvectors and eigenvalues
of the covariance matrix
eigenvalues  =  .0490833989,  1.28402771

eigenvectors =  | -.735178656   -.677873399 |
                |  .677873399   -.735178656 |
PCA Example –STEP 3
•eigenvectors are plotted
as diagonal dotted lines
on the plot.
•Note they are
perpendicular to each
other.
•Note one of the
eigenvectors goes through
the middle of the points,
like drawing a line of best
fit.
•The second eigenvector
gives us the other, less
important, pattern in the
data, that all the points
follow the main line, but
are off to the side of the
main line by some
amount.
PCA Example –STEP 4
• Feature Vector
FeatureVector = (eig1 eig2 eig3 … eign)
We can either form a feature vector with both of the
eigenvectors:
-.677873399 -.735178656
-.735178656 .677873399
or, we can choose to leave out the smaller, less
significant component and only have a single column:
- .677873399
- .735178656
PCA Example –STEP 5
• Deriving new data coordinates
FinalData = RowFeatureVector x RowZeroMeanData
RowFeatureVector is the matrix with the
eigenvectors in the columns transposed so that the
eigenvectors are now in the rows, with the most
significant eigenvector at the top
RowZeroMeanData is the mean-adjusted data
transposed, i.e. the data items are in each
column, with each row holding a separate
dimension.

Note: this is essentially a rotation of the coordinate axes so that the higher-variance axes come first.
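For reference, the worked example above can be reproduced with a few lines of NumPy (an illustrative sketch; the sign of an eigenvector may differ from the tutorial, which does not affect the result):

```python
# Reproduce the steps of the worked example: subtract the mean, form the
# covariance matrix, take its eigenvectors, and project onto the main one.
import numpy as np

data = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
                 [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9]])

# STEP 1-2: centre the data and compute the covariance matrix.
centred = data - data.mean(axis=0)
cov = np.cov(centred, rowvar=False)
print(cov)                      # approx. [[0.6166, 0.6154], [0.6154, 0.7166]]

# STEP 3: eigenvalues and eigenvectors of the covariance matrix.
eigvals, eigvecs = np.linalg.eigh(cov)
print(eigvals)                  # approx. [0.0491, 1.2840]

# STEP 4-5: keep only the most significant eigenvector and derive the new
# (one-dimensional) coordinates of each point.
principal = eigvecs[:, np.argmax(eigvals)]
final_data = centred @ principal
print(final_data)
```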
PCA Example : Approximation
• If we reduced the dimensionality,
obviously, when reconstructing the data
we would lose those dimensions we
chose to discard. In our example let us
assume that we considered only the x
dimension…
PCA Example : Final Approximation

[Figure: the original 2-D point cloud and its approximation using a one-eigenvector basis]
Another way of thinking
about Principal component
• direction of maximum variance in the input
space
• happens to be same as the principal
eigenvector of the covariance matrix
One-dimensional projection: find the projection that maximizes the variance.
Covariance to variance

• From the covariance matrix, the variance of any projection can be calculated.
• Let w be a unit vector; the variance of the projection w^T x is

  Var(w^T x) = E[(w^T x)^2] – (E[w^T x])^2 = w^T C w = Σ_{ij} w_i C_ij w_j
Maximizing variance

• The principal eigenvector of C is the one with the largest eigenvalue:

  w* = argmax_{||w|| = 1} w^T C w

  λ_max(C) = max_{||w|| = 1} w^T C w = (w*)^T C w*
Implementing PCA

• Need to find “first” k eigenvectors of Q:

Q is N x N (Again, N could be the number of pixels in


an image. For a 256 x 256 image, N = 65536 !!)
Don’t want to explicitly compute Q!!!!
Singular Value Decomposition (SVD)

Any m x n matrix X can be written as the product of 3 matrices:

  X = U D V^T

where:
• U is m x m and its columns are orthonormal vectors
• V is n x n and its columns are orthonormal vectors
• D is m x n diagonal, and its diagonal elements are called the singular values of X, and are such that:

  σ_1 >= σ_2 >= ... >= σ_n >= 0
SVD Properties

• The columns of U are the eigenvectors of XXᵀ.
• The columns of V are the eigenvectors of XᵀX.
• The squares of the diagonal elements of D are the eigenvalues of XXᵀ and XᵀX.
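A minimal sketch of PCA via the SVD, which avoids forming the N x N matrix explicitly (the rows of X are assumed to be mean-centred observations; the data and variable names are illustrative):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))               # stand-in data matrix (observations in rows)
Xc = X - X.mean(axis=0)                     # mean-centre the observations

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt                             # rows: principal axes (eigenvectors of Xc'Xc)
explained_var = s**2 / (len(Xc) - 1)        # eigenvalues of the covariance matrix
scores = Xc @ Vt.T                          # data projected onto the principal axes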
Algebra of Principal Component Analysis

Data (5 objects, 2 variables):

        2  1                                                  – 3.2  – 1.6
        3  4                                                  – 2.2    1.4
  Y  =  5  0    Centre each column on its mean:  Yc = Y – Ȳ = – 0.2  – 2.6
        7  6                                                    1.8    3.4
        9  2                                                    3.8  – 0.6

Covariance matrix (2 variables):

  S = 1/(n – 1) · Yc'Yc  =  [ 8.2  1.6 ]
                            [ 1.6  5.8 ]

Equation for the eigenvalues and eigenvectors of S:  (S – λk I) uk = 0

  Eigenvalues:  λ1 = 9,  λ2 = 5      Matrix of eigenvalues:  Λ = [ 9  0 ]
                                                                 [ 0  5 ]

  Matrix of eigenvectors:  U = [ 0.8944  – 0.4472 ]
                               [ 0.4472    0.8944 ]

Positions of the 5 objects in ordination space:  F = (Y – Ȳ) U

      – 3.2  – 1.6                               – 3.578    0
      – 2.2    1.4      0.8944  – 0.4472         – 1.342    2.236
  F = – 0.2  – 2.6  ×   0.4472    0.8944     =   – 1.342  – 2.236
        1.8    3.4                                 3.130    2.236
        3.8  – 0.6                                 3.130  – 2.236

[Plot: the five objects (1–5) positioned in the ordination space defined by Var 1 and Var 2.]
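The same ordination can be checked with a short NumPy sketch (np.linalg.eigh returns the eigenvalues in ascending order and the eigenvector signs are arbitrary, so columns of F may be flipped):

import numpy as np

Y = np.array([[2, 1], [3, 4], [5, 0], [7, 6], [9, 2]], dtype=float)
Yc = Y - Y.mean(axis=0)
S = Yc.T @ Yc / (len(Y) - 1)     # [[8.2, 1.6], [1.6, 5.8]]
lam, U = np.linalg.eigh(S)       # eigenvalues 5 and 9, in ascending order
F = Yc @ U[:, ::-1]              # object scores, largest eigenvalue first
print(lam[::-1], F)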

Figure 9.2  Numerical example of principal component analysis. (a) Five objects are plotted with respect to descriptors y1 and y2. (b) After centring the data, the objects are plotted with respect to (y1 – ȳ1) and (y2 – ȳ2), represented by dashed axes. (c) The objects are plotted with reference to principal axes I and II, which are centred with respect to the scatter of points. (d) The two systems of axes (b and c) can be superimposed after a rotation of 26°34′.
Fig. 9.3  Numerical example from Fig. 9.2. Distance and correlation biplots are discussed in Subsection 9.1.4. (a) Distance biplot. The eigenvectors are scaled to lengths 1. Inset: descriptors (matrix U). Main graph: descriptors (matrix U; arrows) and objects (matrix F; dots). The interpretation of the object–descriptor relationships is not based on their proximity, but on orthogonal projections (dashed lines) of the objects on the descriptor axes or their extensions. (b) Correlation biplot. Descriptors (matrix UΛ^(1/2); arrows) with a covariance angle of 76°35′. Objects (matrix G; dots). Projecting the objects orthogonally on a descriptor (dashed lines) reconstructs the values of the objects along that descriptor, to within a multiplicative constant.

Use the following matrices to draw biplots:

  Distance biplot (scaling 1):     objects = F,                variables = U
  Correlation biplot (scaling 2):  objects = G = F Λ^(–1/2),   variables = Usc2 = U Λ^(1/2)

These two projections respect the biplot rule that the product of the two projected matrices reconstructs the data Y:

  Distance biplot:  F U' = Y        Correlation biplot:  G (U Λ^(1/2))' = Y


Data transformation

Transform physical variables (Ecology) or characters (Taxonomy)

• Univariate distributions are not symmetrical


⇒ Apply skewness-reduction transformation

• Variables are not in the same physical units

  ⇒ Apply standardization  zi = (yi – ȳ) / sy ,  or ranging  y'i = (yi – ymin) / (ymax – ymin)

• Multistate qualitative variables


⇒ In some cases, transform them to dummy (binary) variables

Transform community composition data (Ecology)


(species presence-absence or abundance)

• Reduce asymmetry of distributions


⇒ Apply log(y + c) transformation

• Make community composition data suitable for Euclidean-based


ordination methods (PCA, RDA)
⇒ Use the chord, chi-square, or Hellinger transformations (Legendre
& Gallagher 2001)
Some uses of principal component analysis (PCA)

• Two-dimensional ordination of the objects:


- Sampling sites in ecology
- Individuals or taxa in taxonomy
⇒ A 2-dimensional ordination diagram is an interesting graphical
support for representing other properties of multivariate data, e.g.,
clusters.

• Detect outliers or erroneous data in data tables

• Find groups of variables that behave in the same way:


- Species in ecology
- Morphological/behavioural/molecular variables in taxonomy

• Simplify (collinear) data; remove noise

• Remove an identifiable component of variation


e.g., size factor in log-transformed morphological data
Algebra of Correspondence Analysis

Frequency data table Y = [fij], with row sums fi+, column sums f+j, and grand total f++ :

              10  10  20 |  40
  Y = fij  =  10  15  10 |  35        pij = fij / f++ ,   pi+ = fi+ / f++ ,   p+j = f+j / f++
              15   5   5 |  25
             -------------
  f+j  =      35  30  35 | 100 = f++

Matrix Q:

  qij = (pij – pi+ p+j) / sqrt(pi+ p+j)  =  (Oij – Eij) / ( sqrt(Eij) · sqrt(f++) )

         – 0.10690  – 0.05774    0.16036
  Q   =  – 0.06429    0.13887  – 0.06429
           0.21129  – 0.09129  – 0.12667

Cross-product matrix:

            0.06020  – 0.02204  – 0.03980
  Q'Q  =  – 0.02204    0.03095  – 0.00661
          – 0.03980  – 0.00661    0.04592

Compute the eigenvalues and eigenvectors of Q'Q:  (Q'Q – λk I) uk = 0

  Eigenvalues:  λ1 = 0.096,  λ2 = 0.041      Matrix of eigenvalues:  Λ = [ 0.096  0     ]
                                                                         [ 0      0.041 ]

There are never more than k = min(r – 1, c – 1) eigenvalues > 0 in CA.

                                                       0.78016    0.20336
  Matrix of eigenvectors of Q'Q (c × k):  U        = – 0.20383  – 0.81145
                                                     – 0.59144    0.54790

                                                              – 0.53693    0.55831
  Matrix of eigenvectors of QQ' (r × k):  Û = Q U Λ^(–1/2)  = – 0.13043  – 0.79561
                                                                0.83349    0.23516

Compute matrices F and V for the scaling 1 biplot, and V̂ and F̂ for the scaling 2 biplot.
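The CA eigenvalues above can be reproduced from the Q matrix with a short NumPy sketch (the eigenvalues of Q'Q are the squared singular values of Q):

import numpy as np

freq = np.array([[10, 10, 20], [10, 15, 10], [15, 5, 5]], dtype=float)
P = freq / freq.sum()                               # pij
pi, pj = P.sum(axis=1), P.sum(axis=0)               # pi+ and p+j
Q = (P - np.outer(pi, pj)) / np.sqrt(np.outer(pi, pj))
U_, s, Vt = np.linalg.svd(Q)
print(s[:2] ** 2)                                   # approx [0.096, 0.041]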

[CA biplots for the example: scaling type 1 (left) and scaling type 2 (right), showing Sites 1–3 (dots) and Species Sp.1–Sp.3 plotted along CA axis 1.]

Calculation details
Compute the matrices V, V̂, F, and F̂ used in the ordination biplots:

  V (c × k)  =  D(p+j)^(–1/2) U      where  p+j = f+j / f++
  V̂ (r × k)  =  D(pi+)^(–1/2) Û      where  pi+ = fi+ / f++
  F (r × k)  =  V̂ Λ^(1/2)
  F̂ (c × k)  =  V Λ^(1/2)
Biplot, scaling type 1: plot F for sites, V for species:
• This projection preserves the chi-square distance among the sites.
• The sites are at the centroids (barycentres) of the species.

Biplot, scaling type 2: plot V̂ for sites, F̂ for species:


• This projection preserves the chi-square distance among the species.
• The species are at the centroids (barycentres) of the sites.
Linear Discriminant Analysis (LDA) – slide outline

• Outline: Introduction – Linear Discriminant Analysis – Examples
• Dimensionality reduction: problems caused by the high dimension of the feature vectors (data sparsity, undertrained classifiers); goal: reduce the dimension of the feature vectors with as little loss of information as possible
• Linear Discriminant Analysis (also called Fisher's linear discriminant): uses the training statistics to find the linear combination of features that best separates two or more classes of objects; the resulting combination may be used as a linear classifier or for dimensionality reduction before later classification
• PCA vs. LDA: PCA tries to find the strongest correlation (largest variance) in the data set; LDA tries to optimise class separability
• Different approaches to LDA: class-dependent and class-independent transformations, both based on maximising the ratio of between-class variance to within-class variance
• Numerical example for a two-class problem, step by step: (1) compute the mean of each class and of the entire data set; (2) compute the between-class and within-class scatter matrices; (3) compute the eigenvectors of the optimising criterion; (4) compute the transformation matrices; (5) compute the Euclidean distance of each test vector to each transformed class mean; (6) classify each test vector to the class with the smallest distance
• Extension to multiple classes (between-class and within-class scatter matrices for several classes); questions
Introduction

Linear Discriminant Analysis (LDA) is most commonly used as a dimensionality reduction technique in the pre-processing step for pattern-classification and machine learning applications. The goal is to project a dataset onto a lower-dimensional space with good class-separability in order to avoid overfitting (“curse of dimensionality”) and also to reduce computational costs.

Ronald A. Fisher formulated the Linear Discriminant in 1936 (The Use of Multiple Measurements in Taxonomic Problems), and it also has some practical uses as a classifier. The original linear discriminant was described for a 2-class problem; it was later generalized as “multi-class Linear Discriminant Analysis” or “Multiple Discriminant Analysis” by C. R. Rao in 1948 (The utilization of multiple measurements in problems of biological classification).

The general LDA approach is very similar to a Principal Component Analysis (for more information about the PCA, see the previous article Implementing a Principal Component Analysis (PCA) in Python step by step), but in addition to finding the component axes that maximize the variance of our data (PCA), we are also interested in the axes that maximize the separation between multiple classes (LDA).

So, in a nutshell, the goal of an LDA is often to project a feature space (a dataset of n-dimensional samples) onto a smaller subspace k (where k ≤ n−1) while maintaining the class-discriminatory information.


In general, dimensionality reduction not only helps reduce computational costs for a given classification task, but it can also help to avoid overfitting by minimizing the error in parameter estimation (“curse of dimensionality”).
Principal Component Analysis vs. Linear Discriminant Analysis

Both Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) are linear transformation techniques that are commonly used for dimensionality reduction. PCA can be described as an “unsupervised” algorithm, since it “ignores” class labels and its goal is to find the directions (the so-called principal components) that maximize the variance in a dataset. In contrast to PCA, LDA is “supervised” and computes the directions (“linear discriminants”) that represent the axes that maximize the separation between multiple classes.

Although it might sound intuitive that LDA is superior to PCA for a multi-class classification task where the class labels are known, this is not always the case.
For example, comparisons between classification accuracies for image recognition after using PCA or LDA show
that PCA tends to outperform LDA if the number of samples per class is relatively small (PCA vs. LDA, A.M.
Martinez et al., 2001). In practice, it is also not uncommon to use both LDA and PCA in combination: E.g., PCA
for dimensionality reduction followed by an LDA.
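The “PCA followed by LDA” combination mentioned here is straightforward to express with scikit-learn; a minimal sketch (the number of components is illustrative):

from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# PCA for dimensionality reduction, followed by an LDA classifier
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
# clf.fit(X_train, y_train); clf.predict(X_test)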
What is a “good” feature subspace?

Let’s assume that our goal is to reduce the dimensions of a d-dimensional dataset by projecting it onto a k-dimensional subspace (where k < d). So, how do we know what size we should choose for k (k = the number of dimensions of the new feature subspace), and how do we know if we have a feature space that represents our data “well”?

Later, we will compute eigenvectors (the components) from our data set and collect them in so-called scatter matrices (i.e., the between-class scatter matrix and the within-class scatter matrix). Each of these eigenvectors is associated with an eigenvalue, which tells us about the “length” or “magnitude” of the eigenvector.

If we observe that all eigenvalues have a similar magnitude, this may be a good indicator that our data is already projected onto a “good” feature space.

In the other scenario, if some of the eigenvalues are much larger than the others, we might be interested in keeping only the eigenvectors with the largest eigenvalues, since they contain more information about our data distribution. Vice versa, eigenvalues that are close to 0 are less informative, and we might consider dropping them when constructing the new feature subspace.
Summarizing the LDA approach in 5 steps

Listed below are the 5 general steps for performing a linear discriminant analysis; we will explore them in more
detail in the following sections.

1. Compute the d-dimensional mean vectors for the different classes from the dataset.
2. Compute the scatter matrices (between-class and within-class scatter matrix).
3. Compute the eigenvectors (e1, e2, ..., ed) and corresponding eigenvalues (λ1, λ2, ..., λd) for the scatter matrices.
4. Sort the eigenvectors by decreasing eigenvalue and choose the k eigenvectors with the largest eigenvalues to form a d×k-dimensional matrix W (where every column represents an eigenvector).
5. Use this d×k eigenvector matrix to transform the samples onto the new subspace. This can be summarized by the matrix multiplication Y = X × W (where X is an n×d-dimensional matrix representing the n samples, and Y contains the transformed n×k-dimensional samples in the new subspace).
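A compact NumPy sketch of these five steps on toy data (the two-class data, the variable names, and the choice k = 1 are illustrative, not from the original article):

import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)),      # toy class 0
               rng.normal(2, 1, (50, 3))])     # toy class 1
y = np.array([0] * 50 + [1] * 50)

d = X.shape[1]
overall_mean = X.mean(axis=0)
S_W = np.zeros((d, d))                          # step 2: within-class scatter
S_B = np.zeros((d, d))                          # step 2: between-class scatter
for c in np.unique(y):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)                        # step 1: class mean vector
    S_W += (Xc - mc).T @ (Xc - mc)
    diff = (mc - overall_mean).reshape(-1, 1)
    S_B += len(Xc) * diff @ diff.T

evals, evecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)   # step 3
order = np.argsort(evals.real)[::-1]                     # step 4: sort by eigenvalue
W = evecs.real[:, order[:1]]                             # keep k = 1 discriminant (at most C - 1)
Y_new = X @ W                                            # step 5: project the samples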
The remaining lecture-slide decks in this unit cover the following topics:

Feature reduction (PCA and LDA)
• What is feature reduction: a linear transformation of the original high-dimensional data to a lower-dimensional space; feature reduction (all original features are combined) versus feature selection (only a subset of the original features is kept)
• Why feature reduction: most machine learning and data-mining techniques are not effective for high-dimensional data (curse of dimensionality); benefits for visualization, data compression and noise removal
• Applications: face recognition, handwritten digit recognition, text mining, image retrieval, microarray data analysis, protein classification
• Principal Component Analysis: geometric picture, algebraic definition and derivation of the principal components, PCA for image compression
• Linear Discriminant Analysis: between-class and within-class scatter, the discriminant criterion as a generalised eigenvalue problem, graphical view of classification, applications, issues (singular within-class scatter when the sample size is small relative to the dimension), summary

Instance-based learning and nearest-neighbour classification
• Lazy learning (store the training instances and delay processing until a new instance must be classified) versus eager learning
• Distance functions (Euclidean distance); the k-nearest-neighbour algorithm for discrete- and real-valued targets; distance-weighted k-NN
• k-Nearest-neighbour classifiers: choosing k, 1-NN and the Voronoi diagram, attribute scaling, the curse of dimensionality, kd-trees for nearest-neighbour search; advantages and disadvantages

Support vector machines and logistic regression (overview)
• Find the separating hyperplane that maximizes the margin; the resulting constrained optimisation problem; slack variables for the non-separable case; transforming the data into a higher-dimensional space for non-linear boundaries
• Classification via regression: predict the probability of the class given the record; the logistic function, probability estimates and feature weights

Bayesian (naïve Bayes) classifiers
• Bayes theorem with a worked example; treating the attributes and the class label as random variables and choosing the class that maximises the posterior P(C | A1, …, An)
• The naïve conditional-independence assumption; estimating probabilities from data for discrete and continuous attributes (normal-distribution estimates of mean and variance); worked examples on small training tables; corrections for zero conditional probabilities; robustness to noise and irrelevant attributes; generative versus discriminative models

Support vector machines (detailed lecture)
• Linear SVMs for the separable two-class case; margin maximisation as a quadratic programming problem; support vectors and the sparsity of the solution
• Soft margins with slack variables and the trade-off parameter C; extension to non-linear decision boundaries via a feature-space mapping; the kernel trick and example kernel functions (polynomial, radial basis, sigmoid); a worked one-dimensional example; multi-class strategies; software, usage steps, strengths and weaknesses; ε-support-vector regression; conclusion and resources

Clustering and K-means
• Clustering: partitioning a data set into subsets whose members share some common trait, usually proximity under a chosen distance measure; hierarchical versus partitional clustering; common distance measures (Euclidean, Manhattan, maximum norm, Mahalanobis, inner product, Hamming)
• The K-means algorithm: choose the number of clusters k and initial centroids, assign each point to the nearest centroid, recompute the centroids, and iterate until no point changes cluster; worked numerical example; pseudocode; weaknesses (sensitivity to the initial centroids and to the choice of k); relative efficiency; applications and conclusion