SC - Chap5 Fall 2016
[Figure: Block diagram for parameter identification]
• The training data: a set composed of m desired input-output pairs (u_i, y_i), i = 1,…,m;
• System identification performs structure and parameter identification repeatedly until a satisfactory model is found, using the following procedure:
1. Specify and parameterize a class of mathematical models representing the system to be identified;
2. Perform parameter identification to choose the parameters that best fit the training data set;
3. Conduct a validation test to see if the identified model responds correctly to an unseen data set (cross-validation!);
4. Terminate the procedure once the results of the validation test are satisfactory; otherwise, select another class of models and repeat steps 2 to 4.
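The four steps above can be sketched as a small loop. The model classes (proportional and affine), the tolerance, and the data below are illustrative assumptions, not taken from the slides:

```python
# Sketch of the identification loop: pick a model class, fit its
# parameters to the training data, cross-validate, and move to the
# next class if validation fails.

def fit_proportional(data):
    # Class 1: y = a*u, least-squares closed form a = sum(u*y)/sum(u^2)
    a = sum(u * y for u, y in data) / sum(u * u for u, _ in data)
    return lambda u, a=a: a * u

def fit_affine(data):
    # Class 2: y = a*u + b, least squares via the normal equations
    m = len(data)
    su = sum(u for u, _ in data)
    sy = sum(y for _, y in data)
    suu = sum(u * u for u, _ in data)
    suy = sum(u * y for u, y in data)
    a = (m * suy - su * sy) / (m * suu - su * su)
    b = (sy - a * su) / m
    return lambda u, a=a, b=b: a * u + b

def mse(model, data):
    return sum((model(u) - y) ** 2 for u, y in data) / len(data)

def identify(train, valid, classes, tol=1e-6):
    for name, fit in classes:          # step 1: next model class
        model = fit(train)             # step 2: parameter identification
        err = mse(model, valid)        # step 3: cross-validation
        if err < tol:                  # step 4: accept or try another class
            return name, model, err
    return None

# Training/validation pairs from an "unknown" system y = 2u + 1
train = [(u, 2 * u + 1) for u in range(6)]
valid = [(u, 2 * u + 1) for u in (7, 9)]
result = identify(train, valid, [("y=a*u", fit_proportional),
                                 ("y=a*u+b", fit_affine)])
```

The proportional class fails validation on this data, so the loop moves on to the affine class, which passes.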
Ch5 SC Fall 2016 6
NeuroFuzzy Techniques
• A soft computing approach that combines Artificial Neural Networks (NNs) and fuzzy concepts in various ways.
• Has the potential to capture the benefits of both in a single framework.
• The inference system corresponds to a set of fuzzy IF-THEN rules that have learning capability to approximate nonlinear functions. Different combinations include:
• Co-operative: a neural algorithm adapts the fuzzy system. During system operation, the NN and FS work independently of each other; the NN tries to learn the parameters of the FS. This adaptation can be performed offline, or online while the FS is applied.
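As a minimal illustration of such offline adaptation, a gradient rule can tune one parameter of a membership function from sample data. The Gaussian MF, the learning rate, and the sample grades below are assumptions for illustration only:

```python
import math

# Illustrative cooperative scheme: a gradient rule tunes the center c
# of a Gaussian membership function mu(x) = exp(-(x - c)^2 / (2 s^2))
# so that it fits observed membership grades.

def gaussian_mf(x, c, s=1.0):
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def tune_center(samples, c=0.0, s=1.0, lr=0.5, epochs=200):
    # samples: (x, target membership grade) pairs
    for _ in range(epochs):
        for x, t in samples:
            mu = gaussian_mf(x, c, s)
            # dE/dc for the squared error E = (mu - t)^2 / 2
            grad = (mu - t) * mu * (x - c) / (s ** 2)
            c -= lr * grad
    return c

# Target grades generated from a "true" MF centered at 2.0
samples = [(x / 2, gaussian_mf(x / 2, 2.0)) for x in range(-2, 9)]
c_hat = tune_center(samples)   # converges toward 2.0
```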
• Concurrent: the two techniques are applied one after the other, as pre- or post-processing.
• Hybrid: the fuzzy system is represented as a network structure, making it possible to take advantage of learning algorithms inherited from NNs.
• Fuzzy models fall into two categories:
• Linguistic models: based on a collection of If-Then rules and using fuzzy reasoning, such as Mamdani's model:
If x is A and y is B then z is C
where A, B, and C are fuzzy sets on the universes of discourse X, Y, and Z, respectively, and x and y are values of the input variables.
• Sugeno's model: characterized by functional-type conclusions. A first-order Sugeno TISO (two-input, single-output) model has the form:
If x is A and y is B then z = px + qy + r
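A first-order Sugeno rule base can be evaluated in a few lines. The triangular membership functions and the rule parameters (p, q, r) below are illustrative assumptions; the inference (product firing strength, weighted-average output) follows the standard Sugeno scheme:

```python
# Minimal first-order Sugeno (TISO) inference with two rules.

def tri(x, a, b, c):
    # Triangular MF with feet a, c and peak b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sugeno_tiso(x, y, rules):
    num = den = 0.0
    for mf_x, mf_y, (p, q, r) in rules:
        w = mf_x(x) * mf_y(y)       # firing strength (product t-norm)
        z = p * x + q * y + r       # first-order consequent
        num += w * z
        den += w
    return num / den                # weighted average = crisp output

rules = [
    (lambda x: tri(x, -5, 0, 5), lambda y: tri(y, -5, 0, 5), (1.0, 1.0, 0.0)),
    (lambda x: tri(x, 0, 5, 10), lambda y: tri(y, 0, 5, 10), (2.0, 0.5, 1.0)),
]
out = sugeno_tiso(2.0, 4.0, rules)
```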
Artificial Neural Networks
• NN: Information Processing Paradigm inspired by
Biological Nervous Systems (our brain);
• Structure: Large number of highly interconnected processing
elements (neurons) working together;
• Learn complex functional relations from experience / by example (like people), generalizing from a limited amount of training data (I/O data observed on the system).
• Can serve as black-box models of nonlinear, multivariable
static and dynamic systems
• In a biological system: Learning involves adjustments to the
synaptic connections among neurons: Weights for NNs!
• The brain uses massively parallel computation: ~10^11 neurons and ~10^4 connections per neuron, in contrast with common NNs, which consist of layers of simple interconnected neurons and weights assigned to these interconnections.
• Information relevant to net’s I-O mapping is stored in weights
• Used in image recognition and in prediction problems based on past knowledge.
• Strengths of NNs: Power: they model complex functions, with nonlinearity built into the NN; Ease of use: they learn by example, and very little user domain-specific expertise is needed; Naturally attractive: based on the modeling of biology, will they lead to intelligent computers/robots/applications?
• NNs can’t do anything that can’t be done using traditional
computing techniques, BUT they can do some things which
would otherwise be very difficult.
• PROS: adaptability to unknown situations; robustness: fault tolerance (network redundancy); autonomous learning and generalization. They can perform tasks that a linear program cannot, and can be embedded in a wide range of applications.
• CONS: large complexity of the network structure; they need training, and large NNs require high processing time.
Ch5 SC Fall 2016 10
• A neuron fires only if the total signal received at the cell body exceeds a certain (threshold) level. The neuron either fires or it doesn't; there are no different grades of firing.
• The entire brain is composed of interconnected electro-chemical transmitting neurons. Each neuron performs a weighted sum of its inputs, and then fires a binary signal if the total input exceeds a certain level. This is the model on which ANNs are based.
• Each neuron has a cell body (soma), a branching input structure (the dendrites) and a branching output structure (the axon). Axons connect to dendrites via synapses.
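This weighted-sum-then-threshold model can be written directly; the weights and threshold below are illustrative values, not from the slides:

```python
# A single threshold (McCulloch-Pitts style) neuron: weighted sum of
# the inputs, then an all-or-none binary "fire / don't fire" decision.

def neuron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0   # no intermediate grades of firing

# A two-input neuron with weights (1, 1) and threshold 1.5 realizes AND:
def and_gate(a, b):
    return neuron((a, b), (1.0, 1.0), 1.5)
```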
Layer 1: every node stores parameters defining a membership function (MF); the outputs are the membership values of the input part.
Layer 2: every rule is represented by one node. Every node multiplies the incoming signals and outputs their product, which represents the firing strength α_i of a rule.
Layer 3: each node N outputs the normalized firing strength, i.e., one rule's firing strength divided by the sum of all rules' firing strengths. The other nodes are connected to the two inputs and to exactly one node in Layer 1.
Layer 4: the node generates the sum Σ_i O_i of all outputs produced by the previous layer (Layer 3). Overall output: Load = Σ_i O_i = Σ_i α_i (p_i·Time + q_i·Weight + r_i) / Σ_i α_i.
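The four layers above can be sketched as a single forward pass. The Gaussian membership functions, their parameters, and the consequent coefficients (p_i, q_i, r_i) below are illustrative assumptions, not values from the slides:

```python
import math

# Forward pass through the four layers for two inputs (Time, Weight)
# and two rules of the form: if Time is A_i and Weight is B_i
# then z_i = p_i*Time + q_i*Weight + r_i.

def gauss(x, c, s):
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def anfis_forward(time, weight, rules):
    # Layer 1: membership grades; Layer 2: products -> firing strengths alpha_i
    alphas = [gauss(time, ct, st) * gauss(weight, cw, sw)
              for (ct, st, cw, sw, _) in rules]
    total = sum(alphas)
    # Layer 3: normalized firing strengths
    norm = [a / total for a in alphas]
    # Layer 4: weighted consequents, summed to Load = sum_i alpha_i z_i / sum_i alpha_i
    return sum(n * (p * time + q * weight + r)
               for n, (_, _, _, _, (p, q, r)) in zip(norm, rules))

rules = [
    (1.0, 1.0, 1.0, 1.0, (0.5, 0.5, 0.0)),   # rule 1: small Time, small Weight
    (4.0, 1.0, 4.0, 1.0, (1.0, 2.0, 1.0)),   # rule 2: large Time, large Weight
]
load = anfis_forward(2.0, 3.0, rules)
```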