Artificial Neural Networks

Applied Problems:
Image, sound, and pattern recognition
Decision making
Knowledge discovery
Context-dependent analysis

Artificial intellect: who is stronger, and why?

NEUROINFORMATICS - the modern theory of principles and new mathematical models of information processing that are based on the biological prototypes and mechanisms of human brain activity.

Introduction to Neural Networks
2
Massive parallelism
The brain, as an information or signal processing system, is composed of a large number of simple processing elements called neurons. These neurons are interconnected by numerous direct links, called connections, and cooperate with each other to perform parallel distributed processing (PDP) in order to solve the desired computational tasks.
Connectionism
The brain is a system of highly interconnected neurons, such that the state of one neuron affects the potential of the large number of other neurons to which it is connected according to the weights, or strengths, of the connections. The key idea of this principle is that the functional capacity of biological neural nets is determined mostly not by a single neuron but by its connections.
Associative distributed memory
Storage of information in the brain is supposed to be concentrated in the synaptic connections of the brain's neural network, or, more precisely, in the pattern of these connections and in the strengths (weights) of the synaptic connections.

How does our brain manipulate patterns? A process of pattern recognition and pattern manipulation is based on the principles above.

Principles of Brain Processing
3
The human brain contains a massively interconnected net of 10^10-10^11 (10-100 billion) neurons (cortical cells).

Biological Neuron - the simple arithmetic computing element

Brain Computer: What is it?
4
[Figure: the schematic model of a biological neuron - soma, dendrites, axon, and synapses, with dendrites and axons arriving from other neurons.]
1. The soma, or cell body, is a large, round central body in which almost all the logical functions of the neuron are realized.
2. The axon (output) is a nerve fibre attached to the soma which serves as the final output channel of the neuron. An axon is usually highly branched.
3. The dendrites (inputs) represent a highly branching tree of fibres. These long, irregularly shaped nerve fibres (processes) are attached to the soma.
4. Synapses are specialized contacts on a neuron which are the termination points for the axons from other neurons.
Biological Neurons
5
Brain-Like Computer
A brain-like computer is a mathematical model of human-brain principles of computation. This computer consists of elements which can be called biological neuron prototypes, which are interconnected by direct links called connections, and which cooperate to perform parallel distributed processing (PDP) in order to solve a desired computational task.
Neurons and Neural Net
The new paradigm of computing mathematics consists of the combination of such artificial neurons into an artificial neural net.

Artificial Neural Network: the Mathematical Paradigm of a Brain-Like Computer

Brain-like Computer
6
NN as a model of a brain-like computer

f(x_1, ..., x_n) = P(w_0 + w_1 x_1 + ... + w_n x_n)

- a partially defined function, which is an approximation of the decision rule function.
12

1. Quantization of the pattern space into p decision classes: the network implements a mapping F of the n-dimensional pattern space into p classes.

Input patterns: vectors x^(1) = (x_1^(1), x_2^(1), ..., x_n^(1)), ..., x^(i) = (x_1^(i), ..., x_n^(i)).
Responses: vectors y^(1) = (y_1^(1), y_2^(1), ..., y_n^(1)), ..., y^(i) = (y_1^(i), ..., y_n^(i)).

2. Mathematical model of quantization: Learning by Examples.

Mathematical Interpretation of Classification in Decision Making
13
Data Acquisition → Data Analysis → Interpretation and Decision Making

Signals & parameters → Characteristics & estimations → Rules & knowledge productions

Data Acquisition, Data Analysis, and Decision Making all draw on a common Knowledge Base: adaptive machine learning via neural network.

Intelligent Data Analysis in Engineering Experiment
14
Self-organization as the basic principle of learning: structure reconstruction.

Input images are fed to the neuroprocessor, which produces a response; a teacher supplies the learning rule. The learning involves a change of structure.

Learning via Self-Organization Principle
15
Symbol manipulation or pattern recognition: which way of imagination is best for you?

Dove flies
Lion goes
Tortoise crawls
Donkey sits
Shark swims

Ill-Formalizable Tasks:
Sound and pattern recognition
Decision making
Knowledge discovery
Context-dependent analysis

What is the difference between the human brain and a traditional computer in their specific approaches to the solution of ill-formalizable tasks (those tasks that cannot be formalized directly)?

Symbol Manipulation or Pattern Recognition?
16
Artificial Neuron

A signal x_i at the i-th input is multiplied (weighted) by the weight w_i. The weighted inputs and the bias w_0 are summed,

z = w_0 + w_1 x_1 + ... + w_n x_n,

and the neuron outputs F(z) = f(x_1, ..., x_n).

17
A Neuron

f(x_1, ..., x_n) = F(w_0 + w_1 x_1 + ... + w_n x_n)

f is the function to be learned;
x_1, ..., x_n are the inputs;
F is the activation function;
z = w_0 + w_1 x_1 + ... + w_n x_n is the weighted sum.

18
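The weighted sum and activation described above can be sketched in a few lines of Python; the names `neuron` and `sign_activation` are illustrative, not from the source:

```python
def sign_activation(z):
    """Hard threshold: 1 if z >= 0, otherwise -1."""
    return 1 if z >= 0 else -1

def neuron(weights, inputs, activation=sign_activation):
    """Compute F(w0 + w1*x1 + ... + wn*xn).

    weights = [w0, w1, ..., wn]; inputs = [x1, ..., xn].
    """
    z = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return activation(z)

# A neuron with w0 = -1, w1 = w2 = 1 fires only when both inputs are 1
print(neuron([-1, 1, 1], [1, 1]))    # 1
print(neuron([-1, 1, 1], [1, -1]))   # -1
```

Swapping in a different `activation` function changes the neuron's response without touching the weighted sum.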
A Neuron

Artificial Neuron: Classical Activation Functions

Hard threshold: F(z) = sign(z) = 1 if z >= 0, and -1 if z < 0.

Sigmoid: F(z) = 2/(1 + e^(-z)) - 1.

20
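A minimal sketch of the two classical activation functions; the smooth one is taken here to be the bipolar sigmoid F(z) = 2/(1 + e^(-z)) - 1, which ranges over (-1, 1) and so matches the {1, -1} alphabet used on the following slides:

```python
import math

def sign_activation(z):
    """Hard threshold: sign(z) = 1 if z >= 0, -1 if z < 0."""
    return 1 if z >= 0 else -1

def bipolar_sigmoid(z):
    """Smooth activation over (-1, 1): F(z) = 2/(1 + e^(-z)) - 1."""
    return 2.0 / (1.0 + math.exp(-z)) - 1.0

print(sign_activation(0.3), bipolar_sigmoid(0.0))   # 1 0.0
```

The sigmoid approaches the hard threshold as its argument is scaled up, which is why both are drawn with the same asymptotes 1 and -1.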
Connectionism
A NN is a highly interconnected structure, in such a way that the state of one neuron affects the potential of a large number of other neurons to which it is connected according to the weights of the connections.
Not Programming but Training
A NN is trained rather than programmed to perform a given task, since it is difficult to separate the hardware and software in such a structure. We program not the solution of tasks, but the ability to learn to solve the tasks.
Distributed Memory
A NN presents a distributed memory whose content is the matrix of connection weights

W = (w_ij), i, j = 1, ..., n,

so that the changing/adaptation of synapses can take place everywhere in the structure of the network.
Principles of
Neurocomputing
21
Learning and Adaptation
NNs are capable of adapting themselves (the synaptic connections between units) to special environmental conditions by changing their structure or the strengths of their connections.
Non-Linear Functionality
Every new state of a neuron is a nonlinear function of the input pattern created by the nonlinear firing activity of the other neurons.
Robustness of Associativity
NN states are characterized by high robustness, or insensitivity, to noisy and fuzzy input data, owing to the use of a highly redundant distributed structure.
Threshold Neuron (Perceptron)
The output of a threshold neuron is binary, while its inputs may be either binary or continuous.
If the inputs are binary, a threshold neuron implements a Boolean function.
The Boolean alphabet {1, -1} is usually used in neural network theory instead of {0, 1}. The correspondence with the classical Boolean alphabet {0, 1} is established by

y = 1 - 2x;  x in {0, 1}, y in {1, -1};  x = 0 -> y = 1, x = 1 -> y = -1.

23
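The alphabet correspondence and a binary-input threshold neuron can be sketched directly; the helper names below are illustrative, and the weight triple is one arbitrary choice that realizes a Boolean function:

```python
def to_bipolar(x):
    """Map the classical alphabet {0, 1} to {1, -1}: y = 1 - 2x."""
    return 1 - 2 * x

def threshold_neuron(w0, w1, w2, x1, x2):
    """Binary output of a two-input threshold neuron over bipolar inputs."""
    z = w0 + w1 * x1 + w2 * x2
    return 1 if z >= 0 else -1

# (w0, w1, w2) = (-1, 1, 1) outputs 1 only when both bipolar inputs are 1
for x1 in (1, -1):
    for x2 in (1, -1):
        print(x1, x2, threshold_neuron(-1, 1, 1, x1, x2))
```

With binary inputs, each choice of weights picks out one Boolean function; only the linearly separable (threshold) functions are reachable this way.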
Threshold Boolean Functions

An example of a threshold (linearly separable) Boolean function:

x1   x2    f
 1    1    1
 1   -1   -1
-1    1   -1
-1   -1   -1

XOR is an example of a non-threshold (not linearly separable) Boolean function: it is impossible to separate the 1s from the -1s by any single line:

x1   x2    f
 1    1    1
 1   -1   -1
-1    1   -1
-1   -1    1

(The four input points are the corners (-1, 1), (1, 1), (-1, -1), (1, -1) of the square.)

25
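The contrast between the two tables can be checked by brute force: over a fine grid of weight triples, many realize the threshold function, but none realizes XOR. This is only a numerical illustration (the grid range and step are arbitrary), not a proof of non-separability:

```python
def realizes(w0, w1, w2, table):
    """True if sign(w0 + w1*x1 + w2*x2) matches every row of the table."""
    for x1, x2, f in table:
        z = w0 + w1 * x1 + w2 * x2
        if (1 if z >= 0 else -1) != f:
            return False
    return True

XOR = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
THRESHOLD_FN = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, -1)]

grid = [i / 4 for i in range(-20, 21)]   # weights from -5 to 5, step 0.25
xor_hits = sum(realizes(w0, w1, w2, XOR)
               for w0 in grid for w1 in grid for w2 in grid)
thr_hits = sum(realizes(w0, w1, w2, THRESHOLD_FN)
               for w0 in grid for w1 in grid for w2 in grid)
print(xor_hits, thr_hits)   # no weights realize XOR; many realize the other
```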
Threshold Neuron: Learning
A main property of a neuron, and of a neural network, is the ability to learn from the environment and to improve performance through learning.
A neuron (a neural network) learns about its environment through an iterative process of adjustments applied to its synaptic weights.
Ideally, a network (a single neuron) becomes more knowledgeable about its environment after each iteration of the learning process.
26
Threshold Neuron: Learning
31
Learning Algorithm
The learning algorithm consists of sequentially checking, for all vectors from the learning set, whether their membership is recognized correctly. If so, no action is required. If not, a learning rule must be applied to adjust the weights.
This iterative process continues either until the membership of every vector from the learning set is recognized correctly, or until membership remains unrecognized for only some acceptably small number of vectors (samples from the learning set).
32
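For a single threshold neuron, the iterative procedure above is essentially the classical perceptron learning rule: a misclassified sample moves the weights toward the desired output. A minimal sketch (the learning rate, epoch limit, and sample set below are illustrative choices):

```python
def train_threshold_neuron(samples, epochs=100, lr=1.0):
    """Perceptron learning over bipolar samples [(x1, ..., xn, desired), ...]."""
    n = len(samples[0]) - 1
    w = [0.0] * (n + 1)                    # w[0] is the bias weight w0
    for _ in range(epochs):
        errors = 0
        for row in samples:
            xs, d = row[:-1], row[-1]
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], xs))
            y = 1 if z >= 0 else -1
            if y != d:                     # membership not recognized: adjust
                w[0] += lr * d
                for i, xi in enumerate(xs):
                    w[i + 1] += lr * d * xi
                errors += 1
        if errors == 0:                    # all samples recognized correctly
            break
    return w

# Learn a linearly separable function of two bipolar inputs
w = train_threshold_neuron([(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, -1)])
```

For linearly separable learning sets the rule is guaranteed to stop with zero errors (the perceptron convergence theorem); for non-separable sets it runs until the epoch limit, matching the "acceptably small number of errors" stopping criterion above.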
When we need a network
37
Is it possible to learn XOR, parity-n, and other non-linearly separable functions using a single neuron?
Any classical monograph/textbook on neural networks claims that to learn the XOR function, a network of at least three neurons is needed.
This is true for real-valued neurons and real-valued neural networks.
However, it is not true for complex-valued neurons!
A jump to the complex domain is the right way to overcome the Minsky-Papert limitation and to learn multiple-valued and Boolean nonlinearly separable functions using a single neuron.
38
XOR problem

For n = 2 inputs and m = 4 sectors of the complex plane, a single complex-valued neuron with the weighting vector W = (w_0, w_1, w_2) = (0, 1, i) solves XOR. The weighted sum is z = w_0 + w_1 x_1 + w_2 x_2, and the output P_B(z) alternates between 1 and -1 over the four sectors bounded by the rays through 1, i, -1, -i:

x1   x2    z     P_B(z)   f(x1, x2)
 1    1   1+i      1          1
 1   -1   1-i     -1         -1
-1    1  -1+i     -1         -1
-1   -1  -1-i      1          1

39
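The table can be verified directly. The sketch below assumes the sector activation P_B(z) = (-1)^j, where j indexes the quarter-plane sector containing arg(z); with that assumption, the single neuron with W = (0, 1, i) reproduces the bipolar XOR, which equals the product x1 * x2:

```python
import cmath
import math

def P_B(z):
    """Sector activation for m = 4: (-1)**j, where j is the index of the
    quarter-plane sector [0, pi/2), [pi/2, pi), ... containing arg(z)."""
    angle = cmath.phase(z) % (2 * math.pi)
    j = int(angle // (math.pi / 2))
    return (-1) ** j

w0, w1, w2 = 0, 1, 1j                  # the weighting vector W = (0, 1, i)
for x1 in (1, -1):
    for x2 in (1, -1):
        z = w0 + w1 * x1 + w2 * x2
        print(x1, x2, z, P_B(z))       # P_B(z) equals the bipolar XOR x1 * x2
```

The key point is that the four weighted sums land in alternating sectors, so a single (complex) linear form plus a periodic activation separates a function that no real line can.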
Blurred Image Restoration
(Deblurring) and Blur
Identification by MLMVN
40
[Figure: examples of blurred images - Microscopy, Tomography, Photo.]
42
Image deblurring: problem
statement
Mathematically, blur is caused by the convolution of an image with a distorting kernel.
Thus, removal of the blur reduces to deconvolution.
Deconvolution is an ill-posed problem, which results in the instability of the solution. The best way to solve it is to use some regularization technique.
To use any kind of regularization technique, it is absolutely necessary to know the distorting kernel corresponding to the particular blur: so it is necessary to identify the blur.
43
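The instability of naive deconvolution, and the stabilizing effect of a regularized (Tikhonov-style) inverse filter, can be illustrated with a 1-D sketch; the signal, kernel, noise level, and regularization parameter below are arbitrary choices for illustration, not from the source. Requires NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0.0, 4.0 * np.pi, 256))     # a smooth 1-D "image"
kernel = np.zeros(256)
kernel[:9] = 1.0 / 9.0                                  # 9-point box blur

H = np.fft.fft(kernel)                                  # kernel in Fourier domain
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * H))  # blur = convolution
observed = blurred + 1e-3 * rng.standard_normal(256)    # blurred + slight noise

# Naive deconvolution divides by near-zero Fourier coefficients of the
# kernel, so the noise is amplified enormously (ill-posedness).
naive = np.real(np.fft.ifft(np.fft.fft(observed) / H))

# A Tikhonov-regularized inverse filter damps those unstable frequencies.
lam = 1e-3
G = np.conj(H) / (np.abs(H) ** 2 + lam)
regularized = np.real(np.fft.ifft(np.fft.fft(observed) * G))

err_naive = np.max(np.abs(naive - signal))
err_regularized = np.max(np.abs(regularized - signal))
print(err_naive, err_regularized)    # the regularized error is much smaller
```

Note that both filters need H, the Fourier transform of the distorting kernel: this is exactly why blur identification is a prerequisite for restoration.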
Blur Identification