Adaptive Resonance Theory
Adaptive Resonance Theory (ART) was introduced as a theory of human cognitive information
processing. The theory has inspired neural models for pattern recognition and unsupervised
learning, and ART systems have been used to explain various types of cognitive and brain data.
Applications of ART:
ART neural networks, which support fast, stable learning and prediction, have been applied in
many areas, including target recognition, face recognition, medical diagnosis, signature
verification, and mobile robot control.
In 1958, "The Computer and the Brain" was published, a year after John von Neumann's death. In
that book, von Neumann proposed numerous radical changes to the way researchers had been
modeling the brain.
An Artificial Neural Network (ANN) is inspired by the way the biological nervous system, such as
the human brain, works. The most powerful attribute of the human brain is its ability to adapt,
and ANNs aim to acquire a similar characteristic. To build such networks, we first need to
understand how the brain achieves this adaptation.
A learning rule, or learning process, is a procedure or mathematical logic that enables a neural
network to learn from its existing conditions and improve its performance. Applying a learning
rule to a network updates its weights and bias levels as the network is trained in a particular
data environment.
The Hebbian rule was the first learning rule. In 1949, Donald Hebb proposed this learning
algorithm for unsupervised neural networks. The rule describes how to update the weights between
the nodes of a network: when two connected neurons are active at the same time, the weight
between them is strengthened.
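The Hebbian update described above can be sketched in a few lines of NumPy. This is a minimal
illustration, not a full training loop; the learning rate eta and the single-neuron setup are
illustrative assumptions.

```python
import numpy as np

def hebbian_update(weights, x, y, eta=0.1):
    # Hebbian rule: delta_w = eta * y * x. Each weight grows in
    # proportion to the product of its pre-synaptic input x and
    # post-synaptic output y, so co-active units get linked.
    return weights + eta * np.outer(y, x)

x = np.array([1.0, 0.0, 1.0])   # input pattern
w = np.zeros((1, 3))            # one output neuron, three inputs
y = np.array([1.0])             # output activity
w = hebbian_update(w, x, y)
print(w)                        # weights rise only where input and output are both active
```

Note that the update depends only on the observed activities, never on a target value, which is
why Hebbian learning is unsupervised.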
The delta rule in an artificial neural network is a gradient-based learning rule that refines the
network by adjusting the connection weights between inputs and outputs in proportion to the
output error; backpropagation generalizes this idea to networks with multiple layers of
artificial neurons. The delta rule is also called the delta learning rule.
The delta rule was introduced by Widrow and Hoff and is one of the most significant learning
rules based on supervised learning.
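The Widrow-Hoff update can be sketched as follows for a single linear neuron. The learning rate
and target value here are illustrative assumptions; the essential point is that the weight change
is proportional to the error between target and output.

```python
import numpy as np

def delta_update(weights, x, target, eta=0.1):
    # Widrow-Hoff delta rule: delta_w = eta * (target - output) * x.
    # The weight change is driven by the output error, which is what
    # makes this a supervised rule.
    y = weights @ x              # linear output of the neuron
    error = target - y
    return weights + eta * error * x

# Train a single linear neuron to produce 1.0 for the input [1, 1].
w = np.zeros(2)
for _ in range(100):
    w = delta_update(w, np.array([1.0, 1.0]), 1.0)
print(w @ np.array([1.0, 1.0]))  # close to 1.0
```

Each step shrinks the error by a constant factor, so the output converges geometrically toward
the target.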
Correlation Learning Rule:
The correlation learning rule is based on the same principle as the Hebbian learning rule: the
weights between neurons with correlated responses should become more positive, and the weights
between neurons with opposite responses should become more negative. Unlike the Hebbian rule,
however, the correlation rule is a supervised learning rule, because it relies on a desired
target response rather than the actual output.
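The difference from the Hebbian rule can be made concrete: the update has the same outer-product
form, but a desired target response takes the place of the actual output. The learning rate and
example patterns below are illustrative assumptions.

```python
import numpy as np

def correlation_update(weights, x, target, eta=0.1):
    # Correlation rule: delta_w = eta * target * x. Identical in form
    # to the Hebbian update, except the supervisor's target response
    # replaces the neuron's actual output.
    return weights + eta * np.outer(target, x)

w = np.zeros((2, 3))
x = np.array([1.0, -1.0, 1.0])
t = np.array([1.0, -1.0])       # desired responses of two output neurons
w = correlation_update(w, x, t)
print(w)
```

Weights to the neuron whose target agrees with the input grow positive, while weights to the
neuron with the opposite target grow negative, matching the sign behavior described above.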
Techniques and algorithms used in unsupervised ANNs include self-organizing maps, restricted
Boltzmann machines, and autoencoders.
An associative memory network is a content-addressable memory structure that establishes a
relationship between a set of input patterns and a set of output patterns. A content-addressable
memory is a kind of memory structure that recalls stored data based on the degree of similarity
between the input pattern and the patterns stored in the memory.
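Content-addressable recall can be sketched with a Hopfield-style auto-associative memory, which
is one common way (an assumption here, not the only one) to realize the idea: a pattern is stored
in an outer-product weight matrix, and a corrupted cue retrieves the closest stored pattern.

```python
import numpy as np

# Store one bipolar (+1/-1) pattern in an auto-associative memory.
pattern = np.array([1, -1, 1, -1, 1])

# Outer-product (Hebbian) storage rule; no self-connections.
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)

# Present a corrupted cue with one bit flipped.
cue = pattern.copy()
cue[0] = -cue[0]

# One synchronous update step recalls the stored pattern by similarity.
recalled = np.sign(W @ cue)
print(np.array_equal(recalled, pattern))  # True: the stored pattern is recovered
```

This illustrates the defining property of content-addressable memory: retrieval is driven by the
similarity of the cue to the stored pattern, not by an explicit address.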
Boltzmann Machines