Research Proposal Presentation

The document provides an overview of artificial neural networks (ANNs). It discusses the history and fundamentals of ANNs, including their architecture, comparison to biological neurons, properties, common functions used, perceptrons, convolutional neural networks, applications, and advantages. The key components of ANN architecture are inputs, weights, transfer functions, activation functions, and bias. Common activation functions include sigmoid, tanh, ReLU, and maxout. Convolutional neural networks are used for image analysis through convolutional and pooling layers. ANNs are applied in domains like facial recognition, weather forecasting, and healthcare.


Fundamentals of Image Data Mining

ARTIFICIAL NEURAL
NETWORK
Mithesh 20EG107131
Sai Sampreeth Reddy 20EG107142
Umesh Reddy 20EG107140
Manish 20EG107158
Sai Vivek 20EG107144
Sai Kiran 20EG107160
AGENDA

• History of Artificial Neural Networks
• Fundamentals of ANN
• Neural Network Architecture
• Comparison between biological neuron and artificial neuron
• Properties of ANN
• Functions used in ANN
• Perceptron and convolutional neural networks (CNN)
• Applications and Advantages


History of Artificial Neural Networks
• Late 1800s – Neural networks appear as an analogy to biological systems
• 1960s and 70s – Simple neural networks appear
• They fall out of favor because the perceptron is not effective by itself, and there were no good algorithms for multilayer nets
• 1986 – The backpropagation algorithm appears
FUNDAMENTALS OF ANN
• ANNs are constructed and implemented to model the human brain.
• They perform tasks such as pattern matching, classification, function optimization, approximation, vector quantization, and data clustering.
• These tasks are difficult for traditional computers.
• An ANN possesses a large number of processing elements, called nodes or neurons, which operate in parallel.
• Neurons are connected to one another by connection links.
NEURAL NETWORK ARCHITECTURE
The architecture of a neural network is made up of an input layer, an output layer, and hidden layers. Neural networks themselves, or artificial neural networks (ANNs), are a subset of machine learning designed to mimic the processing power of a human brain.
There are many components to a neural network architecture. Each neural network has a few components in common:
• Input – Data that is fed into the model for learning and training purposes.
• Weight – Weights order the input variables by importance and by how much they contribute to the output.
• Transfer function – Sums all the weighted inputs and combines them into one output value.
• Activation function – Decides whether or not a specific neuron should be activated, based on whether the neuron's input is important to the prediction.
• Bias – Shifts the value given by the activation function.
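The components listed above can be sketched as a single artificial neuron. This is a minimal sketch, assuming a sigmoid activation function; the input values, weights, and bias below are illustrative, not from the slides.

```python
import math

def neuron_output(inputs, weights, bias):
    """One artificial neuron: transfer function, bias, then activation."""
    # Transfer function: sum the weighted inputs into one value
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Activation function (sigmoid): squashes z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative values: three inputs, three weights, one bias
out = neuron_output([0.5, -1.0, 2.0], [0.4, 0.3, 0.1], bias=0.2)
print(round(out, 4))  # sigmoid(0.3) ≈ 0.5744
```

The bias shifts the weighted sum before the activation is applied, which is exactly the "Bias" role described above.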
STRUCTURE OF BIOLOGICAL NEURON
• Has three parts:
– Soma or cell body: where the cell nucleus is located
– Dendrites: nerve fibers connected to the cell body
– Axon: carries impulses away from the neuron
• The end of the axon splits into fine strands
• Each strand terminates in a bulb-like organ called a synapse
• Electric impulses are passed between the synapse and dendrites
Comparison between Biological Neuron and Artificial Neuron

Artificial neural networks are primarily designed to mimic and simulate the functioning of the human brain. Using a mathematical structure, an ANN is constructed to replicate the behavior of biological neurons.
PROPERTIES OF ANN
• Inputs are flexible
– Any real values
– Highly correlated or independent
• The target function may be discrete-valued, real-valued, or a vector of discrete or real values
– Outputs are real numbers between 0 and 1
• Resistant to errors in the training data
• Long training time
• Fast evaluation
• The function produced can be difficult for humans to interpret
Functions used in Neural Network
• An activation function is applied to the weighted sum of a neuron's inputs and bias, and the result is used to decide whether the neuron should be activated or not.
• Linear Activation Functions: A linear function is also known as a straight-line function, where the activation is proportional to the input, i.e. the weighted sum from the neurons. It has the simple equation:
f(x) = ax + c
• Non-Linear Activation Functions :
1. Sigmoid Activation Functions
2. Tanh Activation Functions
3. ReLU Activation Functions
4. Maxout
1. Sigmoid Activation Functions:

• Sigmoid takes a real value as input and outputs a value between 0 and 1. The sigmoid activation function maps inputs ranging over (-∞, ∞) to the range (0, 1).

2. Tanh Activation Functions:

• The tanh function is just another possible function that can be used as a non-linear
activation function between layers of a neural network. It shares a few things in
common with the sigmoid activation function. Unlike a sigmoid function that will map
input values between 0 and 1, the Tanh will map values between -1 and 1.
3. ReLU Activation Functions:

• The formula is deceptively simple: max(0, z). Despite its name, the Rectified Linear Unit is not linear; it provides the same benefits as sigmoid but with better performance in practice.

4. Maxout:

• The maxout activation is a generalization of the ReLU and leaky ReLU functions. It is a piecewise linear function that returns the maximum of several linear functions of its input, designed to be used in conjunction with the dropout regularization technique.
The rectified linear activation function, or ReLU
activation function, is perhaps the most common
function used for hidden layers. It is common because
it is both simple to implement and effective at
overcoming the limitations of other previously popular
activation functions, such as Sigmoid and Tanh.
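The four non-linear activations above can be sketched in plain Python. This is a minimal, scalar-valued sketch; real networks apply these element-wise to vectors, and the maxout weight pairs below are illustrative.

```python
import math

def sigmoid(z):
    # Maps (-inf, inf) into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Maps (-inf, inf) into (-1, 1)
    return math.tanh(z)

def relu(z):
    # max(0, z): zero for negative inputs, identity for positive ones
    return max(0.0, z)

def maxout(z, weight_sets):
    # Generalization of ReLU: the max over several linear functions of z.
    # weight_sets is a list of illustrative (a, c) pairs for f(z) = a*z + c.
    return max(a * z + c for a, c in weight_sets)

# ReLU is the special case of maxout with the two pieces f(z) = 0 and f(z) = z
assert maxout(3.0, [(0, 0), (1, 0)]) == relu(3.0)
```

The final assertion shows concretely why maxout is called a generalization of ReLU.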
PERCEPTRON
• Basic unit in a neural network
• Linear separator
• N inputs, x1 ... xn
• Weights for each input, w1 ... wn
• A bias input x0 (constant) and associated weight w0
• Weighted sum of inputs: y = w0x0 + w1x1 + ... + wnxn
• A threshold (activation) function: output 1 if y > t, -1 if y <= t
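The perceptron described above can be written directly from its definition. A minimal sketch; the AND-gate weights below are an illustrative choice, not from the slides.

```python
def perceptron(inputs, weights, bias_weight, threshold=0.0):
    """Linear separator: weighted sum of inputs plus bias, then a threshold."""
    # The bias input x0 is the constant 1, with its own weight w0
    y = bias_weight * 1.0 + sum(w * x for w, x in zip(weights, inputs))
    # Threshold activation: 1 if y > t, otherwise -1
    return 1 if y > threshold else -1

# Illustrative example: a perceptron computing logical AND on {0, 1} inputs
and_weights, and_bias = [1.0, 1.0], -1.5
print(perceptron([1, 1], and_weights, and_bias))  # y = 2 - 1.5 = 0.5 > 0, fires: 1
print(perceptron([1, 0], and_weights, and_bias))  # y = 1 - 1.5 = -0.5, does not fire: -1
```

AND is linearly separable, so a single perceptron suffices; XOR is not, which is the weakness that pushed the field toward multilayer networks, as noted in the history slide.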
Convolutional Neural Network
A convolutional neural network is a feed-forward neural network that is generally used to analyze visual images by processing data with a grid-like topology. It's also known as a ConvNet. A convolutional neural network is used to detect and classify objects in an image.
In a CNN, every image is represented as an array of pixel values.

Layers in a Convolutional Neural Network

A convolutional neural network has multiple hidden layers that help extract information from an image. The four important layers in a CNN are:
• Convolution layer
• ReLU layer
• Pooling layer
• Fully connected layer
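The first three of these layers can be sketched in plain Python on a tiny grayscale image. A minimal sketch: the image and kernel values are illustrative, and, as in most deep-learning libraries, the "convolution" is implemented as cross-correlation.

```python
def convolve2d(image, kernel):
    """Valid 2D convolution (cross-correlation, as CNN libraries compute it)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def max_pool2x2(fmap):
    """2x2 max pooling with stride 2, halving each spatial dimension."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

# A 5x5 "image" with a vertical edge, and a kernel that responds where the
# right column is brighter than the left (an edge detector, illustrative)
image = [[0, 0, 1, 1, 1]] * 5
kernel = [[-1, 1], [-1, 1]]
feature_map = convolve2d(image, kernel)                           # convolution layer, 4x4
rectified = [[max(0, v) for v in row] for row in feature_map]     # ReLU layer
pooled = max_pool2x2(rectified)                                   # pooling layer, 2x2
```

The feature map peaks exactly along the edge column, and pooling keeps that peak while discarding spatial detail, which is how a CNN builds progressively more abstract features.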
Applications
• Weather Forecasting
• Facial Recognition
• Stock Market Prediction
• Social Media
• Aerospace
• Defence
• Healthcare
• Signature Verification and Handwriting Analysis
ADVANTAGES
• They involve human-like thinking.
• They handle noisy or missing data.
• They can work with a large number of variables or parameters.
• They provide general solutions with good predictive accuracy.
• The system has the property of continuous learning.
• They deal with the non-linearity of the world in which we live.

THANK YOU
