Machine Learning (ANN)
May 1, 2023
Academic City University College, Agbogba Haatso, Ghana.
Artificial Neural Network
NEURAL NETWORK
MAKE GENERALIZATIONS AND INFERENCES CONT'D
WHAT ARE NEURAL NETWORKS USED FOR?
APPLICATIONS OF NEURAL NETWORKS
Speech recognition
Neural networks can analyze human speech despite variations in speech patterns, pitch, tone, language, and accent. Virtual assistants like Amazon Alexa and automatic transcription software rely on this kind of speech recognition.
APPLICATIONS OF NEURAL NETWORKS CONT'D
HOW DO NEURAL NETWORKS WORK?
BUILDING BLOCKS OF A NEURAL NETWORK: LAYERS AND NEURONS
• Layers
• Neurons
BUILDING BLOCKS OF A NEURAL NETWORK
LAYERS
1. Input Layer
2. Hidden Layer
3. Output Layer
INPUT LAYER
HIDDEN LAYER
Hidden layers take their input from the input layer or other hidden layers. Artificial neural networks can have a large number of hidden layers. Each hidden layer analyzes the output from the previous layer, processes it further, and passes it on to the next layer.
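A minimal sketch of this layer-to-layer hand-off; the NumPy implementation, layer sizes, and ReLU activation are illustrative assumptions, not details from the slides:

import numpy as np

def relu(z):
    # rectified linear unit: max(0, z) elementwise
    return np.maximum(0, z)

rng = np.random.default_rng(0)
x = rng.normal(size=4)                            # made-up input with 4 features
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)     # input layer -> hidden layer 1
W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)     # hidden layer 1 -> hidden layer 2

h1 = relu(W1 @ x + b1)    # hidden layer 1 processes the input layer's output
h2 = relu(W2 @ h1 + b2)   # hidden layer 2 processes hidden layer 1's output
print(h2)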
OUTPUT LAYER
The output layer gives the final result of all the data processing done by the artificial neural network.
It can have a single node or multiple nodes. For instance, in a binary (yes/no) classification problem, the output layer has one output node, which gives the result as 1 or 0. In a multi-class classification problem, however, the output layer consists of more than one output node.
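A minimal sketch of both cases; the NumPy code, weights, and class count are illustrative assumptions:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

h = np.array([0.2, -1.3, 0.8])   # made-up output of the last hidden layer

# Binary (yes/no) problem: a single output node, thresholded to 1 or 0.
w, b = np.array([0.5, -0.4, 1.1]), 0.0
print(int(sigmoid(w @ h + b) > 0.5))

# Multi-class problem: one output node per class (here 4 classes).
W = np.random.default_rng(1).normal(size=(4, 3))
print(softmax(W @ h).argmax())   # index of the most probable class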
NEURONS IN A NEURAL NETWORK
NEURON
ARTIFICIAL NEURON
WHAT IS A FIRING OF A NEURON?
ARTIFICIAL NEURON
ARTIFICIAL NEURON (CONNECTIONS)
BIAS (OFFSET)
ARTIFICIAL NEURON
Figure: artificial neuron expression
If the sum of the weighted inputs is greater than the threshold, the neuron fires; otherwise, it does not. Let's simplify this expression and bring the threshold to the left-hand side. This negative threshold is called the bias.
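In symbols (a reconstruction of the expression described above; $x_i$ are the inputs, $w_i$ the weights, $\theta$ the threshold):

$$\text{fire if}\quad \sum_i w_i x_i > \theta \quad\Longleftrightarrow\quad \sum_i w_i x_i + b > 0, \qquad \text{where } b = -\theta.$$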
ARTIFICIAL NEURON
ACTIVATION FUNCTION (TRANSFER FUNCTION)
ACTIVATION FUNCTION
WEIGHTS (PARAMETERS)
FORWARD PROPAGATION
BACK-PROPAGATION
LEARNING RATE
CONVERGENCE
NORMALISATION
FULLY CONNECTED LAYERS
BACK-PROPAGATION
LOSS FUNCTION / COST FUNCTION
MODEL OPTIMIZERS
PERFORMANCE METRICS
FEEDFORWARD NEURAL NETWORKS
BACK PROPAGATION ALGORITHM
BACK PROPAGATION ALGORITHM CONT'D
CONVOLUTIONAL NEURAL NETWORKS
Convolutional Neural Networks (CNN)
CNN INTRODUCTION
CNN CONT'D
CONVNET
INTRODUCTION
CONVNET
KERNEL OR FILTER OR FEATURE DETECTORS
KERNEL
STRIDE
STRIDE CONT'D
PADDING
POOLING CONT'D
FLATTEN
FLATTEN CONT'D
LAYERS USED TO BUILD CNN
• Convolutional layer
• Pooling layer
• Fully-connected (FC) layer
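These three layer types are typically stacked in that order. A minimal sketch written with the Keras API; the 28x28 grayscale input, filter count, and 10-class softmax output are illustrative assumptions, not values from the slides:

import tensorflow as tf
from tensorflow.keras import layers, models

# Convolution extracts local features, pooling downsamples them,
# and Flatten + Dense (fully-connected) produce the class scores.
model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),                      # e.g. grayscale 28x28 images (assumed)
    layers.Conv2D(16, kernel_size=3, activation="relu"),  # convolutional layer
    layers.MaxPooling2D(pool_size=2),                     # pooling layer
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),               # fully-connected (FC) layer
])
model.summary()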
CONVOLUTIONAL LAYER
This is the first layer used to extract the various features from the input images. In this layer, we slide a filter (also called a kernel) over the input image to extract features.
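A minimal NumPy sketch of that sliding-window extraction; the 5x5 image and 3x3 vertical-edge kernel are made-up examples:

import numpy as np

image = np.arange(25, dtype=float).reshape(5, 5)   # toy 5x5 input image
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)       # vertical-edge detector

# Slide the kernel over the image (stride 1, no padding) and take
# the elementwise product-and-sum at each position.
out = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        out[i, j] = (image[i:i+3, j:j+3] * kernel).sum()
print(out)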
CONVOLUTIONAL LAYER CONT'D
POOLING LAYER
CONVOLUTIONAL LAYER
FULLY-CONNECTED LAYER
DROPOUT
ACTIVATION FUNCTION
ACTIVATION FUNCTION CONT'D
• Sigmoid: used for binary classification in the CNN model.
• tanh: very similar to the sigmoid function, except that it is symmetric around the origin; its output ranges from -1 to 1.
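A quick numeric sketch of the two functions (the sample inputs are made up):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))   # squashes values into (0, 1)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))    # ~[0.12, 0.5, 0.88], all positive
print(np.tanh(z))    # ~[-0.96, 0.0, 0.96], symmetric around the origin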
ACTIVATION FUNCTION CONT'D
• Softmax: used in multinomial logistic regression, and often used as the last activation function of a neural network to normalize the output into a probability distribution over the predicted output classes.
• ReLU: its main advantage over other activation functions is that it does not activate all the neurons at the same time.
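A numeric sketch of both (the logits are made-up values):

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

def relu(z):
    return np.maximum(0, z)

logits = np.array([2.0, 1.0, -1.0, 0.5])
print(softmax(logits))   # non-negative, sums to 1: a probability distribution
print(relu(logits))      # negative inputs map to 0, so those neurons stay inactive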
USE CASES
DEEP LEARNING ALGORITHMS
DEEP LEARNING FRAMEWORKS
• TensorFlow
• Keras
• PyTorch
• Theano
• Caffe
• Deeplearning4j
• MXNet
• Chainer
END OF PRESENTATION
THANK YOU