
Neural Network with Two Hidden Layers

Mian Saad Karim


October 15, 2024

Introduction
This document describes the feedforward and backpropagation process for a
neural network with two hidden layers. The network architecture consists of an
input layer, two hidden layers, and an output layer. The figure below represents
the general structure of this network:

X1    X2   ...   Xn      (input layer)
  |     |         |
Z1    Z2   ...   Zp      (first hidden layer)
  |     |         |
H1    H2   ...   Hq      (second hidden layer)
  |     |         |
Y1    Y2   ...   Ym      (output layer)

Each unit in one layer is connected to every unit in the next layer.

Network Architecture
- Input layer: n input units X1, X2, ..., Xn
- First hidden layer: p hidden units Z1, Z2, ..., Zp
- Second hidden layer: q hidden units H1, H2, ..., Hq
- Output layer: m output units Y1, Y2, ..., Ym
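
To make the notation concrete, here is a minimal sketch of the four layers as NumPy vectors. The sizes n = 3, p = 4, q = 4, m = 2 are placeholders chosen for illustration only, not values from the report.

```python
import numpy as np

# Placeholder layer sizes (illustrative only).
n, p, q, m = 3, 4, 4, 2

x = np.zeros(n)  # input vector        X1..Xn
z = np.zeros(p)  # first hidden layer  Z1..Zp
h = np.zeros(q)  # second hidden layer H1..Hq
y = np.zeros(m)  # output layer        Y1..Ym
```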

Feedforward Process
Step 0: Initialize weights
Randomly initialize the weights connecting the layers, along with the bias terms v0j, w0k, and u0m that appear in the sums of Steps 4-6:

- vij (weights between input and first hidden layer)
- wjk (weights between first and second hidden layer)
- ukm (weights between second hidden layer and output layer)
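
A minimal initialization sketch in NumPy. The uniform range [-0.5, 0.5] and the layer sizes are illustrative assumptions; the report does not specify either.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n, p, q, m = 3, 4, 4, 2  # placeholder sizes, as above

# Index order follows the text: v[i, j], w[j, k], u[k, m].
v = rng.uniform(-0.5, 0.5, size=(n, p))  # input -> first hidden
w = rng.uniform(-0.5, 0.5, size=(p, q))  # first -> second hidden
u = rng.uniform(-0.5, 0.5, size=(q, m))  # second hidden -> output

v0 = rng.uniform(-0.5, 0.5, size=p)  # biases v_{0j}
w0 = rng.uniform(-0.5, 0.5, size=q)  # biases w_{0k}
u0 = rng.uniform(-0.5, 0.5, size=m)  # biases u_{0m}
```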

Step 1: Check stopping condition
Continue while the stopping condition is false.

Step 2: For each training pair, do Steps 3-10.
Step 3: Input layer to first hidden layer
Each input unit Xi broadcasts its signal xi to all units in the first hidden layer.
Step 4: First hidden layer processing
Each first hidden layer unit Zj computes the weighted sum of its inputs:

\[ z_{in,j} = v_{0j} + \sum_{i=1}^{n} x_i v_{ij} \]

The activation function f is applied:

\[ z_j = f(z_{in,j}) \]

The outputs zj are sent to all units in the second hidden layer.
Step 5: Second hidden layer processing
Each second hidden layer unit Hk computes the weighted sum of its inputs:

\[ h_{in,k} = w_{0k} + \sum_{j=1}^{p} z_j w_{jk} \]

The activation function f is applied:

\[ h_k = f(h_{in,k}) \]

The outputs hk are sent to the output layer.
Step 6: Output layer processing
Each output unit Ym computes the weighted sum of its inputs:

\[ y_{in,m} = u_{0m} + \sum_{k=1}^{q} h_k u_{km} \]

The activation function f is applied:

\[ y_m = f(y_{in,m}) \]

A code sketch covering Steps 3-6 follows this step.
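
The sums in Steps 4-6 vectorize directly over each layer. Below is a minimal sketch assuming the logistic sigmoid for f (the report leaves f unspecified) and the weight and bias arrays from the initialization sketch above.

```python
import numpy as np

def f(s):
    """Assumed activation: logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-s))

def feedforward(x, v, v0, w, w0, u, u0):
    """Steps 3-6: propagate an input vector x to the output layer.

    Returns the net inputs as well as the activations of every layer,
    since backpropagation (Steps 7-9) needs the net inputs for f'."""
    z_in = v0 + x @ v   # z_in[j] = v0[j] + sum_i x[i] v[i, j]
    z = f(z_in)         # Step 4
    h_in = w0 + z @ w   # h_in[k] = w0[k] + sum_j z[j] w[j, k]
    h = f(h_in)         # Step 5
    y_in = u0 + h @ u   # y_in[m] = u0[m] + sum_k h[k] u[k, m]
    y = f(y_in)         # Step 6
    return z_in, z, h_in, h, y_in, y
```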

Backpropagation Process
Step 7: Calculate output error
For each output unit Ym, compute the error term, where dm is the target output for that unit:

\[ \delta_m = (d_m - y_m) f'(y_{in,m}) \]

Step 8: Backpropagate error to second hidden layer
For each second hidden layer unit Hk, compute the error term by summing the weighted error contributions of all m output units:

\[ \delta_k = \left( \sum_{m} \delta_m u_{km} \right) f'(h_{in,k}) \]

Step 9: Backpropagate error to first hidden layer
For each first hidden layer unit Zj, compute the error term:

\[ \delta_j = \left( \sum_{k=1}^{q} \delta_k w_{jk} \right) f'(z_{in,j}) \]

A code sketch covering Steps 7-9 follows this step.
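
For the sigmoid assumed above, f'(s) = f(s)(1 - f(s)), so the derivatives in Steps 7-9 are cheap to evaluate. A sketch building on the feedforward function, with d as the target vector:

```python
def fprime(s):
    """Derivative of the assumed sigmoid: f'(s) = f(s)(1 - f(s))."""
    fs = f(s)
    return fs * (1.0 - fs)

def backprop_deltas(d, y, y_in, h_in, z_in, u, w):
    """Steps 7-9: error terms for the output and both hidden layers."""
    delta_out = (d - y) * fprime(y_in)            # Step 7: delta_m
    delta_h2 = (delta_out @ u.T) * fprime(h_in)   # Step 8: delta_k
    delta_h1 = (delta_h2 @ w.T) * fprime(z_in)    # Step 9: delta_j
    return delta_out, delta_h2, delta_h1
```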

Step 10: Update weights

- Update weights from second hidden to output layer:

\[ u_{km} \leftarrow u_{km} + \eta \delta_m h_k \]

- Update weights from first hidden to second hidden layer:

\[ w_{jk} \leftarrow w_{jk} + \eta \delta_k z_j \]

- Update weights from input to first hidden layer:

\[ v_{ij} \leftarrow v_{ij} + \eta \delta_j x_i \]

where η is the learning rate. The bias terms are updated analogously, with the corresponding activation replaced by 1 (for example, v_{0j} ← v_{0j} + η δ_j).
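
Each update rule in Step 10 is an outer product between an error term and the activations feeding that layer, with the bias updates using an implicit activation of 1. A sketch continuing from the previous functions; the default learning rate of 0.1 is an illustrative choice:

```python
import numpy as np

def update_weights(x, z, h, deltas, params, eta=0.1):
    """Step 10: update all weights and biases with learning rate eta."""
    delta_out, delta_h2, delta_h1 = deltas
    v, v0, w, w0, u, u0 = params

    u += eta * np.outer(h, delta_out)  # u[k, m] += eta * delta_m * h[k]
    w += eta * np.outer(z, delta_h2)   # w[j, k] += eta * delta_k * z[j]
    v += eta * np.outer(x, delta_h1)   # v[i, j] += eta * delta_j * x[i]

    u0 += eta * delta_out              # bias updates: activation is 1
    w0 += eta * delta_h2
    v0 += eta * delta_h1
    return v, v0, w, w0, u, u0
```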


Step 11: Repeat Steps 3-10 for all training pairs.
Step 12: Check the stopping condition (for example, total error below a threshold or a maximum number of epochs reached).
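
Putting the sketches together, here is a toy training loop on XOR. The dataset, learning rate, epoch cap, and error threshold are all illustrative choices rather than values from the report, and the functions are the ones defined in the sketches above.

```python
import numpy as np

# Toy dataset: XOR (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)

n, p, q, m = 2, 4, 4, 1
rng = np.random.default_rng(0)
v, w, u = (rng.uniform(-0.5, 0.5, s) for s in [(n, p), (p, q), (q, m)])
v0, w0, u0 = (rng.uniform(-0.5, 0.5, s) for s in [p, q, m])

for epoch in range(10000):                 # Step 12: epoch cap
    total_error = 0.0
    for x, d in zip(X, D):                 # Steps 2 and 11
        z_in, z, h_in, h, y_in, y = feedforward(x, v, v0, w, w0, u, u0)
        deltas = backprop_deltas(d, y, y_in, h_in, z_in, u, w)
        v, v0, w, w0, u, u0 = update_weights(
            x, z, h, deltas, (v, v0, w, w0, u, u0))
        total_error += float(np.sum((d - y) ** 2))
    if total_error < 1e-3:                 # Step 12: error threshold
        break
```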
