ciea_assignment5
Introduction
This document describes the feedforward and backpropagation process for a
neural network with two hidden layers. The network architecture consists of an
input layer, two hidden layers, and an output layer. The figure below represents
the general structure of this network:
X1 X2 Xn
| | |
Z1------Z2------Zp
| | |
H1------H2------Hq
| | |
Y1 Y2 Ym
Network Architecture
- Input layer: n input units X1, X2, ..., Xn
- First hidden layer: p hidden units Z1, Z2, ..., Zp
- Second hidden layer: q hidden units H1, H2, ..., Hq
- Output layer: m output units Y1, Y2, ..., Ym
Feedforward Process
1. Step 0: Initialize weights
Randomly initialize the weights connecting the layers (input to first hidden, first hidden to second hidden, and second hidden to output).
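As a sketch of Step 0, the layer weights can be initialized with small random values; the layer sizes and the range [-0.5, 0.5] below are illustrative assumptions (the assignment does not fix them), and each matrix carries an extra row 0 for the bias terms v0j / w0k:

```python
import numpy as np

# Illustrative layer sizes: n inputs, p and q hidden units, m outputs
n, p, q, m = 3, 4, 4, 2

rng = np.random.default_rng(0)
# Row 0 of each matrix holds the bias weights (v0j, w0k, ...)
V = rng.uniform(-0.5, 0.5, size=(n + 1, p))  # input -> first hidden layer
W = rng.uniform(-0.5, 0.5, size=(p + 1, q))  # first hidden -> second hidden layer
U = rng.uniform(-0.5, 0.5, size=(q + 1, m))  # second hidden -> output layer
```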
3. Step 2: For each training pair, do Steps 3-6
4. Step 3: Input layer to first hidden layer. Each input unit Xi broadcasts its signal xi to all units in the first hidden layer.
5. Step 4: First hidden layer processing. Each first hidden layer unit Zj computes the weighted sum of its inputs:

$$z_{in,j} = v_{0j} + \sum_{i=1}^{n} x_i v_{ij}$$
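The weighted sum in Step 4 can be sketched in NumPy as follows; the weight and input values are illustrative, and the sigmoid activation is an assumption (the section does not fix the activation function f):

```python
import numpy as np

def sigmoid(x):
    # Common choice for f in this setting (an assumption here)
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative weights: row 0 is the bias v0j, the rest are vij (n=2, p=2)
v = np.array([[0.1, -0.2],
              [0.3,  0.4],
              [-0.5, 0.6]])
x = np.array([1.0, 2.0])  # input signals xi

z_in = v[0] + x @ v[1:]   # z_in_j = v0j + sum_i xi * vij
z = sigmoid(z_in)         # activation zj of each first hidden unit Zj
```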
Backpropagation Process
1. Step 7: Calculate output error
For each output unit Ym, compute the error:

$$\delta_m = (d_m - y_m)\, f'(y_{in,m})$$
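Step 7 can be sketched as below; the targets, net inputs, and the sigmoid (whose derivative is f'(x) = f(x)(1 - f(x))) are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative values for m=2 output units
y_in = np.array([0.5, -0.3])   # net inputs y_in_m
y = sigmoid(y_in)              # outputs ym
d = np.array([1.0, 0.0])       # targets dm

# delta_m = (dm - ym) * f'(y_in_m); for sigmoid, f'(y_in) = y * (1 - y)
delta_out = (d - y) * y * (1 - y)
```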
3. Step 9: Backpropagate error to first hidden layer
For each first hidden layer unit Zj , compute the error:
$$\delta_j = \left( \sum_{k=1}^{q} \delta_k w_{jk} \right) f'(z_{in,j})$$
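A sketch of Step 9, under the same illustrative assumptions (sigmoid activation; hypothetical values for the second-hidden-layer errors, the weights w_jk, and the first-hidden-layer net inputs):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative values: q=2 second-hidden errors, p=2 first-hidden units
delta_k = np.array([0.1, -0.05])   # errors of second hidden units Hk
w = np.array([[0.2, -0.4],
              [0.7,  0.1]])        # w[j][k]: weight from Zj to Hk, shape (p, q)
z_in = np.array([0.3, -0.6])       # net inputs z_in_j of first hidden units
z = sigmoid(z_in)

# delta_j = (sum_k delta_k * w_jk) * f'(z_in_j), with sigmoid f' = z * (1 - z)
delta_j = (w @ delta_k) * z * (1 - z)
```

The matrix-vector product `w @ delta_k` computes the sum over k for every unit Zj at once.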