EPS-DL-Handout4 - Steps to Build ANN From Scratch
28-10-2024
By Asha
Building an ANN from Scratch
Neural Networks, Activation Functions, Loss Functions, and Optimizers
What are Neural Networks?
A neural network is a collection of interconnected units (neurons) arranged in layers. Each neuron takes a weighted sum of its inputs, adds a bias, and passes the result through an activation function. The role of activation functions is to introduce non-linearity, allowing the network to model complex data patterns.
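As a minimal sketch of this computation (the input values, weights, and the choice of a sigmoid activation here are illustrative, not from the handout):

import numpy as np

def sigmoid(z):
    # Squashes any real value into the range (0, 1), adding non-linearity
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values: 3 inputs, 3 weights, 1 bias
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.4, 0.7, -0.2])   # weights (learned during training)
b = 0.1                          # bias (learned during training)

z = np.dot(w, x) + b   # weighted sum of inputs plus the bias
a = sigmoid(z)         # activation: the neuron's output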
Step 3: Forward Propagation – Generating Predictions
Once the architecture is set, the data is fed forward through the network:
Each neuron in the hidden layers calculates the weighted sum of its inputs, adds the bias, and applies the activation function.
The final layer outputs a prediction, often using activation functions like Softmax for multi-class classification or Sigmoid for binary classification.
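A possible sketch of a forward pass for a network with one hidden layer, using the sigmoid and Softmax activations mentioned above (the layer sizes and random initialization are assumptions for illustration):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(X, W1, b1, W2, b2):
    # Hidden layer: weighted sum + bias, then activation
    h = sigmoid(X @ W1 + b1)
    # Output layer: Softmax turns raw scores into class probabilities
    probs = softmax(h @ W2 + b2)
    return h, probs

# Illustrative shapes: 4 features, 5 hidden units, 3 classes
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 5)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(5, 3)) * 0.1, np.zeros(3)
h, probs = forward(rng.normal(size=(2, 4)), W1, b1, W2, b2)  # 2 samples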
Step 4: Define a Loss Function
The loss function quantifies how far off the network's predictions are from the actual values. Common loss functions include Mean Squared Error (MSE) for regression tasks and Cross-Entropy Loss for classification tasks. The goal is to minimize this loss by adjusting the network's weights and biases.
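A sketch of both losses, assuming the predictions come from the forward pass above (the epsilon guard is an implementation detail added here, not part of the handout):

import numpy as np

def cross_entropy(probs, y_true):
    # probs: predicted class probabilities, shape (n_samples, n_classes)
    # y_true: integer class labels, shape (n_samples,)
    eps = 1e-12  # guards against log(0)
    picked = probs[np.arange(len(y_true)), y_true]
    return -np.mean(np.log(picked + eps))

def mse(y_pred, y_true):
    # Mean Squared Error, the usual choice for regression
    return np.mean((y_pred - y_true) ** 2)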
Step 5: Backpropagation – Learning from Errors
After generating predictions, the network needs to learn from its mistakes.
Backpropagation is the process of calculating the gradient of the loss function
with respect to each weight in the network, using the chain rule.
This gradient indicates how each weight should be adjusted to reduce the
overall loss.
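One way this looks for the one-hidden-layer network sketched earlier, assuming a Softmax output with Cross-Entropy Loss (under that assumption the output-layer gradient simplifies to the predicted probabilities minus the one-hot labels):

import numpy as np

def backward(X, y, h, probs, W2):
    # Gradient of cross-entropy w.r.t. the Softmax inputs: probs - one_hot(y)
    n = len(y)
    d_out = probs.copy()
    d_out[np.arange(n), y] -= 1.0
    d_out /= n

    # Chain rule back through the output layer
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)

    # Chain rule back through the hidden layer (sigmoid derivative: h * (1 - h))
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    return dW1, db1, dW2, db2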
Step 6: Use an Optimizer to Update Weights
The optimizer is responsible for adjusting the weights based on the gradients
from backpropagation. Popular optimizers include Stochastic Gradient Descent (SGD), SGD with momentum, RMSprop, and Adam. Optimizers are key to ensuring that the model converges to an optimal solution efficiently.
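The simplest of these, vanilla SGD, steps each parameter against its gradient, scaled by a learning rate. A minimal sketch:

def sgd_step(params, grads, lr=0.1):
    # Move each parameter opposite its gradient to reduce the loss;
    # the learning rate lr controls the step size
    return [p - lr * g for p, g in zip(params, grads)]

More sophisticated optimizers such as Adam adapt this step size per parameter, but the basic update pattern is the same.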
Step 7: Train the Network
The training process consists of repeatedly feeding the model mini-batches of data, performing forward and backward passes, and updating the weights to reduce the loss; one complete pass over the training set is called an epoch. After many epochs, the model learns to make accurate predictions.
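An illustrative training loop tying the sketches above together (X_train, y_train, and the hyperparameter values are hypothetical placeholders, not from the handout):

num_epochs, batch_size, lr = 20, 32, 0.5   # illustrative hyperparameters
for epoch in range(num_epochs):            # one epoch = one full pass over the data
    for start in range(0, len(X_train), batch_size):
        Xb = X_train[start:start + batch_size]   # one mini-batch of inputs
        yb = y_train[start:start + batch_size]   # matching labels
        h, probs = forward(Xb, W1, b1, W2, b2)   # forward pass
        loss = cross_entropy(probs, yb)          # measure the error
        grads = backward(Xb, yb, h, probs, W2)   # gradients via the chain rule
        W1, b1, W2, b2 = sgd_step([W1, b1, W2, b2], grads, lr)  # update weights
    print(f"epoch {epoch}: loss {loss:.4f}")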