Deep Learning: Unit 1
Deep Learning has its roots in the 1940s with the invention of the McCulloch-Pitts neuron,
a mathematical model of a biological neuron. Over the decades, advances have been made
in both theoretical and computational aspects, such as:
• 1940s: McCulloch-Pitts Neuron laid the foundation for artificial neural networks.
• 1950s-60s: The Perceptron algorithm was developed, showing that machines could learn
from data.
• 2010s and beyond: Modern frameworks such as TensorFlow and PyTorch emerged, and
applications in image recognition, natural language processing, and autonomous
systems soared.
Deep learning has evolved into a critical tool for solving complex, real-world problems,
leveraging massive datasets and advanced architectures.
Deep Learning is a specialized area within Machine Learning that focuses on neural networks with
multiple layers (deep architectures). These networks aim to mimic the human brain's ability to
learn from data, identify patterns, and make decisions. By leveraging vast datasets and
computational power, deep learning excels at tasks like image recognition, natural language
processing, and autonomous systems.
McCulloch-Pitts Neuron
• Components:
o Binary inputs (0 or 1), which may be excitatory or inhibitory.
o A fixed threshold θ.
o A single binary output (0 or 1).
• Structure:
o Aggregation: g(x) = Σ xᵢ sums the binary inputs.
o Thresholding: output y = 1 if g(x) ≥ θ, else y = 0; any active inhibitory input forces y = 0.
• Key Features:
o Can implement Boolean functions such as AND, OR, and NOT by choosing θ appropriately.
o No learned weights; all parameters are set by hand.
• Universal Approximation Theorem: MLPs with sufficient neurons and layers can
approximate any continuous function.
• Challenges:
o Handles only Boolean inputs and outputs.
o Cannot learn from data, and a single unit cannot represent non-linearly separable
functions such as XOR.
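The thresholding rule above can be sketched in a few lines of Python; the thresholds shown for AND and OR are the standard illustrative choices:

```python
def mp_neuron(inputs, threshold):
    """McCulloch-Pitts neuron: fire (1) iff the sum of binary inputs meets the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# AND over two inputs: fire only when both are 1 (threshold = 2)
print([mp_neuron([a, b], 2) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]

# OR over two inputs: fire when at least one input is 1 (threshold = 1)
print([mp_neuron([a, b], 1) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 1]
```

Note that XOR has no single threshold that works, which is exactly the limitation listed above.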
Sigmoid Neurons
• Idea: Replace the hard threshold with the smooth sigmoid function σ(z) = 1 / (1 + e⁻ᶻ),
giving real-valued outputs in (0, 1) and a differentiable activation that supports
gradient-based learning.
• Drawbacks:
o Saturation: Gradients become very small for extreme input values (vanishing
gradient problem).
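The saturation drawback is easy to see numerically: the sigmoid's derivative peaks at 0.25 and collapses toward zero for large inputs, which is the vanishing gradient problem in miniature.

```python
import math

def sigmoid(z):
    """Sigmoid activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    """Derivative sigma'(z) = sigma(z) * (1 - sigma(z)); maximum is 0.25 at z = 0."""
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid_grad(0.0))    # 0.25 — largest possible gradient
print(sigmoid_grad(10.0))   # ~4.5e-05 — saturated, almost no learning signal
```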
Feedforward Neural Networks
• Features:
o Information flows in one direction, from the input layer through hidden layers to
the output layer.
o No cycles or feedback loops.
• Workflow:
o Each layer computes a weighted sum of its inputs, adds a bias, and applies an
activation function; the result is passed to the next layer.
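The layer-by-layer workflow can be sketched as a forward pass in NumPy; the sigmoid at every layer and the layer sizes here are illustrative choices, not requirements:

```python
import numpy as np

def forward(x, layers):
    """One forward pass through a feedforward network.

    `layers` is a list of (W, b) pairs; each layer applies a weighted sum,
    a bias, and a sigmoid activation, then feeds the next layer."""
    a = x
    for W, b in layers:
        a = 1.0 / (1.0 + np.exp(-(W @ a + b)))  # weighted sum + bias, then activation
    return a

rng = np.random.default_rng(0)
layers = [
    (rng.standard_normal((4, 3)), rng.standard_normal(4)),  # 3 inputs -> 4 hidden units
    (rng.standard_normal((1, 4)), rng.standard_normal(1)),  # 4 hidden -> 1 output
]
out = forward(np.array([0.5, -0.2, 0.1]), layers)
print(out)  # a single value in (0, 1)
```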
Backpropagation
• Purpose: Train neural networks by minimizing the error between predicted and actual
outputs.
• Steps:
1. Forward Pass: Propagate inputs through the network to compute predictions and
the loss.
2. Backward Pass: Calculate gradients of the loss function with respect to weights
using the chain rule.
3. Weight Update: Adjust each weight opposite its gradient (e.g., gradient descent).
Weight Initialization
• Methods:
o Xavier Initialization: Scales weights based on the number of input and output
neurons, keeping activation variance stable across layers.
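The three backpropagation steps, together with Xavier-style initialization, can be sketched on a single linear neuron; the toy target y = x₀ + x₁, the learning rate, and the iteration count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Xavier (Glorot) initialization: variance scaled by fan-in + fan-out so early
# activations neither explode nor vanish.
n_in, n_out = 2, 1
W = rng.normal(0.0, np.sqrt(2.0 / (n_in + n_out)), size=(n_out, n_in))
b = np.zeros(n_out)

# Tiny regression dataset: learn y = x0 + x1 with a single linear neuron.
X = rng.standard_normal((100, n_in))
y = X.sum(axis=1, keepdims=True)

lr = 0.1
for _ in range(200):
    y_hat = X @ W.T + b            # 1. forward pass: compute predictions
    err = y_hat - y                # prediction error (d loss / d y_hat for MSE)
    grad_W = err.T @ X / len(X)    # 2. backward pass: chain rule gives the gradient
    grad_b = err.mean(axis=0)
    W -= lr * grad_W               # 3. weight update: step against the gradient
    b -= lr * grad_b

print(np.round(W, 2), np.round(b, 2))  # weights approach [1, 1], bias approaches 0
```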
Batch Normalization
• Idea: Normalize each layer's inputs over a mini-batch to zero mean and unit variance,
then rescale and shift with learnable parameters γ and β.
• Benefits:
o Faster, more stable training, with tolerance for higher learning rates.
o Reduces sensitivity to weight initialization and adds a mild regularizing effect.
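A minimal sketch of the normalize-then-rescale computation (training-mode statistics only; the batch size and feature distribution are arbitrary for illustration):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization: normalize each feature over the mini-batch,
    then rescale with learnable gamma and shift with learnable beta."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(5.0, 3.0, size=(64, 2))  # batch of 64 samples, 2 features
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
print(np.round(out.mean(axis=0), 4), np.round(out.std(axis=0), 4))
# per-feature mean ~0 and std ~1 after normalization
```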
Representation Learning
• Idea: Instead of hand-crafting features, let the network learn useful representations of
raw data directly.
• Advantages:
o Removes most manual feature engineering.
o Learned features often transfer to related tasks.
• Examples:
o Word embeddings in natural language processing.
o Convolutional features (edges, textures, object parts) learned from raw pixels.
o Autoencoders that learn compact codes of their inputs.
GPU Implementation
• Why GPUs?
o GPUs excel at parallel processing, making them ideal for matrix computations in
deep learning.
• Frameworks:
o TensorFlow, PyTorch, and Keras provide GPU support for faster training.
• Impact:
o Training times drop from weeks to days or hours, making large models and
datasets practical.