Week_2
List of Experiments contd.
Week #  Experiment Title
8       Convolutional Neural Networks for Image Classification (Oxford Pets, Tiny ImageNet, etc.)
9       Recurrent Neural Networks for Sentiment Analysis with IMDB Movie Reviews
10      Long Short-Term Memory for Stock Prices (Yahoo Finance API)
[Figure: a single perceptron. Inputs x₀, x₁, x₂, x₃, x₄ with weights w₀ = 1, w₁, w₂, w₃, w₄ feed the weighted sum z = Σᵢ₌₀⁴ xᵢwᵢ, which is passed through the activation to give the output y′ = φ(z).]
Perceptron Learning Algorithm
Inputs: X = (x₁, x₂, x₃, ⋯, xₙ), W = (w₁, w₂, w₃, ⋯, wₙ), b
Output: y′
• Step 1: Initialize the weights W and bias b to small random values
• Step 2: For each input vector x, compute the weighted sum:
  z = W · X + b
• Step 3: Apply an activation function φ(z):
  y′ = φ(z) = 1 if z ≥ 0, 0 otherwise
• Step 4: For each training example, update the weights based on the error:
  wᵢ = wᵢ + η(y − y′)xᵢ
• Step 5: Update the bias:
  b = b + η(y − y′)
• Step 6: Repeat Steps 2-5 for a specified number of epochs or until convergence
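As a complement to these steps, here is a minimal NumPy sketch of the algorithm; the learning rate, epoch count, and the toy AND-gate data are illustrative assumptions rather than values from the slides.

import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=20):
    # Step 1: initialize weights W and bias b to small random values
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            # Step 2: weighted sum z = W . x + b
            z = np.dot(w, x_i) + b
            # Step 3: step activation phi(z)
            y_pred = 1 if z >= 0 else 0
            # Steps 4-5: update weights and bias from the error (y - y')
            w += eta * (y_i - y_pred) * x_i
            b += eta * (y_i - y_pred)
    return w, b

# Illustrative usage: learn the AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(w, b)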
Deep Learning Frameworks
• TensorFlow is an open-source platform for creating machine and deep learning models
  - Developed by the Google Brain team
  - Open-source
  - Uses data flow graphs
  - Has a large ecosystem and community
  - Good for production deployment
• Keras is a high-level, user-friendly API for building and training neural networks
  - Originally, it could run on top of TensorFlow, Theano, or CNTK
  - Offers a wide range of built-in layers including dense, convolutional, recurrent, and more
  - Since TensorFlow 2.0, Keras has become the primary high-level API of TensorFlow
• PyTorch is an open-source deep learning framework that provides a Python and C++ interface
  - Developed by Facebook's AI Research lab
  - PyTorch operates on tensors, which are like NumPy arrays
  - Several powerful libraries and tools are built on top of PyTorch
• Theano is a Python library for fast numerical computation that can be run on both the CPU and GPU
  - Can leverage the GPU for faster numerical computation
  - Theano is lower-level and requires more detailed programming
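To illustrate the remark that PyTorch tensors behave much like NumPy arrays, here is a small sketch (assuming torch and numpy are installed) running the same computation in both:

import numpy as np
import torch

# The same elementwise computation on a NumPy array and a PyTorch tensor
a_np = np.array([1.0, 2.0, 3.0])
a_pt = torch.tensor([1.0, 2.0, 3.0])

print(a_np * 2 + 1)   # NumPy: [3. 5. 7.]
print(a_pt * 2 + 1)   # PyTorch: tensor([3., 5., 7.])

# Tensors can be moved to a GPU when one is available
if torch.cuda.is_available():
    a_pt = a_pt.to("cuda")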
PyTorch or TensorFlow?
Key Factors | PyTorch | TensorFlow
Training Set: The model learns patterns, weights, and biases by minimizing the loss function on this data.
Validation Set: Used to evaluate the model's performance during training.
Test Set: Provides an estimate of how the model will perform on completely new, unseen data.
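One common way to obtain such a three-way split is applying scikit-learn's train_test_split twice; the sketch below is illustrative, and the synthetic data, the 60/20/20 ratio, and the random_state are assumptions, not values from the slides.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Illustrative synthetic data; any feature matrix X and labels y work the same way
X, y = make_classification(n_samples=1000, n_features=4, random_state=42)

# First carve out the test set, then split the remainder into train and validation
X_temp, X_test, y_temp, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X_temp, y_temp, test_size=0.25, random_state=42)
# 0.25 of the remaining 80% yields a 60/20/20 train/validation/test split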
Perceptron in TensorFlow
Creating the Perceptron Model

import tensorflow as tf

def create_perceptron(learning_rate=0.01):
    # A fully connected (Dense) layer with one neuron;
    # the input shape is specified to have 4 features
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(4,))
    ])
    # The loss function used is binary cross-entropy
    model.compile(
        optimizer=tf.keras.optimizers.SGD(learning_rate=learning_rate),
        loss='binary_crossentropy',
        metrics=['accuracy']
    )
    return model
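The plotting code that follows assumes a history object returned by model.fit; a minimal training sketch that would produce it is shown here, with the synthetic 4-feature data, epoch count, and validation_split chosen purely for illustration.

import numpy as np
import tensorflow as tf

# Illustrative synthetic data with 4 features, matching input_shape=(4,)
rng = np.random.default_rng(42)
X_train = rng.normal(size=(200, 4)).astype("float32")
y_train = (X_train.sum(axis=1) > 0).astype("float32")

model = create_perceptron(learning_rate=0.01)
history = model.fit(X_train, y_train, epochs=50, validation_split=0.2, verbose=0)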
Plotting Training-Validation Accuracy and Loss

import matplotlib.pyplot as plt

# Plot accuracy
plt.subplot(1, 2, 1)
plt.plot(history.history['accuracy'], label='Training Accuracy')
plt.plot(history.history['val_accuracy'], label='Validation Accuracy')
plt.title('Perceptron Accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()

# Plot loss
plt.subplot(1, 2, 2)
plt.plot(history.history['loss'], label='Training Loss')
plt.plot(history.history['val_loss'], label='Validation Loss')
plt.title('Binary Cross-Entropy Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()

plt.tight_layout()
plt.show()
Perceptron in TensorFlow
Evaluating the Perceptron

def evaluate_perceptron(model, X_test, y_test):
    # Evaluates the model on the test data and returns the loss and accuracy
    loss, accuracy = model.evaluate(X_test, y_test)
    print(f"\nTest Loss: {loss:.4f}")
    print(f"Test Accuracy: {accuracy:.4f}")

    # Converts the predicted probabilities to binary class labels (0 or 1)
    # based on a threshold of 0.5
    predictions = (model.predict(X_test) >= 0.5).astype(int)

    print("\nSample Predictions (First 10 instances):")
    for i in range(10):
        print(f"True: {y_test[i]}, Predicted: {predictions[i][0]}")
Objective: Use the Perceptron classifier from scikit-learn to classify the Iris dataset
(use data points of any two classes of your choice and perform the classification).
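One possible sketch of this exercise, assuming the first two Iris classes (setosa and versicolor) as the chosen pair and illustrative hyperparameters:

from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load Iris and keep only two classes (labels 0 and 1: setosa and versicolor)
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask], y[mask]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Illustrative hyperparameters for the scikit-learn Perceptron
clf = Perceptron(max_iter=1000, eta0=0.01, random_state=42)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("Test accuracy:", accuracy_score(y_test, y_pred))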