Welcome To Colab - Colab

The document introduces Google Colab, an interactive environment for writing and executing Python code in the browser, offering features like zero configuration, free GPU access, and easy sharing. It highlights the Gemini API, which provides access to multimodal models for various tasks, and outlines how to get started with Colab, including importing data and utilizing machine learning libraries. Additionally, it includes examples and resources for data science and machine learning applications within Colab.


Welcome to Colab!


Explore the Gemini API
The Gemini API gives you access to Gemini models created by Google DeepMind. Gemini
models are built from the ground up to be multimodal, so you can reason seamlessly across
text, images, code and audio.

How to get started

Go to Google AI Studio and log in with your Google Account.


Create an API key.
Use a quickstart for Python (a minimal sketch follows below) or call the REST API using curl.
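
For reference, here is a minimal Python sketch, assuming the google-generativeai package is installed and that you have an API key from Google AI Studio (the model name is illustrative):

import google.generativeai as genai

# Configure the client with the key created in Google AI Studio
genai.configure(api_key="YOUR_API_KEY")

# Any available Gemini model name can be used here
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Explain what a Colab notebook is in one sentence.")
print(response.text)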

Discover Gemini's advanced capabilities

Play with Gemini multimodal outputs, mixing text and images in an iterative way.
Discover the multimodal Live API (demo here).
Learn how to analyse images and detect items in your pictures using Gemini (as a bonus,
there's a 3D version as well!).
Unlock the power of the Gemini thinking model, capable of solving complex tasks with its
inner thoughts.

Explore complex use cases

Use Gemini's grounding capabilities to create a report on a company based on what the
model can find on the Internet.
Extract invoices and form data from PDFs in a structured way.
Create illustrations for a whole book using Gemini's large context window and
Imagen.

To learn more, take a look at the Gemini cookbook or visit the Gemini API documentation.

Colab now has AI features powered by Gemini. The introductory video in the original notebook shows
how to use these features, whether you're new to Python or a seasoned veteran.


What is Colab?
Colab, or ‘Colaboratory’, allows you to write and execute Python in your browser, with

Zero configuration required


Access to GPUs free of charge
Easy sharing

Whether you're a student, a data scientist or an AI researcher, Colab can make your work easier.
Watch Introduction to Colab or Colab features you may have missed to learn more or just get
started below!

Getting started


The document that you are reading is not a static web page, but an interactive environment
called a Colab notebook that lets you write and execute code.

For example, here is a code cell with a short Python script that computes a value, stores it in a
variable and prints the result:

seconds_in_a_day = 24 * 60 * 60
seconds_in_a_day

86400

To execute the code in the above cell, select it with a click and then either press the play button
to the left of the code, or use the keyboard shortcut 'Command/Ctrl+Enter'. To edit the code, just
click the cell and start editing.

Variables that you define in one cell can later be used in other cells:

seconds_in_a_week = 7 * seconds_in_a_day
seconds_in_a_week

604800

Colab notebooks allow you to combine executable code and rich text in a single document,
along with images, HTML, LaTeX and more. When you create your own Colab notebooks, they
are stored in your Google Drive account. You can easily share your Colab notebooks with co-
workers or friends, allowing them to comment on your notebooks or even edit them. To find out
more, see Overview of Colab. To create a new Colab notebook you can use the File menu above,
or use the following link: Create a new Colab notebook.

Colab notebooks are Jupyter notebooks that are hosted by Colab. To find out more about the
Jupyter project, see jupyter.org.


Data science


With Colab you can harness the full power of popular Python libraries to analyse and visualise
data. The code cell below uses numpy to generate some random data, and uses matplotlib to
visualise it. To edit the code, just click the cell and start editing.

You can import your own data into Colab notebooks from your Google Drive account, including
from spreadsheets, as well as from GitHub and many other sources. To find out more about
importing data, and how Colab can be used for data science, see the links below under Working
with data.
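
As a minimal sketch of the two most common paths (both helpers come from the google.colab package, which is pre-installed in Colab; the CSV path is purely illustrative):

from google.colab import drive, files
import pandas as pd

# Mount your Google Drive under /content/drive
drive.mount('/content/drive')

# Or open a file picker and upload files from your machine
uploaded = files.upload()  # returns a dict of {filename: file contents}

# Read an (illustrative) spreadsheet export from Drive into a DataFrame
df = pd.read_csv('/content/drive/MyDrive/my_data.csv')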

import numpy as np
import IPython.display as display
from matplotlib import pyplot as plt
import io
import base64

# Generate 100 random points centred around 200
ys = 200 + np.random.randn(100)
x = [x for x in range(len(ys))]

fig = plt.figure(figsize=(4, 3), facecolor='w')

# Plot the series and shade the area above y=195 in green
plt.plot(x, ys, '-')
plt.fill_between(x, ys, 195, where=(ys > 195), facecolor='g', alpha=0.6)
plt.title("Sample Visualization", fontsize=10)

# Encode the figure as a base64 data URI and render it as Markdown
data = io.BytesIO()
plt.savefig(data)
image = F"data:image/png;base64,{base64.b64encode(data.getvalue()).decode()}"
alt = "Sample Visualization"
display.display(display.Markdown(F"""![{alt}]({image})"""))
plt.close(fig)


Colab notebooks execute code on Google's cloud servers, meaning that you can leverage the
power of Google hardware, including GPUs and TPUs, regardless of the power of your machine.
All you need is a browser.

For example, if you find yourself waiting for pandas code to finish running and want to go faster,
you can switch to a GPU runtime and use libraries like RAPIDS cuDF that provide zero-code-
change acceleration.
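
For example, a minimal sketch (assuming a GPU runtime where the cudf.pandas extension is available, as on recent Colab GPU images):

# Load the cuDF accelerator before importing pandas; the pandas code itself is unchanged
%load_ext cudf.pandas

import pandas as pd

df = pd.DataFrame({'group': ['a', 'b'] * 500_000, 'value': range(1_000_000)})
print(df.groupby('group')['value'].mean())  # runs on the GPU where supported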

To learn more about accelerating pandas on Colab, see the 10-minute guide or US stock market
data analysis demo.

Machine learning


With Colab you can import an image dataset, train an image classifier on it and evaluate the
model, all in just a few lines of code.
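
A minimal sketch of that workflow, using tf.keras and the built-in MNIST digits dataset (the architecture and epoch count are illustrative):

import tensorflow as tf

# Load and scale the image dataset
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Define, train and evaluate a small classifier
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=3, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))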

Colab is used extensively in the machine learning community with applications including:

Getting started with TensorFlow


Developing and training neural networks
Experimenting with TPUs
Disseminating AI research
Creating tutorials

To see sample Colab notebooks that demonstrate machine learning applications, see the
machine learning examples below.

More resources


Working with notebooks in Colab
Overview of Colab
Guide to markdown
Importing libraries and installing dependencies
Saving and loading notebooks in GitHub
Interactive forms
Interactive widgets

Working with data


Loading data: Drive, Sheets and Google Cloud Storage
Charts: visualising data


Getting started with BigQuery

Machine learning
These are a few of the notebooks related to machine learning, including Google's online
machine learning course. See the full course website for more.
Intro to Pandas DataFrame
Intro to RAPIDS cuDF to accelerate pandas
Getting started with cuML's accelerator mode
Linear regression with tf.keras using synthetic data

Using accelerated hardware


TensorFlow with GPUs
TPUs in Colab

Featured examples


Retraining an Image Classifier: Build a Keras model on top of a pre-trained image classifier
to distinguish flowers.
Text Classification: Classify IMDB film reviews as either positive or negative.
Style Transfer: Use deep learning to transfer style between images.
Multilingual Universal Sentence Encoder Q&A: Use a machine-learning model to answer
questions from the SQuAD dataset.
Video Interpolation: Predict what happened in a video between the first and the last frame.


# Google Colab Python Code - CORRECTED VERSION


!pip install tensorflow scikit-learn matplotlib seaborn pandas

import tensorflow as tf
from tensorflow.keras import layers, models, optimizers
from sklearn.metrics import confusion_matrix, accuracy_score, precision_score, recall_score, f1_score
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import itertools
import gc

# Clear any existing models


tf.keras.backend.clear_session()

# Set random seeds for reproducibility


np.random.seed(42)
tf.random.set_seed(42)

# 1. Load and preprocess dataset (MNIST)


print("Loading MNIST dataset...")
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 28*28).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28*28).astype('float32') / 255.0

print(f"Training data shape: {x_train.shape}")


print(f"Test data shape: {x_test.shape}")

# 2. Define hyperparameter grid


epochs_list = [10, 15, 20]
optimizer_names = ['SGD', 'Adam', 'RMSprop', 'Adagrad']
batch_sizes = [100, 1000, 5000]

# Function to create optimizer


def create_optimizer(name):
    if name == 'SGD':
        return optimizers.SGD(learning_rate=0.01)
    elif name == 'Adam':
        return optimizers.Adam(learning_rate=0.001)
    elif name == 'RMSprop':
        return optimizers.RMSprop(learning_rate=0.001)
    elif name == 'Adagrad':
        return optimizers.Adagrad(learning_rate=0.01)

# 3. Storage for results


results = []
experiment_count = 0
total_experiments = len(epochs_list) * len(optimizer_names) * len(batch_sizes)

print(f"Starting {total_experiments} experiments...")

# 4. Loop over all combinations


for epochs in epochs_list:
    for opt_name in optimizer_names:
        for batch_size in batch_sizes:
            experiment_count += 1
            print(f"\nExperiment {experiment_count}/{total_experiments}: "
                  f"Epochs={epochs}, Optimizer={opt_name}, Batch Size={batch_size}")

            # Clear session and create fresh model
            tf.keras.backend.clear_session()

            # Build model
            model = models.Sequential([
                layers.Input(shape=(28*28,)),
                layers.Dense(128, activation='relu'),
                layers.Dropout(0.2),
                layers.Dense(64, activation='relu'),
                layers.Dense(10, activation='softmax')
            ])

            # Create fresh optimizer instance
            optimizer = create_optimizer(opt_name)

            # Compile model
            model.compile(
                optimizer=optimizer,
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy']
            )

            try:
                # Train model
                history = model.fit(
                    x_train, y_train,
                    epochs=epochs,
                    batch_size=batch_size,
                    validation_split=0.1,
                    verbose=0
                )

                # Predict on test set
                y_pred_probs = model.predict(x_test, batch_size=1000, verbose=0)
                y_pred = np.argmax(y_pred_probs, axis=1)

                # Compute metrics
                cm = confusion_matrix(y_test, y_pred)
                acc = accuracy_score(y_test, y_pred)
                prec_micro = precision_score(y_test, y_pred, average='micro', zero_division=0)
                prec_macro = precision_score(y_test, y_pred, average='macro', zero_division=0)
                rec_micro = recall_score(y_test, y_pred, average='micro', zero_division=0)
                rec_macro = recall_score(y_test, y_pred, average='macro', zero_division=0)
                f1_micro = f1_score(y_test, y_pred, average='micro', zero_division=0)
                f1_macro = f1_score(y_test, y_pred, average='macro', zero_division=0)

                # Store results
                result_dict = {
                    'epochs': epochs,
                    'optimizer': opt_name,
                    'batch_size': batch_size,
                    'confusion_matrix': cm,
                    'accuracy': acc,
                    'precision_micro': prec_micro,
                    'precision_macro': prec_macro,
                    'recall_micro': rec_micro,
                    'recall_macro': rec_macro,
                    'f1_micro': f1_micro,
                    'f1_macro': f1_macro,
                    'final_train_acc': history.history['accuracy'][-1],
                    'final_val_acc': history.history['val_accuracy'][-1]
                }
                results.append(result_dict)

                print(f"Accuracy: {acc:.4f}, F1-macro: {f1_macro:.4f}")

                # Plot confusion matrix
                plt.figure(figsize=(8, 6))
                sns.heatmap(cm, annot=True, fmt='d', cmap='Blues',
                            xticklabels=range(10), yticklabels=range(10))
                plt.title(f'Confusion Matrix\nEpochs: {epochs}, '
                          f'Optimizer: {opt_name}, Batch Size: {batch_size}')
                plt.xlabel('Predicted Label')
                plt.ylabel('True Label')
                plt.tight_layout()
                plt.show()

            except Exception as e:
                print(f"Error in experiment: {str(e)}")
                continue

            # Clean up memory
            del model, optimizer
            gc.collect()

# 5. Create comprehensive comparison table


print("\n" + "="*80)
print("RESULTS SUMMARY")
print("="*80)

# Convert results to DataFrame


df_results = pd.DataFrame([{
'Epochs': r['epochs'],
'Optimizer': r['optimizer'],
'Batch_Size': r['batch_size'],
'Accuracy': r['accuracy'],
'Precision_Micro': r['precision_micro'],
'Precision_Macro': r['precision_macro'],
'Recall_Micro': r['recall_micro'],
'Recall_Macro': r['recall_macro'],
'F1_Micro': r['f1_micro'],
'F1_Macro': r['f1_macro'],
'Train_Acc': r['final_train_acc'],
'Val_Acc': r['final_val_acc']
} for r in results])

# Display detailed results table


pd.set_option('display.max_columns', None)
pd.set_option('display.width', None)
pd.set_option('display.precision', 4)

print("\nDetailed Results Table:")


print(df_results.round(4))

# Create pivot tables for easier comparison


print("\n" + "-"*60)
print("ACCURACY COMPARISON BY OPTIMIZER AND BATCH SIZE")
print("-"*60)
accuracy_pivot = df_results.pivot_table(
index=['Optimizer', 'Batch_Size'],
columns='Epochs',
values='Accuracy',
aggfunc='mean'
).round(4)
print(accuracy_pivot)


print("\n" + "-"*60)
print("F1-MACRO COMPARISON BY OPTIMIZER AND BATCH SIZE")
print("-"*60)
f1_pivot = df_results.pivot_table(
index=['Optimizer', 'Batch_Size'],
columns='Epochs',
values='F1_Macro',
aggfunc='mean'
).round(4)
print(f1_pivot)

# Find best performing combinations


print("\n" + "-"*60)
print("TOP 5 BEST PERFORMING COMBINATIONS (by Accuracy)")
print("-"*60)
best_results = df_results.nlargest(5, 'Accuracy')[['Epochs', 'Optimizer', 'Batch_Size', 'Accuracy', 'F1_Macro']]
print(best_results.round(4))

# Summary statistics
print("\n" + "-"*60)
print("SUMMARY STATISTICS")
print("-"*60)
print("Mean Accuracy:", df_results['Accuracy'].mean().round(4))
print("Std Accuracy:", df_results['Accuracy'].std().round(4))
print("Best Accuracy:", df_results['Accuracy'].max().round(4))
print("Worst Accuracy:", df_results['Accuracy'].min().round(4))

# Create comparison plots


plt.figure(figsize=(15, 10))

# Plot 1: Accuracy by Optimizer


plt.subplot(2, 3, 1)
df_results.boxplot(column='Accuracy', by='Optimizer', ax=plt.gca())
plt.title('Accuracy by Optimizer')
plt.suptitle('')

# Plot 2: Accuracy by Batch Size


plt.subplot(2, 3, 2)
df_results.boxplot(column='Accuracy', by='Batch_Size', ax=plt.gca())
plt.title('Accuracy by Batch Size')
plt.suptitle('')

# Plot 3: Accuracy by Epochs


plt.subplot(2, 3, 3)
df_results.boxplot(column='Accuracy', by='Epochs', ax=plt.gca())
plt.title('Accuracy by Epochs')
plt.suptitle('')

# Plot 4: F1-Macro by Optimizer


plt.subplot(2, 3, 4)
df_results.boxplot(column='F1_Macro', by='Optimizer', ax=plt.gca())
plt.title('F1-Macro by Optimizer')
plt.suptitle('')

# Plot 5: F1-Macro by Batch Size



plt.subplot(2, 3, 5)
df_results.boxplot(column='F1_Macro', by='Batch_Size', ax=plt.gca())
plt.title('F1-Macro by Batch Size')
plt.suptitle('')

# Plot 6: F1-Macro by Epochs


plt.subplot(2, 3, 6)
df_results.boxplot(column='F1_Macro', by='Epochs', ax=plt.gca())
plt.title('F1-Macro by Epochs')
plt.suptitle('')

plt.tight_layout()
plt.show()

# Heatmap of accuracy across all combinations


plt.figure(figsize=(12, 8))
heatmap_data = df_results.pivot_table(
index=['Optimizer'],
columns=['Epochs', 'Batch_Size'],
values='Accuracy'
)
sns.heatmap(heatmap_data, annot=True, fmt='.3f', cmap='viridis')
plt.title('Accuracy Heatmap: Optimizer vs (Epochs, Batch Size)')
plt.tight_layout()
plt.show()

print(f"\nCompleted all {len(results)} experiments successfully!")


Requirement already satisfied: tensorflow, scikit-learn, matplotlib, seaborn, pandas (and their dependencies) in /usr/local/lib/python3.11/dist-packages
Loading MNIST dataset...
Training data shape: (60000, 784)
Test data shape: (10000, 784)
Starting 36 experiments...

Experiment 1/36: Epochs=10, Optimizer=SGD, Batch Size=100



Accuracy: 0.9383, F1-macro: 0.9375

Experiment 2/36: Epochs=10, Optimizer=SGD, Batch Size=1000


Accuracy: 0.8451, F1-macro: 0.8409


Experiment 3/36: Epochs=10, Optimizer=SGD, Batch Size=5000


Accuracy: 0.5078, F1-macro: 0.4319

Experiment 4/36: Epochs=10, Optimizer=Adam, Batch Size=100


Accuracy: 0.9798, F1-macro: 0.9797


Experiment 5/36: Epochs=10, Optimizer=Adam, Batch Size=1000


Accuracy: 0.9707, F1-macro: 0.9705

Experiment 6/36: Epochs=10, Optimizer=Adam, Batch Size=5000


Accuracy: 0.9425, F1-macro: 0.9418


Experiment 7/36: Epochs=10, Optimizer=RMSprop, Batch Size=100


Accuracy: 0.9811, F1-macro: 0.9811

Experiment 8/36: Epochs=10, Optimizer=RMSprop, Batch Size=1000


Accuracy: 0.9718, F1-macro: 0.9716


Experiment 9/36: Epochs=10, Optimizer=RMSprop, Batch Size=5000


Accuracy: 0.9326, F1-macro: 0.9316

Experiment 10/36: Epochs=10, Optimizer=Adagrad, Batch Size=100


Accuracy: 0.9585, F1-macro: 0.9581


Experiment 11/36: Epochs=10, Optimizer=Adagrad, Batch Size=1000


Accuracy: 0.9069, F1-macro: 0.9052

Experiment 12/36: Epochs=10, Optimizer=Adagrad, Batch Size=5000


Accuracy: 0.8255, F1-macro: 0.8212


Experiment 13/36: Epochs=15, Optimizer=SGD, Batch Size=100


Accuracy: 0.9492, F1-macro: 0.9487

Experiment 14/36: Epochs=15, Optimizer=SGD, Batch Size=1000
Accuracy: 0.8796, F1-macro: 0.8773

Experiment 15/36: Epochs=15, Optimizer=SGD, Batch Size=5000


Accuracy: 0.6714, F1-macro: 0.6367


Experiment 16/36: Epochs=15, Optimizer=Adam, Batch Size=100


Accuracy: 0.9788, F1-macro: 0.9787

Experiment 17/36: Epochs=15, Optimizer=Adam, Batch Size=1000


Accuracy: 0.9735, F1-macro: 0.9733


Experiment 18/36: Epochs=15, Optimizer=Adam, Batch Size=5000


Accuracy: 0.9515, F1-macro: 0.9510

Experiment 19/36: Epochs=15, Optimizer=RMSprop, Batch Size=100


Accuracy: 0.9791, F1-macro: 0.9790


Experiment 20/36: Epochs=15, Optimizer=RMSprop, Batch Size=1000


Accuracy: 0.9762, F1-macro: 0.9760

Experiment 21/36: Epochs=15, Optimizer=RMSprop, Batch Size=5000


Accuracy: 0.9518, F1-macro: 0.9513


Experiment 22/36: Epochs=15, Optimizer=Adagrad, Batch Size=100


Accuracy: 0.9651, F1-macro: 0.9649

Experiment 23/36: Epochs=15, Optimizer=Adagrad, Batch Size=1000


Accuracy: 0.9186, F1-macro: 0.9174


Experiment 24/36: Epochs=15, Optimizer=Adagrad, Batch Size=5000


Accuracy: 0.8597, F1-macro: 0.8564

Experiment 25/36: Epochs=20, Optimizer=SGD, Batch Size=100


Accuracy: 0.9543, F1-macro: 0.9537


Experiment 26/36: Epochs=20, Optimizer=SGD, Batch Size=1000


Accuracy: 0.8960, F1-macro: 0.8942


Experiment 27/36: Epochs=20, Optimizer=SGD, Batch Size=5000


Accuracy: 0.7641, F1-macro: 0.7479

Experiment 28/36: Epochs=20, Optimizer=Adam, Batch Size=100


Accuracy: 0.9791, F1-macro: 0.9789


Experiment 29/36: Epochs=20, Optimizer=Adam, Batch Size=1000


Accuracy: 0.9789, F1-macro: 0.9787

Experiment 30/36: Epochs=20, Optimizer=Adam, Batch Size=5000


Accuracy: 0.9612, F1-macro: 0.9609


Experiment 31/36: Epochs=20, Optimizer=RMSprop, Batch Size=100


Accuracy: 0.9792, F1-macro: 0.9790

Experiment 32/36: Epochs=20, Optimizer=RMSprop, Batch Size=1000


Accuracy: 0.9784, F1-macro: 0.9783


Experiment 33/36: Epochs=20, Optimizer=RMSprop, Batch Size=5000


Accuracy: 0.9547, F1-macro: 0.9541

Experiment 34/36: Epochs=20, Optimizer=Adagrad, Batch Size=100


Accuracy: 0.9701, F1-macro: 0.9698


Experiment 35/36: Epochs=20, Optimizer=Adagrad, Batch Size=1000


Accuracy: 0.9246, F1-macro: 0.9234

Experiment 36/36: Epochs=20, Optimizer=Adagrad, Batch Size=5000


Accuracy: 0.8771, F1-macro: 0.8743


================================================================================
RESULTS SUMMARY
================================================================================

Detailed Results Table:


Epochs Optimizer Batch_Size Accuracy Precision_Micro Precision_Macro \
0 10 SGD 100 0.9383 0.9383 0.9377
1 10 SGD 1000 0.8451 0.8451 0.8464
2 10 SGD 5000 0.5078 0.5078 0.5526
3 10 Adam 100 0.9798 0.9798 0.9798
4 10 Adam 1000 0.9707 0.9707 0.9707
5 10 Adam 5000 0.9425 0.9425 0.9422
6 10 RMSprop 100 0.9811 0.9811 0.9810
7 10 RMSprop 1000 0.9718 0.9718 0.9718
8 10 RMSprop 5000 0.9326 0.9326 0.9331
9 10 Adagrad 100 0.9585 0.9585 0.9585
10 10 Adagrad 1000 0.9069 0.9069 0.9061
11 10 Adagrad 5000 0.8255 0.8255 0.8271
12 15 SGD 100 0.9492 0.9492 0.9490
13 15 SGD 1000 0.8796 0.8796 0.8783
14 15 SGD 5000 0.6714 0.6714 0.6806
15 15 Adam 100 0.9788 0.9788 0.9787
16 15 Adam 1000 0.9735 0.9735 0.9735
17 15 Adam 5000 0.9515 0.9515 0.9512
18 15 RMSprop 100 0.9791 0.9791 0.9792
19 15 RMSprop 1000 0.9762 0.9762 0.9762
20 15 RMSprop 5000 0.9518 0.9518 0.9513
21 15 Adagrad 100 0.9651 0.9651 0.9650
22 15 Adagrad 1000 0.9186 0.9186 0.9179
23 15 Adagrad 5000 0.8597 0.8597 0.8590
24 20 SGD 100 0.9543 0.9543 0.9539
25 20 SGD 1000 0.8960 0.8960 0.8948
26 20 SGD 5000 0.7641 0.7641 0.7724
27 20 Adam 100 0.9791 0.9791 0.9791
28 20 Adam 1000 0.9789 0.9789 0.9789
29 20 Adam 5000 0.9612 0.9612 0.9612
30 20 RMSprop 100 0.9792 0.9792 0.9792
31 20 RMSprop 1000 0.9784 0.9784 0.9785
32 20 RMSprop 5000 0.9547 0.9547 0.9542
33 20 Adagrad 100 0.9701 0.9701 0.9699
34 20 Adagrad 1000 0.9246 0.9246 0.9241
35 20 Adagrad 5000 0.8771 0.8771 0.8761

Recall_Micro Recall_Macro F1_Micro F1_Macro Train_Acc Val_Acc


0 0.9383 0.9375 0.9383 0.9375 0.9228 0.9513
1 0.8451 0.8413 0.8451 0.8409 0.7829 0.8710
2 0.5078 0.4957 0.5078 0.4319 0.4375 0.5245
3 0.9798 0.9796 0.9798 0.9797 0.9838 0.9808
4 0.9707 0.9704 0.9707 0.9705 0.9682 0.9765
5 0.9425 0.9417 0.9425 0.9418 0.9291 0.9560
6 0.9811 0.9811 0.9811 0.9811 0.9827 0.9802
7 0.9718 0.9715 0.9718 0.9716 0.9681 0.9770
8 0.9326 0.9317 0.9326 0.9316 0.9173 0.9452
9 0.9585 0.9580 0.9585 0.9581 0.9502 0.9690
10 0.9069 0.9053 0.9069 0.9052 0.8775 0.9202
11 0.8255 0.8220 0.8255 0.8212 0.7484 0.8460
12 0.9492 0.9486 0.9492 0.9487 0.9379 0.9602
13 0.8796 0.8775 0.8796 0.8773 0.8349 0.8965
14 0.6714 0.6623 0.6714 0.6367 0.5793 0.6862
15 0.9788 0.9787 0.9788 0.9787 0.9883 0.9810
16 0.9735 0.9732 0.9735 0.9733 0.9765 0.9788
17 0.9515 0.9509 0.9515 0.9510 0.9424 0.9638
18 0.9791 0.9789 0.9791 0.9790 0.9860 0.9820
19 0.9762 0.9760 0.9762 0.9760 0.9766 0.9783
20 0.9518 0.9515 0.9518 0.9513 0.9398 0.9620
21 0.9651 0.9649 0.9651 0.9649 0.9596 0.9737
22 0.9186 0.9174 0.9186 0.9174 0.8953 0.9313
23 0.8597 0.8567 0.8597 0.8564 0.8045 0.8818
24 0.9543 0.9538 0.9543 0.9537 0.9459 0.9653
25 0.8960 0.8943 0.8960 0.8942 0.8586 0.9113
26 0.7641 0.7567 0.7641 0.7479 0.6850 0.7875
27 0.9791 0.9789 0.9791 0.9789 0.9897 0.9807
28 0.9789 0.9786 0.9789 0.9787 0.9833 0.9805
29 0.9612 0.9608 0.9612 0.9609 0.9554 0.9695
30 0.9792 0.9790 0.9792 0.9790 0.9898 0.9833
31 0.9784 0.9783 0.9784 0.9783 0.9825 0.9808
32 0.9547 0.9544 0.9547 0.9541 0.9471 0.9653
33 0.9701 0.9698 0.9701 0.9698 0.9662 0.9760
34 0.9246 0.9234 0.9246 0.9234 0.9049 0.9373
35 0.8771 0.8745 0.8771 0.8743 0.8274 0.8977

------------------------------------------------------------
ACCURACY COMPARISON BY OPTIMIZER AND BATCH SIZE
------------------------------------------------------------
Epochs 10 15 20
Optimizer Batch_Size
Adagrad 100 0.9585 0.9651 0.9701
1000 0.9069 0.9186 0.9246
5000 0.8255 0.8597 0.8771
Adam 100 0.9798 0.9788 0.9791
1000 0.9707 0.9735 0.9789
5000 0.9425 0.9515 0.9612
RMSprop 100 0.9811 0.9791 0.9792
1000 0.9718 0.9762 0.9784
5000 0.9326 0.9518 0.9547
SGD 100 0.9383 0.9492 0.9543
1000 0.8451 0.8796 0.8960
5000 0.5078 0.6714 0.7641

------------------------------------------------------------
F1-MACRO COMPARISON BY OPTIMIZER AND BATCH SIZE
------------------------------------------------------------
Epochs 10 15 20
Optimizer Batch_Size
Adagrad 100 0.9581 0.9649 0.9698
1000 0.9052 0.9174 0.9234
5000 0.8212 0.8564 0.8743
Adam 100 0.9797 0.9787 0.9789
1000 0.9705 0.9733 0.9787
5000 0.9418 0.9510 0.9609
RMSprop 100 0.9811 0.9790 0.9790
1000 0.9716 0.9760 0.9783
5000 0.9316 0.9513 0.9541
SGD 100 0.9375 0.9487 0.9537
1000 0.8409 0.8773 0.8942
5000 0.4319 0.6367 0.7479

------------------------------------------------------------
TOP 5 BEST PERFORMING COMBINATIONS (by Accuracy)
------------------------------------------------------------
Epochs Optimizer Batch_Size Accuracy F1_Macro
6 10 RMSprop 100 0.9811 0.9811
3 10 Adam 100 0.9798 0.9797
30 20 RMSprop 100 0.9792 0.9790
18 15 RMSprop 100 0.9791 0.9790
27 20 Adam 100 0.9791 0.9789

------------------------------------------------------------
SUMMARY STATISTICS
------------------------------------------------------------
Mean Accuracy: 0.9176
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
/tmp/ipython-input-39-2888371127.py in <cell line: 0>()
206 print("-"*60)
207 print("Mean Accuracy:", df_results['Accuracy'].mean().round(4))
--> 208 print("Std Accuracy:", df_results['Accuracy'].std().round(4))
209 print("Best Accuracy:", df_results['Accuracy'].max().round(4))
210 print("Worst Accuracy:", df_results['Accuracy'].min().round(4))

AttributeError: 'float' object has no attribute 'round'

