AdaBoost
Boosting improves accuracy by combining several weak models into one strong model: their predictions are averaged for regression or combined by voting for classification, so the final model is more accurate than any single weak learner.
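As a minimal sketch of this idea, scikit-learn's AdaBoostClassifier can be trained in a few lines; a synthetic dataset stands in here for the project's dataset linked below.

    # Minimal sketch: AdaBoost combines many weak learners into one
    # stronger classifier via a weighted vote over their predictions.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    clf = AdaBoostClassifier(n_estimators=50, learning_rate=1.0, random_state=42)
    clf.fit(X_train, y_train)
    print("Test accuracy:", clf.score(X_test, y_test))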
Dataset Link
ADABOOST ALGORITHM AND ITS HYPERPARAMETERS
Steps (sketched in code below):
- Data cleaning: check for null values and drop them.
- Feature selection: select the important features based on their correlation with the target variable.
- Normalization of the data.
- Splitting of the dataset into train and test sets.

Hyperparameters:
- n_estimators: int, default=50. The number of weak learners to train iteratively.
- learning_rate: float, default=1.0. Controls the contribution of each classifier; there is a trade-off between learning_rate and n_estimators.
- random_state
- base_estimator: object, default=None.

Use GridSearchCV to tune these hyperparameters.
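A sketch of these preparation steps, assuming a pandas DataFrame loaded from a placeholder path "data.csv" with a numeric target column named "target" (both names, and the 0.1 correlation threshold, are assumptions, not from the source):

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler

    df = pd.read_csv("data.csv")  # placeholder path for the linked dataset

    # Data cleaning: check for null values, then drop them
    print(df.isnull().sum())
    df = df.dropna()

    # Feature selection: keep features strongly correlated with the target
    corr = df.corr(numeric_only=True)["target"].abs()
    selected = corr[corr > 0.1].index.drop("target")  # threshold is an assumption
    X, y = df[selected], df["target"]

    # Normalization of the data (in practice, fit the scaler on the
    # training split only, to avoid leakage into the test set)
    X = MinMaxScaler().fit_transform(X)

    # Split into train and test sets
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)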
Hyperparameters, run 1 (sweep the learning rate):
- n_estimators: int, variable
- learning_rate = 0.01, 0.1, 1
- random_state = 42
- base_estimator = Decision Tree

Hyperparameters, run 2 (sweep the number of estimators):
- n_estimators = 50, 500, 1000
- learning_rate = variable
- random_state = 42
- base_estimator = Decision Tree
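A sketch of these two tuning runs with GridSearchCV, reusing X_train and y_train from the split above; the value ranges filled in for the parameters marked "variable" are illustrative assumptions:

    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import GridSearchCV
    from sklearn.tree import DecisionTreeClassifier

    # Decision-tree base estimator (scikit-learn's own default is a depth-1 tree).
    # Older scikit-learn versions call this parameter base_estimator; newer
    # versions call it estimator.
    base = DecisionTreeClassifier(max_depth=1)

    # Run 1: learning_rate = 0.01, 0.1, 1; n_estimators variable (assumed range)
    grid1 = {"learning_rate": [0.01, 0.1, 1],
             "n_estimators": [50, 100, 200]}
    # Run 2: n_estimators = 50, 500, 1000; learning_rate variable (assumed range)
    grid2 = {"n_estimators": [50, 500, 1000],
             "learning_rate": [0.01, 0.1, 1]}

    for grid in (grid1, grid2):
        search = GridSearchCV(
            AdaBoostClassifier(estimator=base, random_state=42),
            grid, cv=5, scoring="accuracy")
        search.fit(X_train, y_train)
        print(search.best_params_, search.best_score_)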
Confusion Matrices
[Output figures: confusion matrices for the tuned models]
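A sketch of how such a confusion matrix can be computed and plotted for the tuned model, assuming the fitted search object and the test split from the steps above:

    import matplotlib.pyplot as plt
    from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

    y_pred = search.predict(X_test)
    cm = confusion_matrix(y_test, y_pred)
    print(cm)

    # Plot the matrix of true vs. predicted labels
    ConfusionMatrixDisplay(confusion_matrix=cm).plot()
    plt.show()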