Ex 5 - NN - Wheat Seed Data

In [1]: # importing the pandas and numpy libraries
import pandas as pd
import numpy as np

# loading the wheat seeds data into a dataframe
seeds_data = pd.read_csv('seeds.csv')

# displaying the first 5 rows of the wheat seeds data
seeds_data.head()

Out[1]:
     Area  Perimeter  Compactness  Kernel.Length  Kernel.Width  Asymmetry.Coeff  Kernel.Groove  Type
0   15.26      14.84       0.8710          5.763         3.312            2.221          5.220     1
1   14.88      14.57       0.8811          5.554         3.333            1.018          4.956     1
2   14.29      14.09       0.9050          5.291         3.337            2.699          4.825     1
3   13.84      13.94       0.8955          5.324         3.379            2.259          4.805     1
4   16.14      14.99       0.9034          5.658         3.562            1.355          5.175     1

In [2]: # Extracting the independent variables
X = seeds_data.loc[:, seeds_data.columns != 'Type']
X

Out[2]:
      Area  Perimeter  Compactness  Kernel.Length  Kernel.Width  Asymmetry.Coeff  Kernel.Groove
0    15.26      14.84       0.8710          5.763         3.312            2.221          5.220
1    14.88      14.57       0.8811          5.554         3.333            1.018          4.956
2    14.29      14.09       0.9050          5.291         3.337            2.699          4.825
3    13.84      13.94       0.8955          5.324         3.379            2.259          4.805
4    16.14      14.99       0.9034          5.658         3.562            1.355          5.175
..     ...        ...          ...            ...           ...              ...            ...
194  12.19      13.20       0.8783          5.137         2.981            3.631          4.870
195  11.23      12.88       0.8511          5.140         2.795            4.325          5.003
196  13.20      13.66       0.8883          5.236         3.232            8.315          5.056
197  11.84      13.21       0.8521          5.175         2.836            3.598          5.044
198  12.30      13.34       0.8684          5.243         2.974            5.637          5.063

199 rows × 7 columns

In [3]: # Extracting the target variable
Y = seeds_data.loc[:, seeds_data.columns == 'Type']
Y

Out[3]:
     Type
0       1
1       1
2       1
3       1
4       1
..    ...
194     3
195     3
196     3
197     3
198     3

199 rows × 1 columns

Split Data for training and testing


In [4]: from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, Y,
                                                     test_size = 0.2,
                                                     random_state = 523)

Training the Perceptron Classifier


In [5]: # importing the Perceptron class
from sklearn.linear_model import Perceptron

# Creating an instance of the Perceptron class
perc = Perceptron(random_state = 15)

# Training the perceptron classifier
perc.fit(X_train, np.ravel(y_train))

# importing the accuracy metric for evaluating the perceptron classifier
from sklearn.metrics import accuracy_score

# Using the perceptron classifier to make predictions on the test data
pred_test = perc.predict(X_test)

# calculating and displaying the accuracy score of the Perceptron classifier
accuracy = accuracy_score(y_test, pred_test)
print('% of Accuracy using Linear Perceptron: ', accuracy * 100)

% of Accuracy using Linear Perceptron: 67.5

The correlation between two variables can be positive, negative, or close to zero (no correlation). The correlation matrix below shows how strongly each attribute relates to the target variable.

In [6]: # Importing plotly.express
import plotly.express as px

# Finding the pairwise correlations between all attributes, including the target variable
corr = seeds_data.corr()
corr = corr.round(2)
corr

Out[6]:
                  Area  Perimeter  Compactness  Kernel.Length  Kernel.Width  Asymmetry.Coeff  Kernel.Groove   Type
Area              1.00       0.99         0.61           0.95          0.97            -0.22           0.86  -0.34
Perimeter         0.99       1.00         0.53           0.97          0.95            -0.21           0.89  -0.32
Compactness       0.61       0.53         1.00           0.37          0.76            -0.33           0.23  -0.54
Kernel.Length     0.95       0.97         0.37           1.00          0.86            -0.17           0.93  -0.25
Kernel.Width      0.97       0.95         0.76           0.86          1.00            -0.25           0.75  -0.42
Asymmetry.Coeff  -0.22      -0.21        -0.33          -0.17         -0.25             1.00          -0.00   0.57
Kernel.Groove     0.86       0.89         0.23           0.93          0.75            -0.00           1.00   0.04
Type             -0.34      -0.32        -0.54          -0.25         -0.42             0.57           0.04   1.00

In [7]: # displaying the correlation matrix as a heatmap
fig = px.imshow(corr,
                width = 700,
                height = 700,
                text_auto = True,
                color_continuous_scale = 'tealgrn',
               )
fig.show()

[Heatmap of the correlation matrix rendered by plotly (color scale 'tealgrn'); the cell values match the table in Out[6].]
It can be observed that the attribute "Kernel.Groove" has the weakest correlation (0.04) with the target variable "Type".
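As an aside (not in the original notebook), the same observation can be read off the correlation matrix programmatically; the sketch below assumes the corr DataFrame computed in cell In [6].

# sketch: pick the feature whose correlation with 'Type' has the smallest magnitude
target_corr = corr['Type'].drop('Type')        # correlation of each attribute with the target
weakest_feature = target_corr.abs().idxmin()   # 'Kernel.Groove' for this dataset
print(weakest_feature, target_corr[weakest_feature])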

In [8]: # removing the Kernel.Groove attribute from X
X = X.loc[:, X.columns != 'Kernel.Groove']
X

Out[8]:
      Area  Perimeter  Compactness  Kernel.Length  Kernel.Width  Asymmetry.Coeff
0    15.26      14.84       0.8710          5.763         3.312            2.221
1    14.88      14.57       0.8811          5.554         3.333            1.018
2    14.29      14.09       0.9050          5.291         3.337            2.699
3    13.84      13.94       0.8955          5.324         3.379            2.259
4    16.14      14.99       0.9034          5.658         3.562            1.355
..     ...        ...          ...            ...           ...              ...
194  12.19      13.20       0.8783          5.137         2.981            3.631
195  11.23      12.88       0.8511          5.140         2.795            4.325
196  13.20      13.66       0.8883          5.236         3.232            8.315
197  11.84      13.21       0.8521          5.175         2.836            3.598
198  12.30      13.34       0.8684          5.243         2.974            5.637

199 rows × 6 columns

Resplitting Data for training and testing

In [9]: X_train, X_test, y_train, y_test = train_test_split(X, Y,
                                                            test_size = 0.2,
                                                            random_state = 523)

Retraining the Perceptron Classifier

In [10]: # retraining the perceptron classifier on the reduced feature set
perc.fit(X_train, np.ravel(y_train))

# Using the perceptron classifier to make predictions on the test data
pred_test = perc.predict(X_test)

# calculating and displaying the accuracy score of the Perceptron classifier
accuracy = accuracy_score(y_test, pred_test)
print('% of Accuracy using Linear Perceptron: ', accuracy * 100)

% of Accuracy using Linear Perceptron: 75.0

Install scikit-neuralnetwork

In [1]: # scikit-neuralnetwork works with scikit-learn 0.18 and above

# installing scikit-neuralnetwork if not already installed
!pip install scikit-neuralnetwork

Processing c:\users\gurram\appdata\local\pip\cache\wheels\7d\42\93\b99bd6392fb56ec7831a6
95cb7a23dd9c73382b258614b62ed\scikit_neuralnetwork-0.7-py3-none-any.whl
Processing c:\users\gurram\appdata\local\pip\cache\wheels\a3\72\b6\89bbeb6140ee3756fa2bd
d2fb03003dd60d289851314b35fd7\lasagne-0.1-py3-none-any.whl
Processing c:\users\gurram\appdata\local\pip\cache\wheels\26\68\6f\745330367ce7822fe0cd8
63712858151f5723a0a5e322cc144\theano-1.0.5-py3-none-any.whl
Requirement already satisfied: colorama in d:\anaconda\lib\site-packages (from scikit-ne
uralnetwork) (0.4.3)
Requirement already satisfied: scikit-learn>=0.17 in c:\users\gurram\appdata\roaming\pyt
hon\python37\site-packages (from scikit-neuralnetwork) (1.0.2)
Requirement already satisfied: numpy in d:\anaconda\lib\site-packages (from Lasagne>=0.1
->scikit-neuralnetwork) (1.18.1)
Requirement already satisfied: scipy>=0.14 in c:\users\gurram\appdata\roaming\python\pyt
hon37\site-packages (from Theano>=0.8->scikit-neuralnetwork) (1.7.3)
Requirement already satisfied: six>=1.9.0 in d:\anaconda\lib\site-packages (from Theano>
=0.8->scikit-neuralnetwork) (1.14.0)
Requirement already satisfied: joblib>=0.11 in d:\anaconda\lib\site-packages (from sciki
t-learn>=0.17->scikit-neuralnetwork) (0.14.1)
Collecting threadpoolctl>=2.0.0
Downloading threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Installing collected packages: Lasagne, Theano, scikit-neuralnetwork, threadpoolctl
Successfully installed Lasagne-0.1 Theano-1.0.5 scikit-neuralnetwork-0.7 threadpoolctl-
3.1.0

Training the Multilayer Perceptron Classifier using the Backpropagation algorithm

In [12]: # importing the required library
import sklearn.neural_network as nn

# Creating an instance of the MLPClassifier class
# Taking the maximum number of iterations = 1000
# constructing an MLP network with 3 hidden layers:
#   100 neurons in hidden layer 1,
#   75 neurons in hidden layer 2,
#   50 neurons in hidden layer 3
mlp = nn.MLPClassifier(random_state = 560,
                       hidden_layer_sizes = [100, 75, 50],
                       max_iter = 1000)

In [14]: # Training the MLP classifier
mlp.fit(X_train, np.ravel(y_train))

# Using the MLP classifier to make predictions on the test data
pred_test = mlp.predict(X_test)

# calculating and displaying the accuracy score of the MLP classifier
mlp_accuracy = accuracy_score(y_test, pred_test)
print('% of Accuracy using MultiLayer Perceptron: ', "{0:0.2f}".format(mlp_accuracy * 100))

% of Accuracy using MultiLayer Perceptron: 87.50
