Machine Learning Group Project

Uploaded by Hassen Mhd

# GROUP MEMBERS AND IDs

13/15 + 13/15 = 26/30

#SAMUEL AMSALU    0448/13    21.5+31.5+26 = 79+2
#NATNAEL HABTE    0593/13    22+30.5+26 = 78.5+2
#BERHANU BASHAW   0130/13    28+30+26 = 84+2
#TEDDY GETANEH    0051/13    30+35+26 = 91+2
#ABRHAM TAREKEGN  1600396    24+25.5+26 = 75.5+2

import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

Name = pd.read_csv("Name.csv")

Name

Samuel Natnael Berhanu Teddy Abrham

0 5.1 3.5 1.4 0.2 Iris-setosa

1 4.9 NaN 1.4 0.2 Iris-setosa

2 4.7 3.2 NaN 0.2 Iris-setosa

3 4.6 NaN NaN 0.2 Iris-setosa

4 5.0 3.6 1.4 0.2 Iris-setosa

... ... ... ... ... ...

145 NaN NaN 5.2 2.3 Iris-virginica

146 6.3 2.5 5.0 1.9 Iris-virginica

147 NaN 3.0 5.2 2.0 Iris-virginica

148 6.2 3.4 5.4 2.3 Iris-virginica

149 5.9 NaN 5.1 1.8 Iris-virginica

150 rows × 5 columns

# Write the code to show the number of rows and columns of the dataset
print("\n", Name.shape)
(150, 5)

# Write the code to read the first five records of the dataset
print("\n", Name.head())
Samuel Natnael Berhanu Teddy Abrham
0 5.1 3.5 1.4 0.2 Iris-setosa
1 4.9 NaN 1.4 0.2 Iris-setosa
2 4.7 3.2 NaN 0.2 Iris-setosa
3 4.6 NaN NaN 0.2 Iris-setosa
4 5.0 3.6 1.4 0.2 Iris-setosa

# Write the code to read the last five records of the dataset
print("\n", Name.tail())
Samuel Natnael Berhanu Teddy Abrham
145 NaN NaN 5.2 2.3 Iris-virginica
146 6.3 2.5 5.0 1.9 Iris-virginica
147 NaN 3.0 5.2 2.0 Iris-virginica
148 6.2 3.4 5.4 2.3 Iris-virginica
149 5.9 NaN 5.1 1.8 Iris-virginica

# Write the code to display the first ten records of the dataset
print("\n", Name[:10])
Samuel Natnael Berhanu Teddy Abrham
0 5.1 3.5 1.4 0.2 Iris-setosa
1 4.9 NaN 1.4 0.2 Iris-setosa
2 4.7 3.2 NaN 0.2 Iris-setosa
3 4.6 NaN NaN 0.2 Iris-setosa
4 5.0 3.6 1.4 0.2 Iris-setosa
5 5.4 3.9 1.7 0.4 Iris-setosa
6 4.6 3.4 1.4 0.3 Iris-setosa
7 5.0 3.4 1.5 0.2 Iris-setosa
8 4.4 2.9 1.4 0.2 Iris-setosa
9 4.9 3.1 1.5 0.1 Iris-setosa
# Write the code to display the count, mean, standard deviation, minimum, 1st quartile,
# 2nd quartile (median), 3rd quartile and the maximum value of the dataset
print("\n", Name.describe())

Samuel Natnael Berhanu Teddy


count 148.000000 146.000000 148.000000 150.000000
mean 5.833108 3.054795 3.790541 1.198667
std 0.828850 0.439448 1.754618 0.763161
min 4.300000 2.000000 1.000000 0.100000
25% 5.100000 2.800000 1.600000 0.300000
50% 5.800000 3.000000 4.400000 1.300000
75% 6.400000 3.300000 5.100000 1.800000
max 7.900000 4.400000 6.900000 2.500000

# Write the code that displays the count of each distinct value of a feature of the dataset
print("\n", Name['Teddy'].value_counts())
Teddy
0.2 28
1.3 13
1.8 12
1.5 12
1.4 8
2.3 8
1.0 7
0.4 7
0.3 7
0.1 6
2.1 6
2.0 6
1.2 5
1.9 5
1.6 4
2.5 3
2.2 3
2.4 3
1.1 3
1.7 2
0.6 1
0.5 1
Name: count, dtype: int64
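The tally above counts the distinct values of the numeric `Teddy` feature. To count the records per class of the dataset, the same `value_counts` call can be pointed at the class-label column (`Abrham` in this dataset). A minimal sketch on a small hypothetical frame reusing the notebook's column names:

```python
import pandas as pd

# Small hypothetical frame mirroring the notebook's column names
df = pd.DataFrame({
    "Teddy": [0.2, 0.2, 1.3, 1.8],
    "Abrham": ["Iris-setosa", "Iris-setosa",
               "Iris-versicolor", "Iris-virginica"],
})

# Count the records per class label rather than per numeric value
class_counts = df["Abrham"].value_counts()
print(class_counts)
```

On the full dataset this reports the size of each of the three Iris species (50 records each, as the class-label output below shows).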

# Extract the independent features from the dataset (all except the class label)
print(Name.iloc[:, :-1].values)
[[5.1 3.5 1.4 0.2]
[4.9 nan 1.4 0.2]
[4.7 3.2 nan 0.2]
[4.6 nan nan 0.2]
[5. 3.6 1.4 0.2]
[5.4 3.9 1.7 0.4]
[4.6 3.4 1.4 0.3]
[5. 3.4 1.5 0.2]
[4.4 2.9 1.4 0.2]
[4.9 3.1 1.5 0.1]
[5.4 3.7 1.5 0.2]
[4.8 3.4 1.6 0.2]
[4.8 3. 1.4 0.1]
[4.3 3. 1.1 0.1]
[5.8 4. 1.2 0.2]
[5.7 4.4 1.5 0.4]
[5.4 3.9 1.3 0.4]
[5.1 3.5 1.4 0.3]
[5.7 3.8 1.7 0.3]
[5.1 3.8 1.5 0.3]
[5.4 3.4 1.7 0.2]
[5.1 3.7 1.5 0.4]
[4.6 3.6 1. 0.2]
[5.1 3.3 1.7 0.5]
[4.8 3.4 1.9 0.2]
[5. 3. 1.6 0.2]
[5. 3.4 1.6 0.4]
[5.2 3.5 1.5 0.2]
[5.2 3.4 1.4 0.2]
[4.7 3.2 1.6 0.2]
[4.8 3.1 1.6 0.2]
[5.4 3.4 1.5 0.4]
[5.2 4.1 1.5 0.1]
[5.5 4.2 1.4 0.2]
[4.9 3.1 1.5 0.1]
[5. 3.2 1.2 0.2]
[5.5 3.5 1.3 0.2]
[4.9 3.1 1.5 0.1]
[4.4 3. 1.3 0.2]
[5.1 3.4 1.5 0.2]
[5. 3.5 1.3 0.3]
[4.5 2.3 1.3 0.3]
[4.4 3.2 1.3 0.2]
[5. 3.5 1.6 0.6]
[5.1 3.8 1.9 0.4]
[4.8 3. 1.4 0.3]
[5.1 3.8 1.6 0.2]
[4.6 3.2 1.4 0.2]
[5.3 3.7 1.5 0.2]
[5. 3.3 1.4 0.2]
[7. 3.2 4.7 1.4]
[6.4 3.2 4.5 1.5]
[6.9 3.1 4.9 1.5]
[5.5 2.3 4. 1.3]
[6.5 2.8 4.6 1.5]
[5.7 2.8 4.5 1.3]
[6.3 3.3 4.7 1.6]
[4.9 2.4 3.3 1. ]
[6.6 2.9 4.6 1.3]
[5.2 2.7 3.9 1.4]
[5. 2. 3.5 1. ]
[5.9 3. 4.2 1.5]
[6. 2.2 4. 1. ]
[6.1 2.9 4.7 1.4]
[5.6 2.9 3.6 1.3]
[6.7 3.1 4.4 1.4]
[5.6 3. 4.5 1.5]
[5.8 2.7 4.1 1. ]
[6.2 2.2 4.5 1.5]
[5.6 2.5 3.9 1.1]
[5.9 3.2 4.8 1.8]
[6.1 2.8 4. 1.3]
[6.3 2.5 4.9 1.5]
[6.1 2.8 4.7 1.2]
[6.4 2.9 4.3 1.3]
[6.6 3. 4.4 1.4]
[6.8 2.8 4.8 1.4]
[6.7 3. 5. 1.7]
[6. 2.9 4.5 1.5]
[5.7 2.6 3.5 1. ]
[5.5 2.4 3.8 1.1]
[5.5 2.4 3.7 1. ]
[5.8 2.7 3.9 1.2]
[6. 2.7 5.1 1.6]
[5.4 3. 4.5 1.5]
[6. 3.4 4.5 1.6]
[6.7 3.1 4.7 1.5]
[6.3 2.3 4.4 1.3]
[5.6 3. 4.1 1.3]
[5.5 2.5 4. 1.3]
[5.5 2.6 4.4 1.2]
[6.1 3. 4.6 1.4]
[5.8 2.6 4. 1.2]
[5. 2.3 3.3 1. ]
[5.6 2.7 4.2 1.3]
[5.7 3. 4.2 1.2]
[5.7 2.9 4.2 1.3]
[6.2 2.9 4.3 1.3]
[5.1 2.5 3. 1.1]
[5.7 2.8 4.1 1.3]
[6.3 3.3 6. 2.5]
[5.8 2.7 5.1 1.9]
[7.1 3. 5.9 2.1]
[6.3 2.9 5.6 1.8]
[6.5 3. 5.8 2.2]
[7.6 3. 6.6 2.1]
[4.9 2.5 4.5 1.7]
[7.3 2.9 6.3 1.8]
[6.7 2.5 5.8 1.8]
[7.2 3.6 6.1 2.5]
[6.5 3.2 5.1 2. ]
[6.4 2.7 5.3 1.9]
[6.8 3. 5.5 2.1]
[5.7 2.5 5. 2. ]
[5.8 2.8 5.1 2.4]
[6.4 3.2 5.3 2.3]
[6.5 3. 5.5 1.8]
[7.7 3.8 6.7 2.2]
[7.7 2.6 6.9 2.3]
[6. 2.2 5. 1.5]
[6.9 3.2 5.7 2.3]
[5.6 2.8 4.9 2. ]
[7.7 2.8 6.7 2. ]
[6.3 2.7 4.9 1.8]
[6.7 3.3 5.7 2.1]
[7.2 3.2 6. 1.8]
[6.2 2.8 4.8 1.8]
[6.1 3. 4.9 1.8]
[6.4 2.8 5.6 2.1]
[7.2 3. 5.8 1.6]
[7.4 2.8 6.1 1.9]
[7.9 3.8 6.4 2. ]
[6.4 2.8 5.6 2.2]
[6.3 2.8 5.1 1.5]
[6.1 2.6 5.6 1.4]
[7.7 3. 6.1 2.3]
[6.3 3.4 5.6 2.4]
[6.4 3.1 5.5 1.8]
[6. 3. 4.8 1.8]
[6.9 3.1 5.4 2.1]
[6.7 3.1 5.6 2.4]
[6.9 3.1 5.1 2.3]
[5.8 2.7 5.1 1.9]
[6.8 3.2 5.9 2.3]
[6.7 3.3 5.7 2.5]
[nan nan 5.2 2.3]
[6.3 2.5 5. 1.9]
[nan 3. 5.2 2. ]
[6.2 3.4 5.4 2.3]
[5.9 nan 5.1 1.8]]

# Extract the dependent feature from the dataset (the class label)
print(Name.iloc[:, 4].values)

['Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'


'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa' 'Iris-setosa'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor' 'Iris-versicolor'
'Iris-versicolor' 'Iris-versicolor' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica'
'Iris-virginica' 'Iris-virginica' 'Iris-virginica' 'Iris-virginica']

# Generate a histogram for each feature of the dataset
Name.hist()

array([[<Axes: title={'center': 'Samuel'}>,


<Axes: title={'center': 'Natnael'}>],
[<Axes: title={'center': 'Berhanu'}>,
<Axes: title={'center': 'Teddy'}>]], dtype=object)
# Generate a density plot for each feature of the dataset
Name.plot(kind='density', subplots=True, layout=(2,2), sharex=False)
plt.show()

# Generate a box plot for each feature of the dataset
Name.plot(kind='box', subplots=True, layout=(2,2), sharex=False)
plt.show()
from sklearn.preprocessing import MinMaxScaler
from sklearn.preprocessing import StandardScaler
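StandardScaler is imported here but never applied later in the notebook. As a sketch only (not part of the project's code), the z-score standardization it performs — subtract each column's mean and divide by its population standard deviation (ddof=0, as scikit-learn does) — can be written with plain NumPy on a hypothetical feature matrix:

```python
import numpy as np

# Hypothetical feature matrix standing in for the Iris measurements
X = np.array([[5.1, 3.5],
              [4.9, 3.0],
              [5.0, 3.4]])

# Z-score standardization: this is what StandardScaler computes per
# column, using the population standard deviation (ddof=0)
scaled = (X - X.mean(axis=0)) / X.std(axis=0)
print(scaled.round(3))
```

After this transform each column has mean 0 and standard deviation 1, unlike Min-Max normalization, which maps each column onto [0, 1].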

# Values in each feature of the dataset were deliberately replaced with missing values (NaN)
print("\n", Name.head())

Samuel Natnael Berhanu Teddy Abrham


0 5.1 3.5 1.4 0.2 Iris-setosa
1 4.9 NaN 1.4 0.2 Iris-setosa
2 4.7 3.2 NaN 0.2 Iris-setosa
3 4.6 NaN NaN 0.2 Iris-setosa
4 5.0 3.6 1.4 0.2 Iris-setosa

# Impute the missing values with the mean of the corresponding feature
# (SimpleImputer computes the mean of each feature during fit)
data1 = Name.iloc[:, :-1].values
imp_mean = SimpleImputer(missing_values=np.nan, strategy='mean')
imp_mean.fit(data1)
data2 = imp_mean.transform(data1)
print("\n\n\n", data2)

[[5.1 3.5 1.4 0.2 ]


[4.9 3.05479452 1.4 0.2 ]
[4.7 3.2 3.79054054 0.2 ]
[4.6 3.05479452 3.79054054 0.2 ]
[5. 3.6 1.4 0.2 ]
[5.4 3.9 1.7 0.4 ]
[4.6 3.4 1.4 0.3 ]
[5. 3.4 1.5 0.2 ]
[4.4 2.9 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5.4 3.7 1.5 0.2 ]
[4.8 3.4 1.6 0.2 ]
[4.8 3. 1.4 0.1 ]
[4.3 3. 1.1 0.1 ]
[5.8 4. 1.2 0.2 ]
[5.7 4.4 1.5 0.4 ]
[5.4 3.9 1.3 0.4 ]
[5.1 3.5 1.4 0.3 ]
[5.7 3.8 1.7 0.3 ]
[5.1 3.8 1.5 0.3 ]
[5.4 3.4 1.7 0.2 ]
[5.1 3.7 1.5 0.4 ]
[4.6 3.6 1. 0.2 ]
[5.1 3.3 1.7 0.5 ]
[4.8 3.4 1.9 0.2 ]
[5. 3. 1.6 0.2 ]
[5. 3.4 1.6 0.4 ]
[5.2 3.5 1.5 0.2 ]
[5.2 3.4 1.4 0.2 ]
[4.7 3.2 1.6 0.2 ]
[4.8 3.1 1.6 0.2 ]
[5.4 3.4 1.5 0.4 ]
[5.2 4.1 1.5 0.1 ]
[5.5 4.2 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5. 3.2 1.2 0.2 ]
[5.5 3.5 1.3 0.2 ]
[4.9 3.1 1.5 0.1 ]
[4.4 3. 1.3 0.2 ]
[5.1 3.4 1.5 0.2 ]
[5. 3.5 1.3 0.3 ]
[4.5 2.3 1.3 0.3 ]
[4.4 3.2 1.3 0.2 ]
[5. 3.5 1.6 0.6 ]
[5.1 3.8 1.9 0.4 ]
[4.8 3. 1.4 0.3 ]
[5.1 3.8 1.6 0.2 ]
[4.6 3.2 1.4 0.2 ]
[5.3 3.7 1.5 0.2 ]
[5. 3.3 1.4 0.2 ]
[7. 3.2 4.7 1.4 ]
[6.4 3.2 4.5 1.5 ]
[6.9 3.1 4.9 1.5 ]
[5.5 2.3 4. 1.3 ]
[6.5 2.8 4.6 1.5 ]
[5.7 2.8 4.5 1.3 ]
[6.3 3.3 4.7 1.6 ]
[4.9 2.4 3.3 1. ]
[6.6 2.9 4.6 1.3 ]
[5.2 2.7 3.9 1.4 ]
[5. 2. 3.5 1. ]
[5.9 3. 4.2 1.5 ]
[6. 2.2 4. 1. ]
[6.1 2.9 4.7 1.4 ]
[5.6 2.9 3.6 1.3 ]
[6.7 3.1 4.4 1.4 ]
[5.6 3. 4.5 1.5 ]
[5.8 2.7 4.1 1. ]
[6.2 2.2 4.5 1.5 ]
[5.6 2.5 3.9 1.1 ]
[5.9 3.2 4.8 1.8 ]
[6.1 2.8 4. 1.3 ]
[6.3 2.5 4.9 1.5 ]
[6.1 2.8 4.7 1.2 ]
[6.4 2.9 4.3 1.3 ]
[6.6 3. 4.4 1.4 ]
[6.8 2.8 4.8 1.4 ]
[6.7 3. 5. 1.7 ]
[6. 2.9 4.5 1.5 ]
[5.7 2.6 3.5 1. ]
[5.5 2.4 3.8 1.1 ]
[5.5 2.4 3.7 1. ]
[5.8 2.7 3.9 1.2 ]
[6. 2.7 5.1 1.6 ]
[5.4 3. 4.5 1.5 ]
[6. 3.4 4.5 1.6 ]
[6.7 3.1 4.7 1.5 ]
[6.3 2.3 4.4 1.3 ]
[5.6 3. 4.1 1.3 ]
[5.5 2.5 4. 1.3 ]
[5.5 2.6 4.4 1.2 ]
[6.1 3. 4.6 1.4 ]
[5.8 2.6 4. 1.2 ]
[5. 2.3 3.3 1. ]
[5.6 2.7 4.2 1.3 ]
[5.7 3. 4.2 1.2 ]
[5.7 2.9 4.2 1.3 ]
[6.2 2.9 4.3 1.3 ]
[5.1 2.5 3. 1.1 ]
[5.7 2.8 4.1 1.3 ]
[6.3 3.3 6. 2.5 ]
[5.8 2.7 5.1 1.9 ]
[7.1 3. 5.9 2.1 ]
[6.3 2.9 5.6 1.8 ]
[6.5 3. 5.8 2.2 ]
[7.6 3. 6.6 2.1 ]
[4.9 2.5 4.5 1.7 ]
[7.3 2.9 6.3 1.8 ]
[6.7 2.5 5.8 1.8 ]
[7.2 3.6 6.1 2.5 ]
[6.5 3.2 5.1 2. ]
[6.4 2.7 5.3 1.9 ]
[6.8 3. 5.5 2.1 ]
[5.7 2.5 5. 2. ]
[5.8 2.8 5.1 2.4 ]
[6.4 3.2 5.3 2.3 ]
[6.5 3. 5.5 1.8 ]
[7.7 3.8 6.7 2.2 ]
[7.7 2.6 6.9 2.3 ]
[6. 2.2 5. 1.5 ]
[6.9 3.2 5.7 2.3 ]
[5.6 2.8 4.9 2. ]
[7.7 2.8 6.7 2. ]
[6.3 2.7 4.9 1.8 ]
[6.7 3.3 5.7 2.1 ]
[7.2 3.2 6. 1.8 ]
[6.2 2.8 4.8 1.8 ]
[6.1 3. 4.9 1.8 ]
[6.4 2.8 5.6 2.1 ]
[7.2 3. 5.8 1.6 ]
[7.4 2.8 6.1 1.9 ]
[7.9 3.8 6.4 2. ]
[6.4 2.8 5.6 2.2 ]
[6.3 2.8 5.1 1.5 ]
[6.1 2.6 5.6 1.4 ]
[7.7 3. 6.1 2.3 ]
[6.3 3.4 5.6 2.4 ]
[6.4 3.1 5.5 1.8 ]
[6. 3. 4.8 1.8 ]
[6.9 3.1 5.4 2.1 ]
[6.7 3.1 5.6 2.4 ]
[6.9 3.1 5.1 2.3 ]
[5.8 2.7 5.1 1.9 ]
[6.8 3.2 5.9 2.3 ]
[6.7 3.3 5.7 2.5 ]
[5.83310811 3.05479452 5.2 2.3 ]
[6.3 2.5 5. 1.9 ]
[5.83310811 3. 5.2 2. ]
[6.2 3.4 5.4 2.3 ]
[5.9 3.05479452 5.1 1.8 ]]

# Round the imputed values to 2 decimal places
data2 = data2.round(2)
print(data2)

[[5.1 3.5 1.4 0.2 ]


[4.9 3.05 1.4 0.2 ]
[4.7 3.2 3.79 0.2 ]
[4.6 3.05 3.79 0.2 ]
[5. 3.6 1.4 0.2 ]
[5.4 3.9 1.7 0.4 ]
[4.6 3.4 1.4 0.3 ]
[5. 3.4 1.5 0.2 ]
[4.4 2.9 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5.4 3.7 1.5 0.2 ]
[4.8 3.4 1.6 0.2 ]
[4.8 3. 1.4 0.1 ]
[4.3 3. 1.1 0.1 ]
[5.8 4. 1.2 0.2 ]
[5.7 4.4 1.5 0.4 ]
[5.4 3.9 1.3 0.4 ]
[5.1 3.5 1.4 0.3 ]
[5.7 3.8 1.7 0.3 ]
[5.1 3.8 1.5 0.3 ]
[5.4 3.4 1.7 0.2 ]
[5.1 3.7 1.5 0.4 ]
[4.6 3.6 1. 0.2 ]
[5.1 3.3 1.7 0.5 ]
[4.8 3.4 1.9 0.2 ]
[5. 3. 1.6 0.2 ]
[5. 3.4 1.6 0.4 ]
[5.2 3.5 1.5 0.2 ]
[5.2 3.4 1.4 0.2 ]
[4.7 3.2 1.6 0.2 ]
[4.8 3.1 1.6 0.2 ]
[5.4 3.4 1.5 0.4 ]
[5.2 4.1 1.5 0.1 ]
[5.5 4.2 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5. 3.2 1.2 0.2 ]
[5.5 3.5 1.3 0.2 ]
[4.9 3.1 1.5 0.1 ]
[4.4 3. 1.3 0.2 ]
[5.1 3.4 1.5 0.2 ]
[5. 3.5 1.3 0.3 ]
[4.5 2.3 1.3 0.3 ]
[4.4 3.2 1.3 0.2 ]
[5. 3.5 1.6 0.6 ]
[5.1 3.8 1.9 0.4 ]
[4.8 3. 1.4 0.3 ]
[5.1 3.8 1.6 0.2 ]
[4.6 3.2 1.4 0.2 ]
[5.3 3.7 1.5 0.2 ]
[5. 3.3 1.4 0.2 ]
[7. 3.2 4.7 1.4 ]
[6.4 3.2 4.5 1.5 ]
[6.9 3.1 4.9 1.5 ]
[5.5 2.3 4. 1.3 ]
[6.5 2.8 4.6 1.5 ]
[5.7 2.8 4.5 1.3 ]
[6.3 3.3 4.7 1.6 ]
[4.9 2.4 3.3 1. ]
[6.6 2.9 4.6 1.3 ]
[5.2 2.7 3.9 1.4 ]
[5. 2. 3.5 1. ]
[5.9 3. 4.2 1.5 ]
[6. 2.2 4. 1. ]
[6.1 2.9 4.7 1.4 ]
[5.6 2.9 3.6 1.3 ]
[6.7 3.1 4.4 1.4 ]
[5.6 3. 4.5 1.5 ]
[5.8 2.7 4.1 1. ]
[6.2 2.2 4.5 1.5 ]
[5.6 2.5 3.9 1.1 ]
[5.9 3.2 4.8 1.8 ]
[6.1 2.8 4. 1.3 ]
[6.3 2.5 4.9 1.5 ]
[6.1 2.8 4.7 1.2 ]
[6.4 2.9 4.3 1.3 ]
[6.6 3. 4.4 1.4 ]
[6.8 2.8 4.8 1.4 ]
[6.7 3. 5. 1.7 ]
[6. 2.9 4.5 1.5 ]
[5.7 2.6 3.5 1. ]
[5.5 2.4 3.8 1.1 ]
[5.5 2.4 3.7 1. ]
[5.8 2.7 3.9 1.2 ]
[6. 2.7 5.1 1.6 ]
[5.4 3. 4.5 1.5 ]
[6. 3.4 4.5 1.6 ]
[6.7 3.1 4.7 1.5 ]
[6.3 2.3 4.4 1.3 ]
[5.6 3. 4.1 1.3 ]
[5.5 2.5 4. 1.3 ]
[5.5 2.6 4.4 1.2 ]
[6.1 3. 4.6 1.4 ]
[5.8 2.6 4. 1.2 ]
[5. 2.3 3.3 1. ]
[5.6 2.7 4.2 1.3 ]
[5.7 3. 4.2 1.2 ]
[5.7 2.9 4.2 1.3 ]
[6.2 2.9 4.3 1.3 ]
[5.1 2.5 3. 1.1 ]
[5.7 2.8 4.1 1.3 ]
[6.3 3.3 6. 2.5 ]
[5.8 2.7 5.1 1.9 ]
[7.1 3. 5.9 2.1 ]
[6.3 2.9 5.6 1.8 ]
[6.5 3. 5.8 2.2 ]
[7.6 3. 6.6 2.1 ]
[4.9 2.5 4.5 1.7 ]
[7.3 2.9 6.3 1.8 ]
[6.7 2.5 5.8 1.8 ]
[7.2 3.6 6.1 2.5 ]
[6.5 3.2 5.1 2. ]
[6.4 2.7 5.3 1.9 ]
[6.8 3. 5.5 2.1 ]
[5.7 2.5 5. 2. ]
[5.8 2.8 5.1 2.4 ]
[6.4 3.2 5.3 2.3 ]
[6.5 3. 5.5 1.8 ]
[7.7 3.8 6.7 2.2 ]
[7.7 2.6 6.9 2.3 ]
[6. 2.2 5. 1.5 ]
[6.9 3.2 5.7 2.3 ]
[5.6 2.8 4.9 2. ]
[7.7 2.8 6.7 2. ]
[6.3 2.7 4.9 1.8 ]
[6.7 3.3 5.7 2.1 ]
[7.2 3.2 6. 1.8 ]
[6.2 2.8 4.8 1.8 ]
[6.1 3. 4.9 1.8 ]
[6.4 2.8 5.6 2.1 ]
[7.2 3. 5.8 1.6 ]
[7.4 2.8 6.1 1.9 ]
[7.9 3.8 6.4 2. ]
[6.4 2.8 5.6 2.2 ]
[6.3 2.8 5.1 1.5 ]
[6.1 2.6 5.6 1.4 ]
[7.7 3. 6.1 2.3 ]
[6.3 3.4 5.6 2.4 ]
[6.4 3.1 5.5 1.8 ]
[6. 3. 4.8 1.8 ]
[6.9 3.1 5.4 2.1 ]
[6.7 3.1 5.6 2.4 ]
[6.9 3.1 5.1 2.3 ]
[5.8 2.7 5.1 1.9 ]
[6.8 3.2 5.9 2.3 ]
[6.7 3.3 5.7 2.5 ]
[5.83 3.05 5.2 2.3 ]
[6.3 2.5 5. 1.9 ]
[5.83 3. 5.2 2. ]
[6.2 3.4 5.4 2.3 ]
[5.9 3.05 5.1 1.8 ]]
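The same mean imputation can also be reproduced in pandas alone, without SimpleImputer. A minimal sketch, assuming a small hypothetical frame with gaps like the dataset's:

```python
import numpy as np
import pandas as pd

# Hypothetical frame with one missing value in each of two columns
df = pd.DataFrame({
    "Samuel": [5.1, np.nan, 4.7],
    "Natnael": [3.5, 3.0, np.nan],
})

# fillna with the per-column means matches SimpleImputer's 'mean' strategy
filled = df.fillna(df.mean()).round(2)
print(filled)
```

`df.mean()` skips NaN by default, so each gap receives the mean of the observed values in its own column.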

# Impute the missing values with the most frequent value of the corresponding feature
# First, show the number of missing values in each feature
print(Name.isnull().sum().sort_values(ascending=False))
imp_mode = SimpleImputer(strategy='most_frequent')
imp_mode.fit(Name)
data2 = imp_mode.transform(Name)
print(data2)

Natnael 4
Samuel 2
Berhanu 2
Teddy 0
Abrham 0
dtype: int64
[[5.1 3.5 1.4 0.2 'Iris-setosa']
[4.9 3.0 1.4 0.2 'Iris-setosa']
[4.7 3.2 1.5 0.2 'Iris-setosa']
[4.6 3.0 1.5 0.2 'Iris-setosa']
[5.0 3.6 1.4 0.2 'Iris-setosa']
[5.4 3.9 1.7 0.4 'Iris-setosa']
[4.6 3.4 1.4 0.3 'Iris-setosa']
[5.0 3.4 1.5 0.2 'Iris-setosa']
[4.4 2.9 1.4 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[5.4 3.7 1.5 0.2 'Iris-setosa']
[4.8 3.4 1.6 0.2 'Iris-setosa']
[4.8 3.0 1.4 0.1 'Iris-setosa']
[4.3 3.0 1.1 0.1 'Iris-setosa']
[5.8 4.0 1.2 0.2 'Iris-setosa']
[5.7 4.4 1.5 0.4 'Iris-setosa']
[5.4 3.9 1.3 0.4 'Iris-setosa']
[5.1 3.5 1.4 0.3 'Iris-setosa']
[5.7 3.8 1.7 0.3 'Iris-setosa']
[5.1 3.8 1.5 0.3 'Iris-setosa']
[5.4 3.4 1.7 0.2 'Iris-setosa']
[5.1 3.7 1.5 0.4 'Iris-setosa']
[4.6 3.6 1.0 0.2 'Iris-setosa']
[5.1 3.3 1.7 0.5 'Iris-setosa']
[4.8 3.4 1.9 0.2 'Iris-setosa']
[5.0 3.0 1.6 0.2 'Iris-setosa']
[5.0 3.4 1.6 0.4 'Iris-setosa']
[5.2 3.5 1.5 0.2 'Iris-setosa']
[5.2 3.4 1.4 0.2 'Iris-setosa']
[4.7 3.2 1.6 0.2 'Iris-setosa']
[4.8 3.1 1.6 0.2 'Iris-setosa']
[5.4 3.4 1.5 0.4 'Iris-setosa']
[5.2 4.1 1.5 0.1 'Iris-setosa']
[5.5 4.2 1.4 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[5.0 3.2 1.2 0.2 'Iris-setosa']
[5.5 3.5 1.3 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[4.4 3.0 1.3 0.2 'Iris-setosa']
[5.1 3.4 1.5 0.2 'Iris-setosa']
[5.0 3.5 1.3 0.3 'Iris-setosa']
[4.5 2.3 1.3 0.3 'Iris-setosa']
[4.4 3.2 1.3 0.2 'Iris-setosa']
[5.0 3.5 1.6 0.6 'Iris-setosa']
[5.1 3.8 1.9 0.4 'Iris-setosa']
[4.8 3.0 1.4 0.3 'Iris-setosa']
[5.1 3.8 1.6 0.2 'Iris-setosa']
[4.6 3.2 1.4 0.2 'Iris-setosa']
[5.3 3.7 1.5 0.2 'Iris-setosa']
[5.0 3.3 1.4 0.2 'Iris-setosa']
[7.0 3.2 4.7 1.4 'Iris-versicolor']
[6.4 3.2 4.5 1.5 'Iris-versicolor']
[6.9 3.1 4.9 1.5 'Iris-versicolor']
[5.5 2.3 4.0 1.3 'Iris-versicolor']
[6.5 2.8 4.6 1.5 'Iris-versicolor']
[5.7 2.8 4.5 1.3 'Iris-versicolor']
[6.3 3.3 4.7 1.6 'Iris-versicolor']
[4.9 2.4 3.3 1.0 'Iris-versicolor']
[6.6 2.9 4.6 1.3 'Iris-versicolor']
[5.2 2.7 3.9 1.4 'Iris-versicolor']
[5.0 2.0 3.5 1.0 'Iris-versicolor']
[5.9 3.0 4.2 1.5 'Iris-versicolor']
[6.0 2.2 4.0 1.0 'Iris-versicolor']
[6.1 2.9 4.7 1.4 'Iris-versicolor']
[5.6 2.9 3.6 1.3 'Iris-versicolor']
[6.7 3.1 4.4 1.4 'Iris-versicolor']
[5.6 3.0 4.5 1.5 'Iris-versicolor']
[5.8 2.7 4.1 1.0 'Iris-versicolor']
[6.2 2.2 4.5 1.5 'Iris-versicolor']
[5.6 2.5 3.9 1.1 'Iris-versicolor']
[5.9 3.2 4.8 1.8 'Iris-versicolor']
[6.1 2.8 4.0 1.3 'Iris-versicolor']
[6.3 2.5 4.9 1.5 'Iris-versicolor']
[6.1 2.8 4.7 1.2 'Iris-versicolor']
[6.4 2.9 4.3 1.3 'Iris-versicolor']
[6.6 3.0 4.4 1.4 'Iris-versicolor']
[6.8 2.8 4.8 1.4 'Iris-versicolor']
[6.7 3.0 5.0 1.7 'Iris-versicolor']
[6.0 2.9 4.5 1.5 'Iris-versicolor']
[5.7 2.6 3.5 1.0 'Iris-versicolor']
[5.5 2.4 3.8 1.1 'Iris-versicolor']
[5.5 2.4 3.7 1.0 'Iris-versicolor']
[5.8 2.7 3.9 1.2 'Iris-versicolor']
[6.0 2.7 5.1 1.6 'Iris-versicolor']
[5.4 3.0 4.5 1.5 'Iris-versicolor']
[6.0 3.4 4.5 1.6 'Iris-versicolor']
[6.7 3.1 4.7 1.5 'Iris-versicolor']
[6.3 2.3 4.4 1.3 'Iris-versicolor']
[5.6 3.0 4.1 1.3 'Iris-versicolor']
[5.5 2.5 4.0 1.3 'Iris-versicolor']
[5.5 2.6 4.4 1.2 'Iris-versicolor']
[6.1 3.0 4.6 1.4 'Iris-versicolor']
[5.8 2.6 4.0 1.2 'Iris-versicolor']
[5.0 2.3 3.3 1.0 'Iris-versicolor']
[5.6 2.7 4.2 1.3 'Iris-versicolor']
[5.7 3.0 4.2 1.2 'Iris-versicolor']
[5.7 2.9 4.2 1.3 'Iris-versicolor']
[6.2 2.9 4.3 1.3 'Iris-versicolor']
[5.1 2.5 3.0 1.1 'Iris-versicolor']
[5.7 2.8 4.1 1.3 'Iris-versicolor']
[6.3 3.3 6.0 2.5 'Iris-virginica']
[5.8 2.7 5.1 1.9 'Iris-virginica']
[7.1 3.0 5.9 2.1 'Iris-virginica']
[6.3 2.9 5.6 1.8 'Iris-virginica']
[6.5 3.0 5.8 2.2 'Iris-virginica']
[7.6 3.0 6.6 2.1 'Iris-virginica']
[4.9 2.5 4.5 1.7 'Iris-virginica']
[7.3 2.9 6.3 1.8 'Iris-virginica']
[6.7 2.5 5.8 1.8 'Iris-virginica']
[7.2 3.6 6.1 2.5 'Iris-virginica']
[6.5 3.2 5.1 2.0 'Iris-virginica']
[6.4 2.7 5.3 1.9 'Iris-virginica']
[6.8 3.0 5.5 2.1 'Iris-virginica']
[5.7 2.5 5.0 2.0 'Iris-virginica']
[5.8 2.8 5.1 2.4 'Iris-virginica']
[6.4 3.2 5.3 2.3 'Iris-virginica']
[6.5 3.0 5.5 1.8 'Iris-virginica']
[7.7 3.8 6.7 2.2 'Iris-virginica']
[7.7 2.6 6.9 2.3 'Iris-virginica']
[6.0 2.2 5.0 1.5 'Iris-virginica']
[6.9 3.2 5.7 2.3 'Iris-virginica']
[5.6 2.8 4.9 2.0 'Iris-virginica']
[7.7 2.8 6.7 2.0 'Iris-virginica']
[6.3 2.7 4.9 1.8 'Iris-virginica']
[6.7 3.3 5.7 2.1 'Iris-virginica']
[7.2 3.2 6.0 1.8 'Iris-virginica']
[6.2 2.8 4.8 1.8 'Iris-virginica']
[6.1 3.0 4.9 1.8 'Iris-virginica']
[6.4 2.8 5.6 2.1 'Iris-virginica']
[7.2 3.0 5.8 1.6 'Iris-virginica']
[7.4 2.8 6.1 1.9 'Iris-virginica']
[7.9 3.8 6.4 2.0 'Iris-virginica']
[6.4 2.8 5.6 2.2 'Iris-virginica']
[6.3 2.8 5.1 1.5 'Iris-virginica']
[6.1 2.6 5.6 1.4 'Iris-virginica']
[7.7 3.0 6.1 2.3 'Iris-virginica']
[6.3 3.4 5.6 2.4 'Iris-virginica']
[6.4 3.1 5.5 1.8 'Iris-virginica']
[6.0 3.0 4.8 1.8 'Iris-virginica']
[6.9 3.1 5.4 2.1 'Iris-virginica']
[6.7 3.1 5.6 2.4 'Iris-virginica']
[6.9 3.1 5.1 2.3 'Iris-virginica']
[5.8 2.7 5.1 1.9 'Iris-virginica']
[6.8 3.2 5.9 2.3 'Iris-virginica']
[6.7 3.3 5.7 2.5 'Iris-virginica']
[5.0 3.0 5.2 2.3 'Iris-virginica']
[6.3 2.5 5.0 1.9 'Iris-virginica']
[5.0 3.0 5.2 2.0 'Iris-virginica']
[6.2 3.4 5.4 2.3 'Iris-virginica']
[5.9 3.0 5.1 1.8 'Iris-virginica']]

# Impute the missing values with the constant 100
print(Name.isnull().sum().sort_values(ascending=False))
imp_cons = SimpleImputer(strategy='constant', fill_value=100)
imp_cons.fit(Name)
data2 = imp_cons.transform(Name)
print(data2)

Natnael 4
Samuel 2
Berhanu 2
Teddy 0
Abrham 0
dtype: int64
[[5.1 3.5 1.4 0.2 'Iris-setosa']
[4.9 100 1.4 0.2 'Iris-setosa']
[4.7 3.2 100 0.2 'Iris-setosa']
[4.6 100 100 0.2 'Iris-setosa']
[5.0 3.6 1.4 0.2 'Iris-setosa']
[5.4 3.9 1.7 0.4 'Iris-setosa']
[4.6 3.4 1.4 0.3 'Iris-setosa']
[5.0 3.4 1.5 0.2 'Iris-setosa']
[4.4 2.9 1.4 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[5.4 3.7 1.5 0.2 'Iris-setosa']
[4.8 3.4 1.6 0.2 'Iris-setosa']
[4.8 3.0 1.4 0.1 'Iris-setosa']
[4.3 3.0 1.1 0.1 'Iris-setosa']
[5.8 4.0 1.2 0.2 'Iris-setosa']
[5.7 4.4 1.5 0.4 'Iris-setosa']
[5.4 3.9 1.3 0.4 'Iris-setosa']
[5.1 3.5 1.4 0.3 'Iris-setosa']
[5.7 3.8 1.7 0.3 'Iris-setosa']
[5.1 3.8 1.5 0.3 'Iris-setosa']
[5.4 3.4 1.7 0.2 'Iris-setosa']
[5.1 3.7 1.5 0.4 'Iris-setosa']
[4.6 3.6 1.0 0.2 'Iris-setosa']
[5.1 3.3 1.7 0.5 'Iris-setosa']
[4.8 3.4 1.9 0.2 'Iris-setosa']
[5.0 3.0 1.6 0.2 'Iris-setosa']
[5.0 3.4 1.6 0.4 'Iris-setosa']
[5.2 3.5 1.5 0.2 'Iris-setosa']
[5.2 3.4 1.4 0.2 'Iris-setosa']
[4.7 3.2 1.6 0.2 'Iris-setosa']
[4.8 3.1 1.6 0.2 'Iris-setosa']
[5.4 3.4 1.5 0.4 'Iris-setosa']
[5.2 4.1 1.5 0.1 'Iris-setosa']
[5.5 4.2 1.4 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[5.0 3.2 1.2 0.2 'Iris-setosa']
[5.5 3.5 1.3 0.2 'Iris-setosa']
[4.9 3.1 1.5 0.1 'Iris-setosa']
[4.4 3.0 1.3 0.2 'Iris-setosa']
[5.1 3.4 1.5 0.2 'Iris-setosa']
[5.0 3.5 1.3 0.3 'Iris-setosa']
[4.5 2.3 1.3 0.3 'Iris-setosa']
[4.4 3.2 1.3 0.2 'Iris-setosa']
[5.0 3.5 1.6 0.6 'Iris-setosa']
[5.1 3.8 1.9 0.4 'Iris-setosa']
[4.8 3.0 1.4 0.3 'Iris-setosa']
[5.1 3.8 1.6 0.2 'Iris-setosa']
[4.6 3.2 1.4 0.2 'Iris-setosa']
[5.3 3.7 1.5 0.2 'Iris-setosa']
[5.0 3.3 1.4 0.2 'Iris-setosa']
[7.0 3.2 4.7 1.4 'Iris-versicolor']
[6.4 3.2 4.5 1.5 'Iris-versicolor']
[6.9 3.1 4.9 1.5 'Iris-versicolor']
[5.5 2.3 4.0 1.3 'Iris-versicolor']
[6.5 2.8 4.6 1.5 'Iris-versicolor']
[5.7 2.8 4.5 1.3 'Iris-versicolor']
[6.3 3.3 4.7 1.6 'Iris-versicolor']
[4.9 2.4 3.3 1.0 'Iris-versicolor']
[6.6 2.9 4.6 1.3 'Iris-versicolor']
[5.2 2.7 3.9 1.4 'Iris-versicolor']
[5.0 2.0 3.5 1.0 'Iris-versicolor']
[5.9 3.0 4.2 1.5 'Iris-versicolor']
[6.0 2.2 4.0 1.0 'Iris-versicolor']
[6.1 2.9 4.7 1.4 'Iris-versicolor']
[5.6 2.9 3.6 1.3 'Iris-versicolor']
[6.7 3.1 4.4 1.4 'Iris-versicolor']
[5.6 3.0 4.5 1.5 'Iris-versicolor']
[5.8 2.7 4.1 1.0 'Iris-versicolor']
[6.2 2.2 4.5 1.5 'Iris-versicolor']
[5.6 2.5 3.9 1.1 'Iris-versicolor']
[5.9 3.2 4.8 1.8 'Iris-versicolor']
[6.1 2.8 4.0 1.3 'Iris-versicolor']
[6.3 2.5 4.9 1.5 'Iris-versicolor']
[6.1 2.8 4.7 1.2 'Iris-versicolor']
[6.4 2.9 4.3 1.3 'Iris-versicolor']
[6.6 3.0 4.4 1.4 'Iris-versicolor']
[6.8 2.8 4.8 1.4 'Iris-versicolor']
[6.7 3.0 5.0 1.7 'Iris-versicolor']
[6.0 2.9 4.5 1.5 'Iris-versicolor']
[5.7 2.6 3.5 1.0 'Iris-versicolor']
[5.5 2.4 3.8 1.1 'Iris-versicolor']
[5.5 2.4 3.7 1.0 'Iris-versicolor']
[5.8 2.7 3.9 1.2 'Iris-versicolor']
[6.0 2.7 5.1 1.6 'Iris-versicolor']
[5.4 3.0 4.5 1.5 'Iris-versicolor']
[6.0 3.4 4.5 1.6 'Iris-versicolor']
[6.7 3.1 4.7 1.5 'Iris-versicolor']
[6.3 2.3 4.4 1.3 'Iris-versicolor']
[5.6 3.0 4.1 1.3 'Iris-versicolor']
[5.5 2.5 4.0 1.3 'Iris-versicolor']
[5.5 2.6 4.4 1.2 'Iris-versicolor']
[6.1 3.0 4.6 1.4 'Iris-versicolor']
[5.8 2.6 4.0 1.2 'Iris-versicolor']
[5.0 2.3 3.3 1.0 'Iris-versicolor']
[5.6 2.7 4.2 1.3 'Iris-versicolor']
[5.7 3.0 4.2 1.2 'Iris-versicolor']
[5.7 2.9 4.2 1.3 'Iris-versicolor']
[6.2 2.9 4.3 1.3 'Iris-versicolor']
[5.1 2.5 3.0 1.1 'Iris-versicolor']
[5.7 2.8 4.1 1.3 'Iris-versicolor']
[6.3 3.3 6.0 2.5 'Iris-virginica']
[5.8 2.7 5.1 1.9 'Iris-virginica']
[7.1 3.0 5.9 2.1 'Iris-virginica']
[6.3 2.9 5.6 1.8 'Iris-virginica']
[6.5 3.0 5.8 2.2 'Iris-virginica']
[7.6 3.0 6.6 2.1 'Iris-virginica']
[4.9 2.5 4.5 1.7 'Iris-virginica']
[7.3 2.9 6.3 1.8 'Iris-virginica']
[6.7 2.5 5.8 1.8 'Iris-virginica']
[7.2 3.6 6.1 2.5 'Iris-virginica']
[6.5 3.2 5.1 2.0 'Iris-virginica']
[6.4 2.7 5.3 1.9 'Iris-virginica']
[6.8 3.0 5.5 2.1 'Iris-virginica']
[5.7 2.5 5.0 2.0 'Iris-virginica']
[5.8 2.8 5.1 2.4 'Iris-virginica']
[6.4 3.2 5.3 2.3 'Iris-virginica']
[6.5 3.0 5.5 1.8 'Iris-virginica']
[7.7 3.8 6.7 2.2 'Iris-virginica']
[7.7 2.6 6.9 2.3 'Iris-virginica']
[6.0 2.2 5.0 1.5 'Iris-virginica']
[6.9 3.2 5.7 2.3 'Iris-virginica']
[5.6 2.8 4.9 2.0 'Iris-virginica']
[7.7 2.8 6.7 2.0 'Iris-virginica']
[6.3 2.7 4.9 1.8 'Iris-virginica']
[6.7 3.3 5.7 2.1 'Iris-virginica']
[7.2 3.2 6.0 1.8 'Iris-virginica']
[6.2 2.8 4.8 1.8 'Iris-virginica']
[6.1 3.0 4.9 1.8 'Iris-virginica']
[6.4 2.8 5.6 2.1 'Iris-virginica']
[7.2 3.0 5.8 1.6 'Iris-virginica']
[7.4 2.8 6.1 1.9 'Iris-virginica']
[7.9 3.8 6.4 2.0 'Iris-virginica']
[6.4 2.8 5.6 2.2 'Iris-virginica']
[6.3 2.8 5.1 1.5 'Iris-virginica']
[6.1 2.6 5.6 1.4 'Iris-virginica']
[7.7 3.0 6.1 2.3 'Iris-virginica']
[6.3 3.4 5.6 2.4 'Iris-virginica']
[6.4 3.1 5.5 1.8 'Iris-virginica']
[6.0 3.0 4.8 1.8 'Iris-virginica']
[6.9 3.1 5.4 2.1 'Iris-virginica']
[6.7 3.1 5.6 2.4 'Iris-virginica']
[6.9 3.1 5.1 2.3 'Iris-virginica']
[5.8 2.7 5.1 1.9 'Iris-virginica']
[6.8 3.2 5.9 2.3 'Iris-virginica']
[6.7 3.3 5.7 2.5 'Iris-virginica']
[100 100 5.2 2.3 'Iris-virginica']
[6.3 2.5 5.0 1.9 'Iris-virginica']
[100 3.0 5.2 2.0 'Iris-virginica']
[6.2 3.4 5.4 2.3 'Iris-virginica']
[5.9 100 5.1 1.8 'Iris-virginica']]
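A constant of 100 lies far outside every feature's range (the `describe()` output above shows maxima below 8), so this strategy heavily distorts the feature statistics. A small hypothetical demonstration of the effect on a single column:

```python
import numpy as np
import pandas as pd

# Hypothetical column with one missing value
col = pd.Series([5.1, np.nan, 4.7])

# Compare the column mean after mean-imputation vs. constant-imputation
mean_fill = col.fillna(col.mean()).mean()
const_fill = col.fillna(100).mean()
print(round(mean_fill, 2), round(const_fill, 2))  # prints: 4.9 36.6
```

For this dataset, mean, most-frequent, or KNN imputation preserves the feature scales far better than an out-of-range constant.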

from sklearn.impute import SimpleImputer, KNNImputer


# Impute the missing values with KNN imputation (n_neighbors=2)
Before_imputation = Name.iloc[:, :-1]
# print dataset before imputation
print("Data Before performing imputation\n", Before_imputation)
# create a KNNImputer object
imputer = KNNImputer(n_neighbors=2)
After_imputation = imputer.fit_transform(Before_imputation)
# print dataset after performing the imputation
print("\n\nAfter performing imputation\n", After_imputation)

Data Before performing imputation


Samuel Natnael Berhanu Teddy
0 5.1 3.5 1.4 0.2
1 4.9 NaN 1.4 0.2
2 4.7 3.2 NaN 0.2
3 4.6 NaN NaN 0.2
4 5.0 3.6 1.4 0.2
.. ... ... ... ...
145 NaN NaN 5.2 2.3
146 6.3 2.5 5.0 1.9
147 NaN 3.0 5.2 2.0
148 6.2 3.4 5.4 2.3
149 5.9 NaN 5.1 1.8

[150 rows x 4 columns]

After performing imputation


[[5.1 3.5 1.4 0.2 ]
[4.9 3.45 1.4 0.2 ]
[4.7 3.2 1.5 0.2 ]
[4.6 3.4 1.2 0.2 ]
[5. 3.6 1.4 0.2 ]
[5.4 3.9 1.7 0.4 ]
[4.6 3.4 1.4 0.3 ]
[5. 3.4 1.5 0.2 ]
[4.4 2.9 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5.4 3.7 1.5 0.2 ]
[4.8 3.4 1.6 0.2 ]
[4.8 3. 1.4 0.1 ]
[4.3 3. 1.1 0.1 ]
[5.8 4. 1.2 0.2 ]
[5.7 4.4 1.5 0.4 ]
[5.4 3.9 1.3 0.4 ]
[5.1 3.5 1.4 0.3 ]
[5.7 3.8 1.7 0.3 ]
[5.1 3.8 1.5 0.3 ]
[5.4 3.4 1.7 0.2 ]
[5.1 3.7 1.5 0.4 ]
[4.6 3.6 1. 0.2 ]
[5.1 3.3 1.7 0.5 ]
[4.8 3.4 1.9 0.2 ]
[5. 3. 1.6 0.2 ]
[5. 3.4 1.6 0.4 ]
[5.2 3.5 1.5 0.2 ]
[5.2 3.4 1.4 0.2 ]
[4.7 3.2 1.6 0.2 ]
[4.8 3.1 1.6 0.2 ]
[5.4 3.4 1.5 0.4 ]
[5.2 4.1 1.5 0.1 ]
[5.5 4.2 1.4 0.2 ]
[4.9 3.1 1.5 0.1 ]
[5. 3.2 1.2 0.2 ]
[5.5 3.5 1.3 0.2 ]
[4.9 3.1 1.5 0.1 ]
[4.4 3. 1.3 0.2 ]
[5.1 3.4 1.5 0.2 ]
[5. 3.5 1.3 0.3 ]
[4.5 2.3 1.3 0.3 ]
[4.4 3.2 1.3 0.2 ]
[5. 3.5 1.6 0.6 ]
[5.1 3.8 1.9 0.4 ]
[4.8 3. 1.4 0.3 ]
[5.1 3.8 1.6 0.2 ]
[4.6 3.2 1.4 0.2 ]
[5.3 3.7 1.5 0.2 ]
[5. 3.3 1.4 0.2 ]
[7. 3.2 4.7 1.4 ]
[6.4 3.2 4.5 1.5 ]
[6.9 3.1 4.9 1.5 ]
[5.5 2.3 4. 1.3 ]
[6.5 2.8 4.6 1.5 ]
[5.7 2.8 4.5 1.3 ]
[6.3 3.3 4.7 1.6 ]
[4.9 2.4 3.3 1. ]
[6.6 2.9 4.6 1.3 ]
[5.2 2.7 3.9 1.4 ]
[5. 2. 3.5 1. ]
[5.9 3. 4.2 1.5 ]
[6. 2.2 4. 1. ]
[6.1 2.9 4.7 1.4 ]
[5.6 2.9 3.6 1.3 ]
[6.7 3.1 4.4 1.4 ]
[5.6 3. 4.5 1.5 ]
[5.8 2.7 4.1 1. ]
[6.2 2.2 4.5 1.5 ]
[5.6 2.5 3.9 1.1 ]
[5.9 3.2 4.8 1.8 ]
[6.1 2.8 4. 1.3 ]
[6.3 2.5 4.9 1.5 ]
[6.1 2.8 4.7 1.2 ]
[6.4 2.9 4.3 1.3 ]
[6.6 3. 4.4 1.4 ]
[6.8 2.8 4.8 1.4 ]
[6.7 3. 5. 1.7 ]
[6. 2.9 4.5 1.5 ]
[5.7 2.6 3.5 1. ]
[5.5 2.4 3.8 1.1 ]
[5.5 2.4 3.7 1. ]
[5.8 2.7 3.9 1.2 ]
[6. 2.7 5.1 1.6 ]
[5.4 3. 4.5 1.5 ]
[6. 3.4 4.5 1.6 ]
[6.7 3.1 4.7 1.5 ]
[6.3 2.3 4.4 1.3 ]
[5.6 3. 4.1 1.3 ]
[5.5 2.5 4. 1.3 ]
[5.5 2.6 4.4 1.2 ]
[6.1 3. 4.6 1.4 ]
[5.8 2.6 4. 1.2 ]
[5. 2.3 3.3 1. ]
[5.6 2.7 4.2 1.3 ]
[5.7 3. 4.2 1.2 ]
[5.7 2.9 4.2 1.3 ]
[6.2 2.9 4.3 1.3 ]
[5.1 2.5 3. 1.1 ]
[5.7 2.8 4.1 1.3 ]
[6.3 3.3 6. 2.5 ]
[5.8 2.7 5.1 1.9 ]
[7.1 3. 5.9 2.1 ]
[6.3 2.9 5.6 1.8 ]
[6.5 3. 5.8 2.2 ]
[7.6 3. 6.6 2.1 ]
[4.9 2.5 4.5 1.7 ]
[7.3 2.9 6.3 1.8 ]
[6.7 2.5 5.8 1.8 ]
[7.2 3.6 6.1 2.5 ]
[6.5 3.2 5.1 2. ]
[6.4 2.7 5.3 1.9 ]
[6.8 3. 5.5 2.1 ]
[5.7 2.5 5. 2. ]
[5.8 2.8 5.1 2.4 ]
[6.4 3.2 5.3 2.3 ]
[6.5 3. 5.5 1.8 ]
[7.7 3.8 6.7 2.2 ]
[7.7 2.6 6.9 2.3 ]
[6. 2.2 5. 1.5 ]
[6.9 3.2 5.7 2.3 ]
[5.6 2.8 4.9 2. ]
[7.7 2.8 6.7 2. ]
[6.3 2.7 4.9 1.8 ]
[6.7 3.3 5.7 2.1 ]
[7.2 3.2 6. 1.8 ]
[6.2 2.8 4.8 1.8 ]
[6.1 3. 4.9 1.8 ]
[6.4 2.8 5.6 2.1 ]
[7.2 3. 5.8 1.6 ]
[7.4 2.8 6.1 1.9 ]
[7.9 3.8 6.4 2. ]
[6.4 2.8 5.6 2.2 ]
[6.3 2.8 5.1 1.5 ]
[6.1 2.6 5.6 1.4 ]
[7.7 3. 6.1 2.3 ]
[6.3 3.4 5.6 2.4 ]
[6.4 3.1 5.5 1.8 ]
[6. 3. 4.8 1.8 ]
[6.9 3.1 5.4 2.1 ]
[6.7 3.1 5.6 2.4 ]
[6.9 3.1 5.1 2.3 ]
[5.8 2.7 5.1 1.9 ]
[6.8 3.2 5.9 2.3 ]
[6.7 3.3 5.7 2.5 ]
[6.65 3.15 5.2 2.3 ]
[6.3 2.5 5. 1.9 ]
[6.7 3. 5.2 2. ]
[6.2 3.4 5.4 2.3 ]
[5.9 2.7 5.1 1.8 ]]

# Delete records of the dataset with missing values
data2 = Name.iloc[:, :-1]
print(len(data2))
data2 = data2.dropna()  # drop rows with NaN values
print(data2.isnull().sum())
print(data2)

150
Samuel 0
Natnael 0
Berhanu 0
Teddy 0
dtype: int64
Samuel Natnael Berhanu Teddy
0 5.1 3.5 1.4 0.2
4 5.0 3.6 1.4 0.2
5 5.4 3.9 1.7 0.4
6 4.6 3.4 1.4 0.3
7 5.0 3.4 1.5 0.2
.. ... ... ... ...
142 5.8 2.7 5.1 1.9
143 6.8 3.2 5.9 2.3
144 6.7 3.3 5.7 2.5
146 6.3 2.5 5.0 1.9
148 6.2 3.4 5.4 2.3

[144 rows x 4 columns]



# Transform the dataset with Min-Max normalization
data2 = Name.iloc[:, :-1]
print(data2.head())
print(data2.shape)
Scaler = MinMaxScaler()
ScaledData = Scaler.fit_transform(data2)
ScaledData = ScaledData.round(3)
print(ScaledData)

Samuel Natnael Berhanu Teddy


0 5.1 3.5 1.4 0.2
1 4.9 NaN 1.4 0.2
2 4.7 3.2 NaN 0.2
3 4.6 NaN NaN 0.2
4 5.0 3.6 1.4 0.2
(150, 4)
[[0.222 0.625 0.068 0.042]
[0.167 nan 0.068 0.042]
[0.111 0.5 nan 0.042]
[0.083 nan nan 0.042]
[0.194 0.667 0.068 0.042]
[0.306 0.792 0.119 0.125]
[0.083 0.583 0.068 0.083]
[0.194 0.583 0.085 0.042]
[0.028 0.375 0.068 0.042]
[0.167 0.458 0.085 0. ]
[0.306 0.708 0.085 0.042]
[0.139 0.583 0.102 0.042]
[0.139 0.417 0.068 0. ]
[0. 0.417 0.017 0. ]
[0.417 0.833 0.034 0.042]
[0.389 1. 0.085 0.125]
[0.306 0.792 0.051 0.125]
[0.222 0.625 0.068 0.083]
[0.389 0.75 0.119 0.083]
[0.222 0.75 0.085 0.083]
[0.306 0.583 0.119 0.042]
[0.222 0.708 0.085 0.125]
[0.083 0.667 0. 0.042]
[0.222 0.542 0.119 0.167]
[0.139 0.583 0.153 0.042]
[0.194 0.417 0.102 0.042]
[0.194 0.583 0.102 0.125]
[0.25 0.625 0.085 0.042]
[0.25 0.583 0.068 0.042]
[0.111 0.5 0.102 0.042]
[0.139 0.458 0.102 0.042]
[0.306 0.583 0.085 0.125]
[0.25 0.875 0.085 0. ]
[0.333 0.917 0.068 0.042]
[0.167 0.458 0.085 0. ]
[0.194 0.5 0.034 0.042]
[0.333 0.625 0.051 0.042]
[0.167 0.458 0.085 0. ]
[0.028 0.417 0.051 0.042]
[0.222 0.583 0.085 0.042]
[0.194 0.625 0.051 0.083]
[0.056 0.125 0.051 0.083]
[0.028 0.5 0.051 0.042]
[0.194 0.625 0.102 0.208]
[0.222 0.75 0.153 0.125]
[0.139 0.417 0.068 0.083]
[0.222 0.75 0.102 0.042]
[0.083 0.5 0.068 0.042]
[0.278 0.708 0.085 0.042]
[0.194 0.542 0.068 0.042]
[0.75 0.5 0.627 0.542]
[0.583 0.5 0.593 0.583]
[0.722 0.458 0.661 0.583]
[0.333 0.125 0.508 0.5 ]
[0.611 0.333 0.61 0.583]
[0.389 0.333 0.593 0.5 ]
[0.556 0.542 0.627 0.625]
[0.167 0.167 0.39 0.375]
[0.639 0.375 0.61 0.5 ]
[0.25 0.292 0.492 0.542]
[0.194 0. 0.424 0.375]
[0.444 0.417 0.542 0.583]
[0.472 0.083 0.508 0.375]
[0.5 0.375 0.627 0.542]
[0.361 0.375 0.441 0.5 ]
[0.667 0.458 0.576 0.542]
[0.361 0.417 0.593 0.583]
[0.417 0.292 0.525 0.375]
[0.528 0.083 0.593 0.583]
[0.361 0.208 0.492 0.417]
[0.444 0.5 0.644 0.708]
[0.5 0.333 0.508 0.5 ]
[0.556 0.208 0.661 0.583]
[0.5 0.333 0.627 0.458]
[0.583 0.375 0.559 0.5 ]
[0.639 0.417 0.576 0.542]
[0.694 0.333 0.644 0.542]
[0.667 0.417 0.678 0.667]
[0.472 0.375 0.593 0.583]
[0.389 0.25 0.424 0.375]
[0.333 0.167 0.475 0.417]
[0.333 0.167 0.458 0.375]
[0.417 0.292 0.492 0.458]
[0.472 0.292 0.695 0.625]
[0.306 0.417 0.593 0.583]
[0.472 0.583 0.593 0.625]
[0.667 0.458 0.627 0.583]
[0.556 0.125 0.576 0.5 ]
[0.361 0.417 0.525 0.5 ]
[0.333 0.208 0.508 0.5 ]
[0.333 0.25 0.576 0.458]
[0.5 0.417 0.61 0.542]
[0.417 0.25 0.508 0.458]
[0.194 0.125 0.39 0.375]
[0.361 0.292 0.542 0.5 ]
[0.389 0.417 0.542 0.458]
[0.389 0.375 0.542 0.5 ]
[0.528 0.375 0.559 0.5 ]
[0.222 0.208 0.339 0.417]
[0.389 0.333 0.525 0.5 ]
[0.556 0.542 0.847 1. ]
[0.417 0.292 0.695 0.75 ]
[0.778 0.417 0.831 0.833]
[0.556 0.375 0.78 0.708]
[0.611 0.417 0.814 0.875]
[0.917 0.417 0.949 0.833]
[0.167 0.208 0.593 0.667]
[0.833 0.375 0.898 0.708]
[0.667 0.208 0.814 0.708]
[0.806 0.667 0.864 1. ]
[0.611 0.5 0.695 0.792]
[0.583 0.292 0.729 0.75 ]
[0.694 0.417 0.763 0.833]
[0.389 0.208 0.678 0.792]
[0.417 0.333 0.695 0.958]
[0.583 0.5 0.729 0.917]
[0.611 0.417 0.763 0.708]
[0.944 0.75 0.966 0.875]
[0.944 0.25 1. 0.917]
[0.472 0.083 0.678 0.583]
[0.722 0.5 0.797 0.917]
[0.361 0.333 0.661 0.792]
[0.944 0.333 0.966 0.792]
[0.556 0.292 0.661 0.708]
[0.667 0.542 0.797 0.833]
[0.806 0.5 0.847 0.708]
[0.528 0.333 0.644 0.708]
[0.5 0.417 0.661 0.708]
[0.583 0.333 0.78 0.833]
[0.806 0.417 0.814 0.625]
[0.861 0.333 0.864 0.75 ]
[1. 0.75 0.915 0.792]
[0.583 0.333 0.78 0.875]
[0.556 0.333 0.695 0.583]
[0.5 0.25 0.78 0.542]
[0.944 0.417 0.864 0.917]
[0.556 0.583 0.78 0.958]
[0.583 0.458 0.763 0.708]
[0.472 0.417 0.644 0.708]
[0.722 0.458 0.746 0.833]
[0.667 0.458 0.78 0.958]
[0.722 0.458 0.695 0.917]
[0.417 0.292 0.695 0.75 ]
[0.694 0.5 0.831 0.917]
[0.667 0.542 0.797 1. ]
[ nan nan 0.712 0.917]
[0.556 0.208 0.678 0.75 ]
[ nan 0.417 0.712 0.792]
[0.528 0.583 0.746 0.917]
[0.444 nan 0.695 0.708]]
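MinMaxScaler applies `x' = (x - min) / (max - min)` to each column independently. The first value printed above can be checked by hand: assuming the first column's observed minimum is 4.3 and maximum is 7.9 (the classic Iris sepal-length range), the sample 5.1 maps to (5.1 − 4.3)/(7.9 − 4.3) ≈ 0.222, which matches the first entry of the scaled array.

```python
import numpy as np

# Hypothetical column holding the assumed min (4.3), one sample (5.1),
# and the assumed max (7.9) of the first feature
x = np.array([4.3, 5.1, 7.9])

# Min-max formula: x' = (x - min) / (max - min)
scaled = (x - x.min()) / (x.max() - x.min())
print(scaled.round(3))
```

The minimum always maps to 0 and the maximum to 1, so every scaled value lies in [0, 1].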

#Transform the dataset to Z-Score Normalization


from sklearn.preprocessing import StandardScaler

data2 = Name.iloc[:, :-1]
print(data2.head())
print(data2.shape)
scaler = StandardScaler()                 # standardizes each column to mean 0, std 1
scaled_data = scaler.fit_transform(data2)

print(scaled_data.round(3))

Samuel Natnael Berhanu Teddy


0 5.1 3.5 1.4 0.2
1 4.9 NaN 1.4 0.2
2 4.7 3.2 NaN 0.2
3 4.6 NaN NaN 0.2
4 5.0 3.6 1.4 0.2
(150, 4)
[[-8.870e-01 1.017e+00 -1.367e+00 -1.313e+00]
[-1.130e+00 nan -1.367e+00 -1.313e+00]
[-1.372e+00 3.320e-01 nan -1.313e+00]
[-1.493e+00 nan nan -1.313e+00]
[-1.009e+00 1.245e+00 -1.367e+00 -1.313e+00]
[-5.240e-01 1.930e+00 -1.195e+00 -1.050e+00]
[-1.493e+00 7.880e-01 -1.367e+00 -1.182e+00]
[-1.009e+00 7.880e-01 -1.310e+00 -1.313e+00]
[-1.735e+00 -3.530e-01 -1.367e+00 -1.313e+00]
[-1.130e+00 1.030e-01 -1.310e+00 -1.444e+00]
[-5.240e-01 1.473e+00 -1.310e+00 -1.313e+00]
[-1.251e+00 7.880e-01 -1.253e+00 -1.313e+00]
[-1.251e+00 -1.250e-01 -1.367e+00 -1.444e+00]
[-1.856e+00 -1.250e-01 -1.539e+00 -1.444e+00]
[-4.000e-02 2.158e+00 -1.481e+00 -1.313e+00]
[-1.610e-01 3.072e+00 -1.310e+00 -1.050e+00]
[-5.240e-01 1.930e+00 -1.424e+00 -1.050e+00]
[-8.870e-01 1.017e+00 -1.367e+00 -1.182e+00]
[-1.610e-01 1.702e+00 -1.195e+00 -1.182e+00]
[-8.870e-01 1.702e+00 -1.310e+00 -1.182e+00]
[-5.240e-01 7.880e-01 -1.195e+00 -1.313e+00]
[-8.870e-01 1.473e+00 -1.310e+00 -1.050e+00]
[-1.493e+00 1.245e+00 -1.596e+00 -1.313e+00]
[-8.870e-01 5.600e-01 -1.195e+00 -9.190e-01]
[-1.251e+00 7.880e-01 -1.081e+00 -1.313e+00]
[-1.009e+00 -1.250e-01 -1.253e+00 -1.313e+00]
[-1.009e+00 7.880e-01 -1.253e+00 -1.050e+00]
[-7.660e-01 1.017e+00 -1.310e+00 -1.313e+00]
[-7.660e-01 7.880e-01 -1.367e+00 -1.313e+00]
[-1.372e+00 3.320e-01 -1.253e+00 -1.313e+00]
[-1.251e+00 1.030e-01 -1.253e+00 -1.313e+00]
[-5.240e-01 7.880e-01 -1.310e+00 -1.050e+00]
[-7.660e-01 2.387e+00 -1.310e+00 -1.444e+00]
[-4.030e-01 2.615e+00 -1.367e+00 -1.313e+00]
[-1.130e+00 1.030e-01 -1.310e+00 -1.444e+00]
[-1.009e+00 3.320e-01 -1.481e+00 -1.313e+00]
[-4.030e-01 1.017e+00 -1.424e+00 -1.313e+00]
[-1.130e+00 1.030e-01 -1.310e+00 -1.444e+00]
[-1.735e+00 -1.250e-01 -1.424e+00 -1.313e+00]
[-8.870e-01 7.880e-01 -1.310e+00 -1.313e+00]
[-1.009e+00 1.017e+00 -1.424e+00 -1.182e+00]
[-1.614e+00 -1.724e+00 -1.424e+00 -1.182e+00]
[-1.735e+00 3.320e-01 -1.424e+00 -1.313e+00]
[-1.009e+00 1.017e+00 -1.253e+00 -7.870e-01]
[-8.870e-01 1.702e+00 -1.081e+00 -1.050e+00]
[-1.251e+00 -1.250e-01 -1.367e+00 -1.182e+00]
[-8.870e-01 1.702e+00 -1.253e+00 -1.313e+00]
[-1.493e+00 3.320e-01 -1.367e+00 -1.313e+00]
[-6.450e-01 1.473e+00 -1.310e+00 -1.313e+00]
[-1.009e+00 5.600e-01 -1.367e+00 -1.313e+00]
[ 1.413e+00 3.320e-01 5.200e-01 2.650e-01]
[ 6.860e-01 3.320e-01 4.060e-01 3.960e-01]
[ 1.292e+00 1.030e-01 6.340e-01 3.960e-01]
[-4.030e-01 -1.724e+00 1.200e-01 1.330e-01]
[ 8.070e-01 -5.820e-01 4.630e-01 3.960e-01]
[-1.610e-01 -5.820e-01 4.060e-01 1.330e-01]
[ 5.650e-01 5.600e-01 5.200e-01 5.280e-01]
[-1.130e+00 -1.495e+00 -2.810e-01 -2.610e-01]
[ 9.280e-01 -3.530e-01 4.630e-01 1.330e-01]
[-7.660e-01 -8.100e-01 6.300e-02 2.650e-01]
[-1.009e+00 -2.409e+00 -1.660e-01 -2.610e-01]
[ 8.100e-02 -1.250e-01 2.340e-01 3.960e-01]
[ 2.020e-01 -1.952e+00 1.200e-01 -2.610e-01]
[ 3.230e-01 -3.530e-01 5.200e-01 2.650e-01]
[-2.820e-01 -3.530e-01 -1.090e-01 1.330e-01]
[ 1.049e+00 1.030e-01 3.490e-01 2.650e-01]
[-2.820e-01 -1.250e-01 4.060e-01 3.960e-01]
[-4.000e-02 -8.100e-01 1.770e-01 -2.610e-01]
[ 4.440e-01 -1.952e+00 4.060e-01 3.960e-01]
[-2.820e-01 -1.267e+00 6.300e-02 -1.300e-01]
[ 8.100e-02 3.320e-01 5.770e-01 7.910e-01]
[ 3.230e-01 -5.820e-01 1.200e-01 1.330e-01]
[ 5.650e-01 -1.267e+00 6.340e-01 3.960e-01]
[ 3.230e-01 -5.820e-01 5.200e-01 2.000e-03]
[ 6.860e-01 -3.530e-01 2.910e-01 1.330e-01]
[ 9.280e-01 -1.250e-01 3.490e-01 2.650e-01]
[ 1.171e+00 -5.820e-01 5.770e-01 2.650e-01]
[ 1.049e+00 -1.250e-01 6.920e-01 6.590e-01]
[ 2.020e-01 -3.530e-01 4.060e-01 3.960e-01]
[-1.610e-01 -1.038e+00 -1.660e-01 -2.610e-01]
[-4.030e-01 -1.495e+00 5.000e-03 -1.300e-01]
[-4.030e-01 -1.495e+00 -5.200e-02 -2.610e-01]
[-4.000e-02 -8.100e-01 6.300e-02 2.000e-03]
[ 2.020e-01 -8.100e-01 7.490e-01 5.280e-01]
[-5.240e-01 -1.250e-01 4.060e-01 3.960e-01]
[ 2.020e-01 7.880e-01 4.060e-01 5.280e-01]
[ 1.049e+00 1.030e-01 5.200e-01 3.960e-01]
[ 5.650e-01 -1.724e+00 3.490e-01 1.330e-01]
[-2.820e-01 -1.250e-01 1.770e-01 1.330e-01]
[-4.030e-01 -1.267e+00 1.200e-01 1.330e-01]
[-4.030e-01 -1.038e+00 3.490e-01 2.000e-03]
[ 3.230e-01 -1.250e-01 4.630e-01 2.650e-01]
[-4.000e-02 -1.038e+00 1.200e-01 2.000e-03]
[-1.009e+00 -1.724e+00 -2.810e-01 -2.610e-01]
[-2.820e-01 -8.100e-01 2.340e-01 1.330e-01]
[-1.610e-01 -1.250e-01 2.340e-01 2.000e-03]
[-1.610e-01 -3.530e-01 2.340e-01 1.330e-01]
[ 4.440e-01 -3.530e-01 2.910e-01 1.330e-01]
[-8.870e-01 -1.267e+00 -4.520e-01 -1.300e-01]
[-1.610e-01 -5.820e-01 1.770e-01 1.330e-01]
[ 5.650e-01 5.600e-01 1.264e+00 1.711e+00]
[-4.000e-02 -8.100e-01 7.490e-01 9.220e-01]
[ 1.534e+00 -1.250e-01 1.206e+00 1.185e+00]
[ 5.650e-01 -3.530e-01 1.035e+00 7.910e-01]
[ 8.070e-01 -1.250e-01 1.149e+00 1.316e+00]
[ 2.139e+00 -1.250e-01 1.607e+00 1.185e+00]
[-1.130e+00 -1.267e+00 4.060e-01 6.590e-01]
[ 1.776e+00 -3.530e-01 1.435e+00 7.910e-01]
[ 1.049e+00 -1.267e+00 1.149e+00 7.910e-01]
[ 1.655e+00 1.245e+00 1.321e+00 1.711e+00]
[ 8.070e-01 3.320e-01 7.490e-01 1.054e+00]
[ 6.860e-01 -8.100e-01 8.630e-01 9.220e-01]
[ 1.171e+00 -1.250e-01 9.780e-01 1.185e+00]
[-1.610e-01 -1.267e+00 6.920e-01 1.054e+00]
[-4.000e-02 -5.820e-01 7.490e-01 1.579e+00]
[ 6.860e-01 3.320e-01 8.630e-01 1.448e+00]
[ 8.070e-01 -1.250e-01 9.780e-01 7.910e-01]
[ 2.260e+00 1.702e+00 1.664e+00 1.316e+00]
[ 2.260e+00 -1.038e+00 1.778e+00 1.448e+00]
[ 2.020e-01 -1.952e+00 6.920e-01 3.960e-01]
[ 1.292e+00 3.320e-01 1.092e+00 1.448e+00]
[-2.820e-01 -5.820e-01 6.340e-01 1.054e+00]
[ 2.260e+00 -5.820e-01 1.664e+00 1.054e+00]
[ 5.650e-01 -8.100e-01 6.340e-01 7.910e-01]
[ 1.049e+00 5.600e-01 1.092e+00 1.185e+00]
[ 1.655e+00 3.320e-01 1.264e+00 7.910e-01]
[ 4.440e-01 -5.820e-01 5.770e-01 7.910e-01]
[ 3.230e-01 -1.250e-01 6.340e-01 7.910e-01]
[ 6.860e-01 -5.820e-01 1.035e+00 1.185e+00]
[ 1.655e+00 -1.250e-01 1.149e+00 5.280e-01]
[ 1.897e+00 -5.820e-01 1.321e+00 9.220e-01]
[ 2.502e+00 1.702e+00 1.492e+00 1.054e+00]
[ 6.860e-01 -5.820e-01 1.035e+00 1.316e+00]
[ 5.650e-01 -5.820e-01 7.490e-01 3.960e-01]
[ 3.230e-01 -1.038e+00 1.035e+00 2.650e-01]
[ 2.260e+00 -1.250e-01 1.321e+00 1.448e+00]
[ 5.650e-01 7.880e-01 1.035e+00 1.579e+00]
[ 6.860e-01 1.030e-01 9.780e-01 7.910e-01]
[ 2.020e-01 -1.250e-01 5.770e-01 7.910e-01]
[ 1.292e+00 1.030e-01 9.200e-01 1.185e+00]
[ 1.049e+00 1.030e-01 1.035e+00 1.579e+00]
[ 1.292e+00 1.030e-01 7.490e-01 1.448e+00]
[-4.000e-02 -8.100e-01 7.490e-01 9.220e-01]
[ 1.171e+00 3.320e-01 1.206e+00 1.448e+00]
[ 1.049e+00 5.600e-01 1.092e+00 1.711e+00]
[ nan nan 8.060e-01 1.448e+00]
[ 5.650e-01 -1.267e+00 6.920e-01 9.220e-01]
[ nan -1.250e-01 8.060e-01 1.054e+00]
[ 4.440e-01 7.880e-01 9.200e-01 1.448e+00]
[ 8.100e-02 nan 7.490e-01 7.910e-01]]
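StandardScaler applies the z-score formula `z = (x - mean) / std` per column, using the population standard deviation (ddof = 0). A toy column (hypothetical values, not from the dataset) makes the formula easy to verify:

```python
import numpy as np

# Toy column to verify the z-score formula: z = (x - mean) / std
x = np.array([2.0, 4.0, 6.0, 8.0])
z = (x - x.mean()) / x.std()  # np.std defaults to ddof=0, the same population std StandardScaler uses
print(z.round(3))
```

After the transform the column has mean 0 and standard deviation 1 by construction, which is why the full-width columns above are centered around zero.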
#Take the ages of patients from the hospital as follows: Age =
#{42, 15, 67, 55, 1, 29, 75, 89, 4, 10, 15, 38, 22, 77}
df = pd.DataFrame({'Age': [42, 15, 67, 55, 1, 29, 75, 89, 4, 10, 15, 38, 22, 77]})

print("Before Transformation: ")


print(df)

Before Transformation:
Age
0 42
1 15
2 67
3 55
4 1
5 29
6 75
7 89
8 4
9 10
10 15
11 38
12 22
13 77

#Convert the given age values to the categories ('Baby', 'Child', 'Teenage', 'Adult', 'Elderly')
#using the binning method with cut values of [0, 3, 7, 17, 63, 99]
Label = pd.cut(x=df['Age'], bins=[0, 3, 7, 17, 63, 99],
               labels=['Baby', 'Child', 'Teenage', 'Adult', 'Elderly'])

# Print the Series after binning the continuous
# ages into categories
print("After: ")
print(Label)

After:
0 Adult
1 Teenage
2 Elderly
3 Adult
4 Baby
5 Adult
6 Elderly
7 Elderly
8 Child
9 Teenage
10 Teenage
11 Adult
12 Adult
13 Elderly
Name: Age, dtype: category
Categories (5, object): ['Baby' < 'Child' < 'Teenage' < 'Adult' < 'Elderly']
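Note that `pd.cut` builds right-inclusive intervals by default, so the edges [0, 3, 7, 17, 63, 99] produce the bins (0, 3], (3, 7], (7, 17], (17, 63], (63, 99]. A small boundary check with hypothetical ages (same edges and labels as above):

```python
import pandas as pd

# Boundary check: the right edge of each bin is inclusive,
# so 3 falls in (0, 3] and 17 falls in (7, 17]
ages = pd.Series([3, 4, 17, 18])
cats = pd.cut(ages, bins=[0, 3, 7, 17, 63, 99],
              labels=['Baby', 'Child', 'Teenage', 'Adult', 'Elderly'])
print(list(cats))
```

This is why Age = 15 above lands in 'Teenage' rather than 'Adult'; passing `right=False` would flip the bins to left-inclusive instead.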

# Check the number of values in each bin


print("Categories: ")
print(Label.value_counts())

Categories:
Age
Adult 5
Elderly 4
Teenage 3
Baby 1
Child 1
Name: count, dtype: int64

data = pd.concat([df, Label.rename('AgeGroup')], axis=1)  # rename so the merged frame has distinct column names


print("\n \n \n Merged Data \n \n", data)
Merged Data

     Age AgeGroup
0    42    Adult
1    15  Teenage
2    67  Elderly
3    55    Adult
4     1     Baby
5    29    Adult
6    75  Elderly
7    89  Elderly
8     4    Child
9    10  Teenage
10   15  Teenage
11   38    Adult
12   22    Adult
13   77  Elderly
