ML Practical 2

Computer Engineering machine learning practical 2

Uploaded by shital
71287, 10:38 PM · Untitled - Jupyter Notebook (localhost:8888/notebooks/Untitled.ipynb?kernel_name=python3)

In [1]: import pandas as pd

In [2]: df = pd.read_csv("emails.csv")

In [3]: df.shape
Out[3]: (5172, 3002)

In [4]: df.head()
Out[4]: [table: 5 rows x 3002 columns — "Email No.", one count column per word ("the", "to", "ect", "and", "for", "of", "a", "you", ...), and the "Prediction" label]

In [5]: x = df.drop(['Email No.', 'Prediction'], axis=1)

In [6]: y = df['Prediction']
        x.shape
Out[6]: (5172, 3000)

In [7]: x.info()
        <class 'pandas.core.frame.DataFrame'>
        RangeIndex: 5172 entries, 0 to 5171
        Columns: 3000 entries, the to dry
        dtypes: int64(3000)
        memory usage: 118.4 MB

In [8]: x.dtypes
Out[8]: the               int64
        to                int64
        ect               int64
        and               int64
        for               int64
                          ...
        infrastructure    int64
        military          int64
        allowing          int64
        ff                int64
        dry               int64
        Length: 3000, dtype: object

In [9]: set(x.dtypes)
Out[9]: {dtype('int64')}

In [11]: import seaborn as sns
         sns.countplot(x=y)
Out[11]: [count plot of the two Prediction classes]

In [12]: y.value_counts()
Out[12]: 0    3672
         1    1500
         Name: Prediction, dtype: int64

In [15]: from sklearn.preprocessing import MinMaxScaler
         scaler = MinMaxScaler()
         x_scaled = scaler.fit_transform(x)

In [16]: x_scaled
Out[16]: array([[0.        , 0.        , 0.        , ..., 0.        ],
                [0.03809524, 0.09848485, 0.06705539, ..., 0.00877193],
                ...,
                [0.00952381, 0.0530303 , 0.        , ..., 0.        ],
                [0.104769  , 0.18181818, 0.01166181, ..., 0.00877193]])

In [17]: from sklearn.model_selection import train_test_split
         x_train, x_test, y_train, y_test = train_test_split(
             x_scaled, y, random_state=0, test_size=0.25)

In [18]: x_scaled.shape
Out[18]: (5172, 3000)

In [19]: x_train.shape
Out[19]: (3879, 3000)

In [21]: x_test.shape
Out[21]: (1293, 3000)

In [22]: from sklearn.neighbors import KNeighborsClassifier

In [23]: knn = KNeighborsClassifier(n_neighbors=5)

In [24]: knn.fit(x_train, y_train)
Out[24]: KNeighborsClassifier()

In [25]: y_pred = knn.predict(x_test)

In [27]: from sklearn.metrics import ConfusionMatrixDisplay, accuracy_score
         from sklearn.metrics import classification_report

In [30]: ConfusionMatrixDisplay.from_predictions(y_test, y_pred)
Out[30]: [confusion-matrix plot: true label vs. predicted label]

In [32]: y_test.value_counts()
Out[32]: 0    929
         1    364
         Name: Prediction, dtype: int64

In [33]: accuracy_score(y_test, y_pred)
Out[33]: 0.871616395978345

In [34]: print(classification_report(y_test, y_pred))
                       precision    recall  f1-score   support

                    0       0.98      0.84      0.90       929
                    1       0.70      0.95      0.81       364

             accuracy                           0.87      1293
            macro avg       0.84      0.89      0.85      1293
         weighted avg       0.90      0.87      0.88      1293

In [39]: import numpy as np
         import matplotlib.pyplot as plt

In [41]: error = []
         for k in range(1, 41):
             knn = KNeighborsClassifier(n_neighbors=k)
             knn.fit(x_train, y_train)
             pred = knn.predict(x_test)
             error.append(np.mean(pred != y_test))

In [42]: error
Out[42]: [0.10827532869296211,
          0.10982211910286156,
          0.12296983758700696,
          0.11523588553750967,
          0.12838360402165508,
          0.1214230471771075,
          0.15158546017014696,
          0.14849187935034802,
          0.17246713070378963,
          0.16705336426914152,
          0.1871616395978345,
          0.18329466357308585,
          0.21500386697602475,
          0.21345767656612528,
          0.22815158546017014,
          0.2266047950502707,
          0.23588553750966745,
          0.23356535189481825,
          0.2459396751740139,
          0.24361948955916474,
          0.2559938128383604,
          0.2552204176334107,
          0.2699149265274555,
          0.2691415313225058,
          0.2822892498066512,
          0.28306264501160094,
          0.2954369682907966,
          0.2923433874709977,
          0.3039443155452436,
          0.300077339520495,
          0.30549110595514306,
          0.30549110595514306,
          0.31245166279969067,
          0.31245166279969067,
          0.3194122196442382,
          0.317092034029389,
          0.32637277648878577,
          0.32559938128383603,
          0.33410672853828305,
          0.3325599381283836]

In [43]: knn = KNeighborsClassifier(n_neighbors=1)

In [44]: knn.fit(x_train, y_train)
Out[44]: KNeighborsClassifier(n_neighbors=1)

In [45]: y_pred = knn.predict(x_test)

In [47]: accuracy_score(y_test, y_pred)
Out[47]: 0.8917246713070379

In [48]: from sklearn.svm import SVC

In [49]: svm = SVC(kernel='linear')

In [50]: svm.fit(x_train, y_train)
Out[50]: SVC(kernel='linear')

In [51]: y_pred = svm.predict(x_test)

In [53]: accuracy_score(y_test, y_pred)
Out[53]: 0.9767981438515081

In [ ]:
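The loop in In [41] computes a test-error rate for each k but the notebook never shows the "elbow" plot that motivates choosing n_neighbors=1. A minimal sketch of that plot, not taken from the notebook itself: the short `error` list here is a rounded stand-in for the first ten of the 40 values above, and the file name is arbitrary.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Stand-in for the `error` list from In [41]: one misclassification
# rate per k = 1..10 (rounded from the notebook's first ten values).
error = [0.108, 0.110, 0.123, 0.115, 0.128, 0.121, 0.152, 0.148, 0.172, 0.167]

ks = range(1, len(error) + 1)
plt.figure(figsize=(8, 4))
plt.plot(ks, error, marker="o")
plt.xlabel("k (n_neighbors)")
plt.ylabel("test error rate")
plt.title("KNN error rate vs. k")
plt.savefig("knn_error_vs_k.png")

# Pick the k with the lowest error; +1 because k starts at 1.
best_k = int(np.argmin(error)) + 1
print(best_k)  # → 1
```

With the notebook's full 40-value list the minimum is also at the first entry, which is why In [43] refits with n_neighbors=1.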

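To see what the MinMaxScaler step in In [15] does to the word-count matrix, here is a tiny self-contained example (the 3x3 `counts` array is invented for illustration, not from emails.csv): each column is rescaled independently to the range [0, 1] via (x - min) / (max - min).

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Tiny word-count matrix (rows = emails, columns = words), mimicking the
# structure of emails.csv after dropping "Email No." and "Prediction".
counts = np.array([[0, 2, 5],
                   [3, 0, 10],
                   [6, 1, 0]])

# Each column is mapped to [0, 1] independently of the others.
scaled = MinMaxScaler().fit_transform(counts)
print(scaled)
```

Scaling matters for KNN because it uses Euclidean distance: without it, frequent words with large counts would dominate the distance and drown out rare words.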
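The whole practical — scale, split, fit KNN and a linear SVM, compare accuracies — can be rerun end to end as one script. A hedged sketch: `make_classification` stands in for emails.csv (which is not bundled here), so the accuracies will differ from the notebook's 0.89 (KNN) and 0.98 (SVM).

```python
# Self-contained rerun of the notebook's pipeline on synthetic data.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Synthetic stand-in for the email word-count matrix and spam labels.
X, y = make_classification(n_samples=500, n_features=50, random_state=0)
X = MinMaxScaler().fit_transform(X)
x_train, x_test, y_train, y_test = train_test_split(
    X, y, random_state=0, test_size=0.25)

# Same two models the notebook compares.
knn = KNeighborsClassifier(n_neighbors=1).fit(x_train, y_train)
svm = SVC(kernel="linear").fit(x_train, y_train)

acc_knn = accuracy_score(y_test, knn.predict(x_test))
acc_svm = accuracy_score(y_test, svm.predict(x_test))
print("KNN accuracy:", acc_knn)
print("SVM accuracy:", acc_svm)
```

On the real emails.csv the linear SVM clearly beats 1-NN (0.977 vs. 0.892), which is common for high-dimensional sparse text features.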