
Group Members (Roll Nos.): 301, 304, 309, 320
GOVERNMENT POLYTECHNIC, AMBAD.
DEPARTMENT OF ARTIFICIAL INTELLIGENCE AND
MACHINE LEARNING
COURSE & CODE-AIML/22683

CERTIFICATE
This is to certify that the Micro-project entitled "K-Nearest Neighbors Algorithm", being submitted herewith for the award of the DIPLOMA IN ENGINEERING & TECHNOLOGY in Artificial Intelligence and Machine Learning of the MAHARASHTRA STATE BOARD OF TECHNICAL EDUCATION (MSBTE), is the result of Micro-project work completed by all group members under the supervision and guidance of Prof. P.T. Zunjare. To the best of my knowledge and belief, the work embodied in this Micro-project has not previously formed the basis for the award of any degree or diploma of this or any other Board or examining body.

Academic Year: 2024-2025 Semester: Sixth (6I)

Place: Ambad
Date:

Prof. P.T. Zunjare          Mr. D.S. Sonwane          Dr. M.B. Sanap
Micro-project Guide         H.O.D.                    Principal

DECLARATION
I, the undersigned, hereby declare that the project entitled "K-Nearest Neighbors Algorithm", written and submitted by me to Government Polytechnic, Ambad during the year 2023-24, Sixth Semester, in partial fulfillment of the 'Micro Project' requirement of the 'Artificial Intelligence and Machine Learning' course under the Maharashtra State Board of Technical Education, Mumbai curriculum, under the guidance of Prof. P.T. Zunjare, is my original work.

The empirical findings in this project are based on the collected data and are not copied
from any other sources.

Sr. No Roll No. Enrollment No. Name Of Student Sign

1. 301 2211620424 Pooja Gajanan Wagh

2. 304 2211620130 Vaishnavi Dilip Chavan

3. 309 2211620419 Anam Riyaj Shaikh

4. 320 2211620440 Prajakta Dhanraj Sagare

ACKNOWLEDGEMENTS

I have great pleasure in expressing my immense gratitude towards my project guide, Prof. P.T. Zunjare, Department of Artificial Intelligence and Machine Learning, Government Polytechnic, Ambad, for giving me the opportunity to work on an interesting topic over one semester. The work presented here could not have been accomplished without his most competent and inspiring guidance, incessant encouragement, constructive criticism, and constant motivation during all phases of our group Micro-project work. I am highly indebted to him.
I am also very thankful to Prof. D.S. Sonwane, Head of the Department of Artificial Intelligence and Machine Learning, the HODs of the other departments, and Dr. M.B. Sanap, Principal of Government Polytechnic, Ambad, for their encouragement and for providing a motivating environment and project facilities in the Institute to carry out experiments and complete this Micro-project work.
I would like to extend my thanks to all our professors, staff members, and friends who extended their cooperation to complete the project.
I am indeed indebted to my parents and other family members for their immense help at all levels, with moral, social, and financial support, care, and encouragement throughout my studies, without which my work would not have seen the light of day.
With warm regards,

Yours sincerely,
All Group Members

Place: Ambad
Date:

INDEX

Sr. No.  Title

1. Introduction

2. Objectives of the Project

3. Future Scope of the Project

4. Implementation of the Project

5. Conclusion

6. Reference

INTRODUCTION

The k-Nearest Neighbors (kNN) algorithm is a simple yet powerful supervised machine learning technique used for classification and regression tasks. It operates on the principle that similar data points exist in close proximity. When given a new data point, the algorithm identifies the 'k' closest points from the training data and assigns the most common label (for classification) or the average value (for regression). kNN is a non-parametric, lazy learning algorithm, meaning it makes no assumptions about the data distribution and delays processing until a prediction is required.

In simpler terms, the k-Nearest Neighbors (kNN) algorithm is a way for computers to make predictions or decisions by looking at things that are similar.

Imagine you have a group of students, and you know their favorite subjects. Now, a
new student joins, and you want to guess what their favorite subject might be. You
can look at a few students who are most similar to the new one (like in age, marks, or
hobbies), and see what their favorite subject is. If most of them like Math, you can
guess that the new student might like Math too. That’s how kNN works.

In kNN:

• "k" means how many nearby examples we look at.

• It compares things using distance, that is, how close or similar they are.

• It works well when things that are alike are also close to each other.

• kNN is very easy to understand and use, which makes it great for beginners in machine learning.
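
To make these points concrete, the following is a minimal from-scratch sketch of kNN classification in Python with NumPy. The helper function knn_predict and the tiny dataset are illustrative assumptions for this report, not part of any library.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    # Euclidean distance from the new point to every training point
    distances = np.sqrt(np.sum((X_train - x_new) ** 2, axis=1))
    # Indices of the k smallest distances
    nearest = np.argsort(distances)[:k]
    # Majority vote among the labels of those k neighbours
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Two well-separated toy classes, two features per point
X_train = np.array([[1, 1], [1, 2], [2, 1], [6, 6], [7, 6], [6, 7]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([2, 2]), k=3))   # expected: 0
print(knn_predict(X_train, y_train, np.array([6, 5]), k=3))   # expected: 1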

OBJECTIVES OF THE PROJECT

The main objectives of this project are:

• To understand the working principles of the kNN algorithm.

• To implement the kNN algorithm for solving a classification or regression problem.

• To evaluate the performance of the model using metrics like accuracy, precision, recall, and F1-score.

• To compare the kNN algorithm with other machine learning techniques.
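
As a pointer toward the last two objectives, the snippet below is a small sketch, assuming scikit-learn and its built-in Iris dataset, of how a kNN model could be scored with precision, recall, and F1-score and compared against a second classifier; the decision tree is used here purely as an example of "another technique".

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit both models on the same split so their reports are comparable
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# classification_report prints accuracy plus per-class precision, recall and F1
print("kNN:\n", classification_report(y_test, knn.predict(X_test)))
print("Decision tree:\n", classification_report(y_test, tree.predict(X_test)))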

FUTURE SCOPE

The kNN algorithm can be enhanced and applied in various future projects:

a) Optimization: Using techniques like dimensionality reduction (PCA) or distance-weighted voting for better accuracy.

b) Real-time applications: Implementing kNN in real-time systems such as facial recognition or recommendation systems.

c) Hybrid models: Combining kNN with other algorithms for ensemble learning.

d) Scalability: Adapting kNN to work efficiently with large datasets using approximate nearest neighbor algorithms.
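
As a rough illustration of the Optimization idea in item (a), here is a sketch, assuming scikit-learn, that chains feature scaling, PCA, and distance-weighted voting into one pipeline; the numbers of components and neighbours are arbitrary example values.

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# Scale features, project onto 2 principal components, then classify with
# neighbours whose votes are weighted by the inverse of their distance
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=2),
    KNeighborsClassifier(n_neighbors=5, weights='distance'),
)
# model.fit(X_train, y_train) and model.predict(X_test) work like any estimator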

IMPLEMENTATION OF PROJECT

The project implementation includes the following steps:

1. Data Collection: Choose a relevant dataset (e.g., Iris, MNIST, or a custom dataset).

2. Data Preprocessing: Handle missing values, normalize features, and split the data into training and test sets (a minimal normalization sketch follows this list).

3. Algorithm Implementation:
   Calculate the distance (e.g., Euclidean) between test points and all training points.
   Identify the 'k' nearest neighbors.
   Assign the most frequent label (for classification).

4. Model Evaluation: Use metrics like accuracy, confusion matrix, and ROC curve to analyze performance.

5. Visualization: Display the decision boundaries and prediction results.
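
Step 2 mentions feature normalization; the walkthrough in the next section works directly on make_blobs data, whose two features are already on comparable scales, but a minimal sketch, assuming scikit-learn, could look like this (X and y stand for any feature matrix and label vector).

from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Split first, then fit the scaler on the training data only, so no
# information from the test set leaks into preprocessing
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)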

1. Importing the modules

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

2. Creating the Dataset

# 500 two-dimensional points grouped around 4 centres
X, y = make_blobs(n_samples=500, n_features=2, centers=4, cluster_std=1.5, random_state=4)

3. Visualize the Dataset

plt.style.use('seaborn-v0_8')   # 'seaborn' in older matplotlib releases
plt.figure(figsize=(10, 10))
plt.scatter(X[:, 0], X[:, 1], c=y, marker='*', s=100, edgecolors='black')
plt.show()

4. Splitting Data into Training and Testing Datasets

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

5. KNN Classifier Implementation

knn5 = KNeighborsClassifier(n_neighbors=5)
knn1 = KNeighborsClassifier(n_neighbors=1)

6. Predictions for the KNN Classifiers

knn5.fit(X_train, y_train)
knn1.fit(X_train, y_train)

y_pred_5 = knn5.predict(X_test)
y_pred_1 = knn1.predict(X_test)

7. Prediction Accuracy for Both k Values

from sklearn.metrics import accuracy_score
print("Accuracy with k=5", accuracy_score(y_test, y_pred_5) * 100)
print("Accuracy with k=1", accuracy_score(y_test, y_pred_1) * 100)

Output:
Accuracy with k=5 93.60000000000001
Accuracy with k=1 90.4
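
Step 4 of the implementation plan also calls for a confusion matrix; a short sketch reusing y_test and y_pred_5 from above, assuming scikit-learn's metrics module, could be:

from sklearn.metrics import confusion_matrix

# Rows are true classes, columns are predicted classes; off-diagonal
# entries count the misclassified test points for the k=5 model
print(confusion_matrix(y_test, y_pred_5))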
8. Visualize Predictions

plt.figure(figsize=(15, 5))

plt.subplot(1, 2, 1)
plt.scatter(X_test[:, 0], X_test[:, 1], c=y_pred_5, marker='*', s=100, edgecolors='black')
plt.title("Predicted values with k=5", fontsize=20)

plt.subplot(1, 2, 2)
plt.scatter(X_test[:, 0], X_test[:, 1], c=y_pred_1, marker='*', s=100, edgecolors='black')
plt.title("Predicted values with k=1", fontsize=20)
plt.show()
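
Step 5 of the implementation plan also mentions decision boundaries; the plots above colour only the test points, so the following is a sketch, reusing the imports and variables from the walkthrough above, of one common way to draw the k=5 decision regions with a mesh grid (the grid step of 0.05 and the transparency are arbitrary choices).

# Build a grid covering the feature space of the blob data
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.05),
                     np.arange(y_min, y_max, 0.05))

# Classify every grid point and shade each region by its predicted class
Z = knn5.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.figure(figsize=(8, 8))
plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X_test[:, 0], X_test[:, 1], c=y_test, marker='*', s=100, edgecolors='black')
plt.title("Approximate decision regions with k=5")
plt.show()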
CONCLUSION

The k-Nearest Neighbors algorithm is a foundational tool in machine learning that provides a strong starting point for understanding classification and regression tasks. Despite its simplicity, kNN performs well in many scenarios and can be adapted for complex applications. Through this project, we explored its working, implementation, and potential improvements, laying a strong groundwork for further exploration in the field of AI and data science.

REFERENCE

https://www.appliedaicourse.com

https://www.ibmbigdatahub.com/

https://www.coursera.org

https://www.w3schools.com

https://www.tpointtech.com

