
pranayom/KNN-implementation-from-scratch


KNN is a non-parametric machine learning algorithm. Fix and Hodges proposed the k-nearest neighbor classifier in 1951 for pattern classification tasks.

KNN is often confused with k-means clustering. A simple way to differentiate between the two:

KNN - Classification - Supervised - Labeled data

K-means - Clustering - Unsupervised - Unlabeled data

Suppose you are a teacher and a new student has been admitted to your class. You want to find a suitable friend circle for the new student. Since you want to place the student with similar students, you look at certain characteristics such as aptitude, interests, and affinity towards sports. Based on these three characteristics, you place the student with their 'nearest neighbors'. The 'nearness' is measured based on the three characteristics.
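The analogy above maps directly onto the algorithm: measure distances in feature space, pick the k closest labeled points, and take a majority vote. A minimal sketch in plain Python (the student scores and group labels here are made up for illustration, not the repository's actual dataset):

```python
import math
from collections import Counter

def euclidean(a, b):
    # straight-line distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, query, k=3):
    # rank all training points by distance to the query point
    ranked = sorted(zip(train_X, train_y), key=lambda pair: euclidean(pair[0], query))
    # majority vote among the labels of the k nearest neighbors
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# toy data: (aptitude, interests, sports affinity) scores per student
students = [(9, 8, 2), (8, 9, 3), (2, 3, 9), (3, 2, 8), (9, 9, 9)]
groups   = ["academic", "academic", "athletic", "athletic", "all-rounder"]

# a new student strong in aptitude/interests lands with the 'academic' group
print(knn_predict(students, groups, (8, 8, 2), k=3))  # -> academic
```

Note that k is usually chosen odd to avoid ties, and features on different scales should be normalized before computing distances.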

More detailed explanations can be found at these links:

http://dataaspirant.com/2016/12/23/k-nearest-neighbor-classifier-intro/

https://machinelearningmastery.com/tutorial-to-implement-k-nearest-neighbors-in-python-from-scratch/

About

KNN is one of the best-known classification algorithms. Here I have implemented it from scratch on a real-life dataset and compared the accuracy against scikit-learn's implementation.
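Such a comparison typically trains scikit-learn's `KNeighborsClassifier` on the same train/test split and checks its accuracy. A minimal sketch, assuming scikit-learn is installed (the synthetic two-cluster data below is illustrative only, not the repository's real-life dataset):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# illustrative two-class data: one cluster around 0, one around 3
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# scikit-learn's reference KNN implementation with k=5 neighbors
clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
print("scikit-learn KNN accuracy:", clf.score(X_test, y_test))
```

The from-scratch predictions can then be scored on the same `X_test`/`y_test` to see how closely the two implementations agree.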
