
KNN visualization in just 13 lines of code

source link: https://www.tuicool.com/articles/7jmM32r

Yes! It’s that simple. Let’s play around with datasets to visualize how the decision boundary changes as ‘k’ changes.

[Figure: decision boundaries for different values of k]
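For concreteness, here is a minimal sketch of such a plot, assuming scikit-learn's iris dataset (restricted to two features so the boundary can be drawn in 2-D), its KNeighborsClassifier, and matplotlib; rerun it with a different n_neighbors to watch the boundary change.

```python
# A minimal sketch, assuming scikit-learn's iris dataset and matplotlib;
# tweak n_neighbors to see the boundary change.
import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

X, y = datasets.load_iris(return_X_y=True)
X = X[:, :2]                                   # first two features, so we can plot in 2-D
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# Evaluate the classifier on a dense grid to colour the decision regions.
xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
                     np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))
Z = knn.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.3)                 # coloured regions = predicted class
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor='k')  # training points
plt.title('k-NN decision boundary (k = 5)')
plt.show()
```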

Let’s have a quick review…

What is K-NN? How does it work?

K Nearest Neighbor (KNN) is a simple, easy-to-understand, versatile algorithm, and one of the most widely used in machine learning. In k-NN classification, the output is a class membership: an object is classified by a plurality vote of its neighbours, being assigned to the class most common among its k nearest neighbours (k is a positive integer, typically small). If k = 1, the object is simply assigned to the class of its single nearest neighbour.

In KNN, K is the number of nearest neighbours, and it is the core deciding factor. K is generally chosen to be odd when there are two classes, so that a vote can never end in a tie. When K = 1, the algorithm is known as the nearest-neighbour algorithm; this is the simplest case.

Suppose P1 is the point whose label needs to be predicted.

[Figure: point P1 and its nearest neighbours]

Basic steps in KNN.

KNN has three basic steps.

1. Calculate the distance.

2. Find the k nearest neighbours.

3. Vote for classes, as in the sketch below.
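Here is a minimal from-scratch sketch of those three steps, assuming Euclidean distance; the dataset and point names are illustrative.

```python
# A minimal from-scratch sketch of the three steps, assuming Euclidean
# distance; the dataset and point names here are illustrative.
from collections import Counter
import math

def knn_predict(train_X, train_y, p, k):
    # 1. Calculate the distance from p to every training point.
    distances = [math.dist(p, x) for x in train_X]
    # 2. Find the k nearest neighbours (indices of the k smallest distances).
    nearest = sorted(range(len(train_X)), key=lambda i: distances[i])[:k]
    # 3. Vote: the most common class among those neighbours wins.
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

# Example: predict the label of P1 = (1.5, 1.5) from four labelled points.
train_X = [(1, 1), (2, 2), (8, 8), (9, 9)]
train_y = ['red', 'red', 'blue', 'blue']
print(knn_predict(train_X, train_y, (1.5, 1.5), k=3))  # -> 'red'
```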

Importance of K

You can’t pick just any value for k: the whole algorithm hinges on it, and even a small change in k can change the predictions substantially. The K in KNN is a hyperparameter, like the tunable settings of most machine learning algorithms; you can think of it as the controlling variable of the prediction model.
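Because of that, a common way to choose k is to score several candidate values with cross-validation instead of guessing. A minimal sketch, assuming scikit-learn's cross_val_score and the same iris dataset as above:

```python
# A minimal sketch of choosing k by cross-validation, assuming scikit-learn.
from sklearn import datasets
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = datasets.load_iris(return_X_y=True)
for k in (1, 3, 5, 7, 15):
    score = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    print(f'k={k:2d}  mean CV accuracy = {score:.3f}')
```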

