K-Nearest Neighbors

In practice, building a single prediction model is not enough to ensure we have the most accurate model. In this article, I will introduce another classifier, K-NN, using the Letter Recognition Data Set from the UCI Machine Learning Repository.



Attribute Information

The Letter Recognition Data Set contains 20,000 samples of the 26 capital letters, each described by 16 integer attributes computed from pixel statistics and edge counts of randomly distorted letter images. Our goal is to recognize these letters using two machine learning algorithms, a decision tree and K-NN, and then compare their results.
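A minimal sketch of loading the data and making a train/test split, assuming Python with pandas and scikit-learn (the original analysis may have used a different toolkit; the file name and split ratio below are my assumptions):

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Class label followed by the 16 integer attributes,
# following the UCI description of the Letter Recognition Data Set.
cols = ["letter", "x-box", "y-box", "width", "high", "onpix",
        "x-bar", "y-bar", "x2bar", "y2bar", "xybar",
        "x2ybr", "xy2br", "x-ege", "xegvy", "y-ege", "yegvx"]

# Assumed local copy of letter-recognition.data downloaded from the UCI repository.
data = pd.read_csv("letter-recognition.data", header=None, names=cols)

X = data.drop(columns="letter")
y = data["letter"]

# Hold out a test set; the 70/30 ratio is an assumption, not the author's setting.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)
```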

 


 Decision Tree

We use the same approach as in the previous article to build a decision tree:

Explanation:

For the parameter configuration, I iterate the depth from 5 to 30 in steps of 5, with a nested iteration of the parent size from 20 to 200 in steps of 20, where the child size is the ceiling of the parent size divided by 3, and record the results. The table below shows the best record for each depth. From this table we can spot the best configuration: depth 20, parent size 20, and child size 6, since it has the lowest test-set error rate and little difference between the test-set error and the cross-validation error.
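A rough sketch of this parameter search, assuming scikit-learn and the train/test split from the earlier sketch, and assuming "depth", "parent size", and "child size" map to max_depth, min_samples_split, and min_samples_leaf (my mapping, not necessarily the author's original tool):

```python
import math
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

results = []
for depth in range(5, 31, 5):           # depth 5, 10, ..., 30
    for parent in range(20, 201, 20):   # parent size 20, 40, ..., 200
        child = math.ceil(parent / 3)   # child size = ceiling(parent / 3)
        tree = DecisionTreeClassifier(max_depth=depth,
                                      min_samples_split=parent,
                                      min_samples_leaf=child,
                                      random_state=42)
        cv_err = 1 - cross_val_score(tree, X_train, y_train, cv=10).mean()
        tree.fit(X_train, y_train)
        test_err = 1 - tree.score(X_test, y_test)
        results.append((depth, parent, child, cv_err, test_err))

# Keep the best (lowest test-set error) record for each depth, as in the table below.
best_per_depth = {}
for depth, parent, child, cv_err, test_err in results:
    if depth not in best_per_depth or test_err < best_per_depth[depth][-1]:
        best_per_depth[depth] = (parent, child, cv_err, test_err)
```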

 


K-Nearest Neighbors

To find the best parameter for the K-NN algorithm, we have to see how it performs across a range of values, for example k = 1, 3, 5, 7.
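A minimal sketch of that sweep, again assuming scikit-learn and the train/test split from the earlier sketch:

```python
from sklearn.neighbors import KNeighborsClassifier

knn_models = {}
for k in (1, 3, 5, 7):
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    acc = knn.score(X_test, y_test)   # overall test-set accuracy for this k
    knn_models[k] = (knn, acc)
    print(f"k = {k}: test accuracy = {acc:.4f}")
```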

Explanation:

Although the accuracies are all fairly close, when we look at the lowest sensitivity (letter 'H'), we notice that as k increases, the sensitivity for 'H' decreases. Therefore, k = 1 gives the best K-nearest neighbors classifier, since all of its per-letter sensitivities are high enough (the lowest is 'H' at 0.8796).
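One way to read the per-letter sensitivity (recall) off the confusion matrix, under the same assumptions and using the k = 1 model from the sketch above:

```python
from sklearn.metrics import confusion_matrix

knn1, _ = knn_models[1]              # the k = 1 classifier chosen above
y_pred = knn1.predict(X_test)

labels = sorted(y_test.unique())
cm = confusion_matrix(y_test, y_pred, labels=labels)

# Sensitivity (recall) per letter: true positives divided by actual positives.
sensitivity = cm.diagonal() / cm.sum(axis=1)
for letter, s in sorted(zip(labels, sensitivity), key=lambda t: t[1])[:5]:
    print(f"{letter}: sensitivity = {s:.4f}")   # the lowest few, e.g. 'H'
```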

 


 Comparison

Explanation:

Generally, accuracy is a good metric for evaluating a model. However, in some situations sensitivity is a better metric, because sensitivity only measures whether we predict a particular class correctly. To illustrate, when recognizing a letter, 'H' for example, we care about the sensitivity, i.e., whether the model predicts actual 'H's correctly, not the specificity, i.e., whether the model predicts non-'H' letters as non-'H'. Therefore, in this case, sensitivity is a better way to interpret the performance of the model.
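As a small illustrative sketch (same assumptions and variables as the sketches above), sensitivity and specificity for 'H' can be computed by collapsing the predictions into 'H' versus 'not H':

```python
import numpy as np

# Binary view: positive class = 'H', negative class = every other letter.
actual_H = (y_test == "H").to_numpy()
pred_H = (y_pred == "H")

tp = np.sum(actual_H & pred_H)    # real 'H' predicted as 'H'
fn = np.sum(actual_H & ~pred_H)   # real 'H' predicted as something else
tn = np.sum(~actual_H & ~pred_H)  # non-'H' predicted as non-'H'
fp = np.sum(~actual_H & pred_H)   # non-'H' predicted as 'H'

sensitivity_H = tp / (tp + fn)    # fraction of true 'H' samples the model catches
specificity_H = tn / (tn + fp)    # fraction of non-'H' samples correctly rejected
```

Here sensitivity_H is what matters when we ask whether the model recognizes 'H', while specificity_H only tells us how well non-'H' letters are rejected.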

When we compare the best K-NN classifier and the best decision tree, it is not hard to see that the K-NN classifier has better accuracy, 95.35% (about 15% higher than the decision tree's accuracy), and that only one of its per-letter sensitivities ('H') is below 90%, whereas none of the decision tree's sensitivities is above 90%. Therefore, using the K-NN classifier instead of the decision tree classifier is the better choice in this case.