An improvement to k-nearest neighbor classifier

Type: Article

Publication Date: 2011-09-01

Citations: 46

DOI: https://doi.org/10.1109/raics.2011.6069307

Abstract

Non-parametric methods like the nearest neighbor classifier (NNC) and its variants, such as the k-nearest neighbor classifier (k-NNC), are simple to use and often show good performance in practice. These methods store all training patterns and search them to find the k nearest neighbors of a given test pattern. Some fundamental improvements to k-NNC are (i) the weighted k-nearest neighbor classifier (wk-NNC), where each neighbor is assigned a weight that is used in classification, and (ii) using a bootstrapped training set instead of the given training set. Hamamoto et al. [1] gave a bootstrapping method in which a training pattern is replaced by a weighted mean of a few of its neighbors from its own class of training patterns; it is shown to improve classification accuracy in most cases. The time to create the bootstrapped set is O(n²), where n is the number of training patterns. This paper presents a novel improvement to k-NNC called the k-Nearest Neighbor Mean Classifier (k-NNMC). k-NNMC finds the k nearest neighbors of the test pattern separately within each class of training patterns and computes the mean of each of these class-wise neighbor sets. Classification is done according to the nearest mean pattern. It is shown experimentally on several standard data-sets that the proposed classifier achieves better classification accuracy than k-NNC, wk-NNC, and k-NNC using Hamamoto's bootstrapped training set. Further, unlike Hamamoto's method, the proposed method has no separate design phase; it is suitable for parallel implementation and can easily be coupled with any indexing and space-reduction methods, making it well suited to data mining applications.
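The classification rule described in the abstract can be sketched as follows. This is a minimal illustration of the idea as stated (per-class k nearest neighbors, class-wise means, nearest-mean decision), not the authors' implementation; the function name `knnmc_predict` and the Euclidean distance choice are assumptions for the sketch.

```python
import math
from collections import defaultdict

def knnmc_predict(X_train, y_train, x, k=3):
    """Sketch of the k-Nearest Neighbor Mean Classifier (k-NNMC).

    For each class, find the k nearest training patterns to x within
    that class, average them, and assign the class whose mean is
    closest to x. Euclidean distance is assumed here.
    """
    # Group training patterns by their class label.
    by_class = defaultdict(list)
    for xi, yi in zip(X_train, y_train):
        by_class[yi].append(xi)

    best_class, best_dist = None, float("inf")
    for c, patterns in by_class.items():
        # k nearest neighbors of x within class c only.
        nearest = sorted(patterns, key=lambda p: math.dist(p, x))[:k]
        # Class-wise mean of those k neighbors.
        mean = [sum(coords) / len(nearest) for coords in zip(*nearest)]
        # Keep the class whose neighbor-mean is nearest to x.
        d = math.dist(mean, x)
        if d < best_dist:
            best_class, best_dist = c, d
    return best_class
```

Because each class is processed independently, the per-class searches can run in parallel, which matches the abstract's remark that the method suits parallel implementations.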

Locations

  • IEEE Recent Advances in Intelligent Computational Systems
  • arXiv (Cornell University)

Similar Works

  • An improvement to k-nearest neighbor classifier (2013). T. Hitendra Sarma, P. Viswanath, D. Sai Koti Reddy, S. Sri Raghava
  • Extensions of k-Nearest Neighbor Algorithm (2016). Rashmi Agrawal
  • An improved nearest neighbour classifier (2025). Eric Setterqvist, Natan Kruglyak, Robert Forchheimer
  • Predicting the number of nearest neighbors for the k-NN classification algorithm (2014). Xueying Zhang, Qinbao Song
  • Survey of Nearest Neighbor Techniques (2010). Nitin Bhatia, Vandana Bharti
  • A generalized mean distance-based k-nearest neighbor classifier (2018). Jianping Gou, Hongxing Ma, Weihua Ou, Shaoning Zeng, Yunbo Rao, Hebiao Yang
  • k-Nearest Neighbor Classification (2009). Ling Liu, M. Tamer Özsu
  • k-Nearest Neighbor Classification (2011)
  • k-Nearest Neighbor Classification (2022). Xiuping Jia
  • Classifier design using nearest neighbor samples (2002). Yoshihiro Mitani, Yoshihiko Hamamoto
  • A New Nearest Centroid Neighbor Classifier Based on K Local Means Using Harmonic Mean Distance (2018). Sumet Mehta, Xiang-Jun Shen, Jianping Gou, Dejiao Niu
  • k-Nearest Neighbour Classifiers -- 2nd Edition (2020). Pádraig Cunningham, Sarah Jane Delany
  • An Extensive Experimental Study on the Cluster-based Reference Set Reduction for Speeding-up the k-NN Classifier (2013). Stefanos Ougiaroglou, Georgios Evangelidis, Dimitris A. Dervos
  • K-Nearest Neighbor (2023). Hang Li
  • Nearest-neighbor methods (2012). Clifton D. Sutton
  • Validation Based Modified K-Nearest Neighbor (2009). Hamid Parvin, Hosein Alizadeh, Behrouz Minaei-Bidgoli, Sio-Iong Ao
  • On k-nearest neighbor rule in nonparametric classification (2004). 임창원
  • k Nearest Neighbors classifier with sampling dependent decision rules (2017). Zsolt László