Curse of Dimensionality: KNN Breaks Down, Part Two (With Code + Examples)
This article is a continuation of my earlier piece, Coding K-Nearest Neighbors Machine Learning Algorithm in Python, part of my Python series. K-Nearest Neighbors is an instance-based, nonparametric algorithm, and it is especially vulnerable to the curse of dimensionality.

Richard Bellman coined the term curse of dimensionality in 1957. In computer science it arises in dynamic programming, combinatorial problems, control theory, integer programming, and time-dependent problems: the number of states grows exponentially with even a small increase in the number of dimensions or parameters, a combinatorial explosion. Many multivariate problems suffer from it in the same way, in that the complexity of a problem in z variables is an exponential function of z.

The curse hits K-Nearest Neighbors at its core weakness: the model is nothing but distances. In high dimensions, distances concentrate, meaning the nearest neighbors of a query point end up barely closer than the farthest ones, so the "nearest" label carries little information. A 3-Nearest Neighbors model can then easily see 2 of its 3 votes go to the wrong class. The sketch below demonstrates both effects.
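The following is a minimal sketch using NumPy and scikit-learn; the sample sizes, dimensions, seed, and the synthetic two-feature classification task are arbitrary choices of mine, not anything canonical. The first part measures how the gap between the nearest and farthest neighbor of a query point shrinks as the dimension grows; the second pads an easy two-feature problem with irrelevant noise dimensions and watches a 3-NN classifier's test accuracy degrade.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Part 1: distance concentration.
# As the dimension d grows, the nearest and farthest points from a query
# become almost equally far away, so "nearest" loses its meaning.
n_points = 1000
for d in (2, 10, 100, 1000):
    points = rng.random((n_points, d))  # uniform samples in the unit cube [0, 1]^d
    query = rng.random(d)
    dists = np.linalg.norm(points - query, axis=1)
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"d = {d:4}: relative contrast (far - near) / near = {contrast:.3f}")

# Part 2: a 3-NN classifier on two informative features, padded with
# irrelevant noise dimensions. The extra dimensions carry no signal,
# yet they dominate the distance computation and drag accuracy down.
n = 500
informative = rng.random((n, 2))
labels = (informative.sum(axis=1) > 1.0).astype(int)
for extra in (0, 10, 100, 1000):
    X = np.hstack([informative, rng.random((n, extra))])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    acc = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{extra:4} noise dims: 3-NN test accuracy = {acc:.2f}")
```

Exact numbers depend on the seed, but the relative contrast should fall by orders of magnitude as d grows, and the classifier's accuracy should slide downward as meaningless dimensions are added, even though the two informative features never change.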
The curse also has a precise meaning outside machine learning. For multivariate problems it is defined on the unit balls of normed linear spaces, and it can persist even for classes of analytic functions. Katscher, Novak, and Petras studied the integration problem for convex functions, and Papageorgiou studied the integration problem for monotone functions. For the class Z_d^mon of monotone functions of d variables, it was proved that the minimal worst-case error achievable with m function evaluations satisfies

e^int(m, Z_d^mon) = Θ(m^(-1/d)).
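To feel what a Θ(m^(-1/d)) rate costs, invert it: ignoring constants, pushing the error below a tolerance ε requires on the order of m ≈ ε^(-d) function evaluations. A quick back-of-the-envelope loop (my own illustration, not from the cited papers):

```python
# Invert the error rate m^(-1/d) ≈ eps to get the required number of
# evaluations: m ≈ eps^(-d). The cost explodes exponentially in d.
eps = 0.1
for d in (1, 2, 5, 10, 20):
    m = eps ** (-d)
    print(f"d = {d:2}: roughly {m:.0e} evaluations for error {eps}")
```

At ε = 0.1, one variable needs about 10 evaluations, while twenty variables need on the order of 10^20. That exponential blow-up in d is the curse of dimensionality in its purest form, and it is the same phenomenon that starves K-Nearest Neighbors of meaningfully close neighbors in high-dimensional feature spaces.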