K-NN ( K Nearest Neighbor )
Hamza Fatnaoui
Software Engineer | AI/ML Practitioner | LLMs Enthusiast | Competitive Programmer | Math & Tech Lover
Hey folks, I hope you're doing well. KNN stands out as one of the fundamental algorithms in machine learning. It classifies a data point based on how its neighbors are classified.
It's a simple machine learning algorithm used for both classification and regression. KNN is a supervised learning model with a few key characteristics, outlined below:
The 'K' in KNN represents the number of nearest neighbors used to classify new data points. Selecting the right 'K' is crucial. Often, it's recommended to set 'K' as the square root of 'n,' where 'n' is the total size of the data. It's advisable to use an odd 'K' to avoid ties in classification.
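As a quick illustration of this rule of thumb, here is a minimal Python sketch (not from the original article). It assumes scikit-learn is installed and uses the bundled Iris dataset purely as placeholder data: it picks 'K' near the square root of the training-set size, nudges it to an odd value, and fits a `KNeighborsClassifier`.

```python
import math
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder dataset; substitute your own features X and labels y.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Rule of thumb: K is roughly sqrt(n), made odd to reduce the chance of ties.
n = len(X_train)
k = int(math.sqrt(n))
if k % 2 == 0:
    k += 1

model = KNeighborsClassifier(n_neighbors=k)
model.fit(X_train, y_train)
print(f"k = {k}, test accuracy = {model.score(X_test, y_test):.2f}")
```

Treat the square-root rule only as a starting point; in practice, cross-validating over a range of K values usually gives a better choice.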
Euclidean distance is the most common way to determine the nearest neighbors. For two points (x, y) and (a, b), the Euclidean distance d is calculated as follows:
d = √((x − a)² + (y − b)²)
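As a small illustration (not part of the original article), here is a plain-Python sketch of that formula, generalized to points with any number of coordinates:

```python
import math

def euclidean_distance(p, q):
    """Straight-line distance between two points of equal dimension."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

# Example: the distance between (1, 2) and (4, 6) is sqrt(3**2 + 4**2) = 5.0
print(euclidean_distance((1, 2), (4, 6)))  # -> 5.0
```

KNN simply computes this distance from the new point to every training point, keeps the K closest ones, and assigns the majority class among them.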
To practice and see this model in use, you can check out my mini-project on GitHub with a simple click. But first, take a look at the basics.