Why is Manhattan distance not used in the k-nearest neighbors (kNN) machine learning algorithm to calculate the distance between nearest neighbors?
Answer / Krishn Mohan Pandey
The premise of the question is mistaken: Manhattan distance can be, and often is, used in the k-nearest neighbors (kNN) algorithm. Euclidean distance is simply the common default; both are special cases of the Minkowski metric (Manhattan is p = 1, Euclidean is p = 2), and most implementations, including scikit-learn's KNeighborsClassifier, let you select either through the metric parameter. Manhattan distance sums the absolute differences along each feature axis, which makes it less sensitive to a single large per-feature difference and often a reasonable choice for high-dimensional or sparse data, while Euclidean distance measures straight-line separation and penalizes large per-feature differences more heavily. The idea that Euclidean distance captures "direction" while Manhattan distance does not is incorrect: both are symmetric metrics over the feature space. In practice, feature scaling and dimensionality usually matter more to kNN accuracy than the choice between these two metrics.
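To make the point concrete, here is a minimal sketch of kNN that works with either metric. The function names (`minkowski_distance`, `knn_predict`) and the toy two-cluster data are illustrative inventions for this answer, not part of any library's API:

```python
import numpy as np

# Minkowski distance between two feature vectors:
# p=1 gives Manhattan distance, p=2 gives Euclidean distance.
def minkowski_distance(a, b, p):
    return float(np.sum(np.abs(a - b) ** p) ** (1.0 / p))

# A tiny kNN classifier: majority vote among the k nearest
# training points under the chosen L_p metric.
def knn_predict(X_train, y_train, x, k=3, p=1):
    dists = [minkowski_distance(xi, x, p) for xi in X_train]
    nearest = np.argsort(dists)[:k]          # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return int(labels[np.argmax(counts)])    # most common label wins

# Two well-separated toy clusters, labeled 0 and 1.
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

a, b = np.array([0.0, 0.0]), np.array([3.0, 4.0])
print(minkowski_distance(a, b, 2))                 # 5.0 (Euclidean)
print(minkowski_distance(a, b, 1))                 # 7.0 (Manhattan)
print(knn_predict(X, y, np.array([1.0, 1.0]), p=1))  # 0 — Manhattan works fine
print(knn_predict(X, y, np.array([5.0, 4.0]), p=2))  # 1 — same vote with Euclidean
```

On this data both metrics classify the query points identically, which is the point: the metric is an interchangeable parameter of kNN, not something that rules Manhattan distance out.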