StackedML
531. Nearest Neighbor Degradation in High Dimensions (easy)
Why does the performance of nearest-neighbor methods degrade in high dimensions?
A. All points become nearly equidistant, making the nearest neighbor no more similar than a random point
B. Nearest-neighbor search becomes exponentially slower in high dimensions, making it computationally infeasible
C. Nearest neighbors in high dimensions are always farther away than the decision boundary, causing misclassification
D. High-dimensional data always violates the i.i.d. assumption required for nearest-neighbor guarantees