Intrinsic Metrics: Nearest Neighbor and Edge Squared Distances

09/22/2017
by Timothy Chu, et al.

Some researchers have proposed using non-Euclidean metrics for clustering data points. Ideally, such a metric should recognize that two points in the same cluster are close, even when their Euclidean distance is large. Several candidates have been proposed, including the edge-squared metric (a specific example of a graph geodesic) and the nearest neighbor metric. In this paper, we prove that the edge-squared and nearest neighbor metrics are in fact equivalent. The previous best result showed that the edge-squared metric is a 3-approximation of the nearest neighbor metric. This paper gives one of the first proofs equating a continuous metric with a discrete metric using non-trivial discrete methods. Our proof uses the Kirszbraun theorem (also known as the Lipschitz extension theorem and Brehm's extension theorem), a notable theorem in functional analysis and computational geometry. Combined with the results of Hwang, Damelin, and Hero, our results show that the nearest neighbor distance on i.i.d. samples of a density is a reasonable constant-factor approximation of a natural density-based distance function.
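As a concrete illustration of the discrete side of this equivalence, the sketch below computes the edge-squared metric on a small point set: shortest-path distances in the complete graph whose edges are weighted by squared Euclidean distances. This is a minimal sketch under that standard definition; the function name and sample data are illustrative, not taken from the paper.

```python
import numpy as np

def edge_squared_metric(points):
    """All-pairs edge-squared distances on a finite point set.

    The edge-squared metric is the graph geodesic on the complete graph
    over `points`, where each edge is weighted by the *squared* Euclidean
    distance between its endpoints. Squaring penalizes long hops, so
    shortest paths prefer many short steps through dense regions, which
    is why points in the same cluster end up close under this metric.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # Complete-graph edge weights: pairwise squared Euclidean distances.
    diff = pts[:, None, :] - pts[None, :, :]
    d = np.einsum("ijk,ijk->ij", diff, diff)
    # Floyd-Warshall all-pairs shortest paths; O(n^3), fine for small n.
    for k in range(n):
        d = np.minimum(d, d[:, k : k + 1] + d[k : k + 1, :])
    return d

if __name__ == "__main__":
    # Two clusters on a line: {0, 1} and {10, 11}.
    pts = [[0.0], [1.0], [10.0], [11.0]]
    d = edge_squared_metric(pts)
    print(d[0, 3])  # 83.0: path 0 -> 1 -> 10 -> 11 (1 + 81 + 1),
                    # much less than the direct hop 11^2 = 121.
```

Note how the cross-cluster distance is shortened by routing through intermediate sample points, while the squared weights keep within-cluster distances small; this is the clustering behavior the metric is designed to capture.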
