UFC
CNRS



Universal consistency of k-NN rule in $\sigma$-finite dimensional metric spaces

Published on

Sushma Kumari
(Kyoto University)

The k-nearest neighbour (k-NN) rule is one of the fundamental learning rules in machine learning. We start with the definitions of the k-NN rule and of universal consistency. Charles Stone proved the universal consistency of the k-NN rule in $\mathbb{R}^d$. The essential ingredient of Stone's theorem is the so-called geometric Stone's lemma, but its proof relies on the structure of $\mathbb{R}^d$ and therefore extends only to finite dimensional normed spaces. In 2006, Cérou and Guyader proved that the k-NN rule is universally consistent in the more general metric spaces that satisfy the Lebesgue-Besicovitch differentiation theorem. Furthermore, David Preiss proved that the metric spaces satisfying the Lebesgue-Besicovitch differentiation theorem are exactly those whose metric has $\sigma$-finite dimension (the so-called $\sigma$-finite dimensional metric spaces).
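As a concrete illustration (not part of the talk itself), the k-NN classification rule in an arbitrary metric space can be sketched as follows; the function name and the toy data are hypothetical, and only the Euclidean metric on $\mathbb{R}^2$ is used in the example:

```python
import math
from collections import Counter

def knn_classify(train, query, k, metric=None):
    """Classify `query` by majority vote among its k nearest training
    points under `metric`. `train` is a list of (point, label) pairs.
    (Hypothetical helper for illustration only.)"""
    if metric is None:
        # default: Euclidean distance in R^d
        metric = math.dist
    # sort training points by distance to the query; distance ties are
    # broken arbitrarily, as they occur with probability zero in the
    # settings considered by Stone
    neighbours = sorted(train, key=lambda pl: metric(pl[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# toy sample in R^2 with two classes
train = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
print(knn_classify(train, (0.2, 0.1), k=3))  # prints 0
```

Since the rule only compares distances, any metric can be passed in; universal consistency concerns whether such a rule's error converges to the Bayes error for every distribution on the underlying metric space.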
In this work, we attempt to extend the geometric Stone's lemma to $\sigma$-finite dimensional metric spaces and to reprove the universal consistency of the k-NN rule using the resulting generalized Stone's theorem. Further questions related to this work are also discussed.