Fast, smooth and adaptive regression in metric spaces

Samory K. Kpotufe

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

6 Scopus citations

Abstract

It was recently shown that certain nonparametric regressors can escape the curse of dimensionality when the intrinsic dimension of data is low ([1, 2]). We prove stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, adapts to intrinsic dimension, operates on general metrics, yields a smooth function, and evaluates in time O(log n). We derive a tight convergence rate of the form n^{-2/(2+d)}, where d is the Assouad dimension of the input space.
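To make the "tree-based plus kernel" idea concrete, here is a minimal toy sketch, not the paper's actual algorithm: a balanced binary tree partitions 1-D data into cells (the tree-based step), and a query is answered by a kernel-weighted average of the cells' mean responses centered at each cell's mean location (the kernel step, which yields a smooth estimate). All function names, the bandwidth `h`, and the tree depth are illustrative assumptions.

```python
import math
import random

def build_cells(xs, ys, depth):
    """Recursively split the sorted sample at the median into 2**depth cells;
    return (center, mean_response) per cell. Illustrative stand-in for the
    tree-partition step."""
    pairs = sorted(zip(xs, ys))
    def rec(chunk, d):
        if d == 0 or len(chunk) <= 1:
            cx = sum(x for x, _ in chunk) / len(chunk)
            cy = sum(y for _, y in chunk) / len(chunk)
            return [(cx, cy)]
        m = len(chunk) // 2
        return rec(chunk[:m], d - 1) + rec(chunk[m:], d - 1)
    return rec(pairs, depth)

def predict(cells, x, h):
    """Gaussian-kernel average of cell means; h is an assumed bandwidth.
    Smoothing over cell summaries (not raw points) keeps evaluation cheap
    while producing a smooth function of x."""
    num = den = 0.0
    for cx, cy in cells:
        w = math.exp(-((x - cx) / h) ** 2)
        num += w * cy
        den += w
    return num / den

# Toy usage: noisy samples of sin(2*pi*x) on [0, 1].
random.seed(0)
xs = [random.uniform(0, 1) for _ in range(512)]
ys = [math.sin(2 * math.pi * x) + random.gauss(0, 0.1) for x in xs]
cells = build_cells(xs, ys, depth=5)  # 32 cells
print(round(predict(cells, 0.25, h=0.05), 2))  # should be close to sin(pi/2) = 1
```

In the actual paper the partition is built over a general metric space and the rate adapts to the Assouad dimension d; this sketch only conveys the structural idea of smoothing tree-cell summaries with a kernel.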

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
Pages: 1024-1032
Number of pages: 9
State: Published - Dec 1 2009
Event: 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 - Vancouver, BC, Canada
Duration: Dec 7 2009 - Dec 10 2009

Publication series

Name: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference

Other

Other: 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009
Country: Canada
City: Vancouver, BC
Period: 12/7/09 - 12/10/09

All Science Journal Classification (ASJC) codes

  • Information Systems

Cite this

Kpotufe, S. K. (2009). Fast, smooth and adaptive regression in metric spaces. In Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference (pp. 1024-1032). (Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference).